Nov 23 01:40:58 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Nov 23 01:40:58 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 23 01:40:58 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 23 01:40:58 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 23 01:40:58 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 23 01:40:58 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 23 01:40:58 localhost kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Nov 23 01:40:58 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Nov 23 01:40:58 localhost kernel: signal: max sigframe size: 1776
Nov 23 01:40:58 localhost kernel: BIOS-provided physical RAM map:
Nov 23 01:40:58 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 23 01:40:58 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 23 01:40:58 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 23 01:40:58 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 23 01:40:58 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 23 01:40:58 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 23 01:40:58 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 23 01:40:58 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Nov 23 01:40:58 localhost kernel: NX (Execute Disable) protection: active
Nov 23 01:40:58 localhost kernel: SMBIOS 2.8 present.
Nov 23 01:40:58 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 23 01:40:58 localhost kernel: Hypervisor detected: KVM
Nov 23 01:40:58 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 23 01:40:58 localhost kernel: kvm-clock: using sched offset of 2569527257 cycles
Nov 23 01:40:58 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 23 01:40:58 localhost kernel: tsc: Detected 2799.998 MHz processor
Nov 23 01:40:58 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Nov 23 01:40:58 localhost kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Nov 23 01:40:58 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 23 01:40:58 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 23 01:40:58 localhost kernel: Using GB pages for direct mapping
Nov 23 01:40:58 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Nov 23 01:40:58 localhost kernel: ACPI: Early table checksum verification disabled
Nov 23 01:40:58 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 23 01:40:58 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 23 01:40:58 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 23 01:40:58 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 23 01:40:58 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 23 01:40:58 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 23 01:40:58 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 23 01:40:58 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 23 01:40:58 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 23 01:40:58 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 23 01:40:58 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 23 01:40:58 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 23 01:40:58 localhost kernel: No NUMA configuration found
Nov 23 01:40:58 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Nov 23 01:40:58 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Nov 23 01:40:58 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Nov 23 01:40:58 localhost kernel: Zone ranges:
Nov 23 01:40:58 localhost kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Nov 23 01:40:58 localhost kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Nov 23 01:40:58 localhost kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Nov 23 01:40:58 localhost kernel: Device empty
Nov 23 01:40:58 localhost kernel: Movable zone start for each node
Nov 23 01:40:58 localhost kernel: Early memory node ranges
Nov 23 01:40:58 localhost kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Nov 23 01:40:58 localhost kernel: node 0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 23 01:40:58 localhost kernel: node 0: [mem 0x0000000100000000-0x000000043fffffff]
Nov 23 01:40:58 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Nov 23 01:40:58 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 23 01:40:58 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 23 01:40:58 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 23 01:40:58 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 23 01:40:58 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 23 01:40:58 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 23 01:40:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 23 01:40:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 23 01:40:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 23 01:40:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 23 01:40:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 23 01:40:58 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 23 01:40:58 localhost kernel: TSC deadline timer available
Nov 23 01:40:58 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Nov 23 01:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 23 01:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 23 01:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 23 01:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 23 01:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 23 01:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 23 01:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 23 01:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 23 01:40:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 23 01:40:58 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 23 01:40:58 localhost kernel: Booting paravirtualized kernel on KVM
Nov 23 01:40:58 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 23 01:40:58 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 23 01:40:58 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Nov 23 01:40:58 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 23 01:40:58 localhost kernel: Fallback order for Node 0: 0
Nov 23 01:40:58 localhost kernel: Built 1 zonelists, mobility grouping on. Total pages: 4128475
Nov 23 01:40:58 localhost kernel: Policy zone: Normal
Nov 23 01:40:58 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 23 01:40:58 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Nov 23 01:40:58 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Nov 23 01:40:58 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 23 01:40:58 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 23 01:40:58 localhost kernel: software IO TLB: area num 8.
Nov 23 01:40:58 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Nov 23 01:40:58 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Nov 23 01:40:58 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 23 01:40:58 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Nov 23 01:40:58 localhost kernel: ftrace: allocated 176 pages with 3 groups
Nov 23 01:40:58 localhost kernel: Dynamic Preempt: voluntary
Nov 23 01:40:58 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 23 01:40:58 localhost kernel: rcu: #011RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 23 01:40:58 localhost kernel: #011Trampoline variant of Tasks RCU enabled.
Nov 23 01:40:58 localhost kernel: #011Rude variant of Tasks RCU enabled.
Nov 23 01:40:58 localhost kernel: #011Tracing variant of Tasks RCU enabled.
Nov 23 01:40:58 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 23 01:40:58 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 23 01:40:58 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 23 01:40:58 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 23 01:40:58 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 23 01:40:58 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Nov 23 01:40:58 localhost kernel: Console: colour VGA+ 80x25
Nov 23 01:40:58 localhost kernel: printk: console [tty0] enabled
Nov 23 01:40:58 localhost kernel: printk: console [ttyS0] enabled
Nov 23 01:40:58 localhost kernel: ACPI: Core revision 20211217
Nov 23 01:40:58 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 23 01:40:58 localhost kernel: x2apic enabled
Nov 23 01:40:58 localhost kernel: Switched APIC routing to physical x2apic.
Nov 23 01:40:58 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 23 01:40:58 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 23 01:40:58 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 23 01:40:58 localhost kernel: LSM: Security Framework initializing
Nov 23 01:40:58 localhost kernel: Yama: becoming mindful.
Nov 23 01:40:58 localhost kernel: SELinux: Initializing.
Nov 23 01:40:58 localhost kernel: LSM support for eBPF active
Nov 23 01:40:58 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Nov 23 01:40:58 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Nov 23 01:40:58 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 23 01:40:58 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 23 01:40:58 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 23 01:40:58 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 23 01:40:58 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 23 01:40:58 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Nov 23 01:40:58 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Nov 23 01:40:58 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 23 01:40:58 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 23 01:40:58 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 23 01:40:58 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 23 01:40:58 localhost kernel: Freeing SMP alternatives memory: 36K
Nov 23 01:40:58 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 23 01:40:58 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Nov 23 01:40:58 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 23 01:40:58 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 23 01:40:58 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 23 01:40:58 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 23 01:40:58 localhost kernel: ... version: 0
Nov 23 01:40:58 localhost kernel: ... bit width: 48
Nov 23 01:40:58 localhost kernel: ... generic registers: 6
Nov 23 01:40:58 localhost kernel: ... value mask: 0000ffffffffffff
Nov 23 01:40:58 localhost kernel: ... max period: 00007fffffffffff
Nov 23 01:40:58 localhost kernel: ... fixed-purpose events: 0
Nov 23 01:40:58 localhost kernel: ... event mask: 000000000000003f
Nov 23 01:40:58 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 23 01:40:58 localhost kernel: rcu: #011Max phase no-delay instances is 400.
Nov 23 01:40:58 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 23 01:40:58 localhost kernel: x86: Booting SMP configuration:
Nov 23 01:40:58 localhost kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7
Nov 23 01:40:58 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 23 01:40:58 localhost kernel: smpboot: Max logical packages: 8
Nov 23 01:40:58 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 23 01:40:58 localhost kernel: node 0 deferred pages initialised in 20ms
Nov 23 01:40:58 localhost kernel: devtmpfs: initialized
Nov 23 01:40:58 localhost kernel: x86/mm: Memory block size: 128MB
Nov 23 01:40:58 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 23 01:40:58 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 23 01:40:58 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 23 01:40:58 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 23 01:40:58 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Nov 23 01:40:58 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 23 01:40:58 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 23 01:40:58 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 23 01:40:58 localhost kernel: audit: type=2000 audit(1763880056.876:1): state=initialized audit_enabled=0 res=1
Nov 23 01:40:58 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 23 01:40:58 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 23 01:40:58 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 23 01:40:58 localhost kernel: cpuidle: using governor menu
Nov 23 01:40:58 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Nov 23 01:40:58 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 23 01:40:58 localhost kernel: PCI: Using configuration type 1 for base access
Nov 23 01:40:58 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 23 01:40:58 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 23 01:40:58 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Nov 23 01:40:58 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Nov 23 01:40:58 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Nov 23 01:40:58 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 23 01:40:58 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 23 01:40:58 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 23 01:40:58 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 23 01:40:58 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 23 01:40:58 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Nov 23 01:40:58 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Nov 23 01:40:58 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Nov 23 01:40:58 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 23 01:40:58 localhost kernel: ACPI: Interpreter enabled
Nov 23 01:40:58 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 23 01:40:58 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 23 01:40:58 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 23 01:40:58 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 23 01:40:58 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 23 01:40:58 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 23 01:40:58 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [3] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [4] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [5] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [6] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [7] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [8] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [9] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [10] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [11] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [12] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [13] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [14] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [15] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [16] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [17] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [18] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [19] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [20] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [21] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [22] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [23] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [24] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [25] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [26] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [27] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [28] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [29] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [30] registered
Nov 23 01:40:58 localhost kernel: acpiphp: Slot [31] registered
Nov 23 01:40:58 localhost kernel: PCI host bridge to bus 0000:00
Nov 23 01:40:58 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Nov 23 01:40:58 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Nov 23 01:40:58 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 23 01:40:58 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 23 01:40:58 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Nov 23 01:40:58 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 23 01:40:58 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Nov 23 01:40:58 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Nov 23 01:40:58 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Nov 23 01:40:58 localhost kernel: pci 0000:00:01.1: reg 0x20: [io 0xc140-0xc14f]
Nov 23 01:40:58 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Nov 23 01:40:58 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Nov 23 01:40:58 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Nov 23 01:40:58 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Nov 23 01:40:58 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Nov 23 01:40:58 localhost kernel: pci 0000:00:01.2: reg 0x20: [io 0xc100-0xc11f]
Nov 23 01:40:58 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Nov 23 01:40:58 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Nov 23 01:40:58 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Nov 23 01:40:58 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Nov 23 01:40:58 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Nov 23 01:40:58 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 23 01:40:58 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Nov 23 01:40:58 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Nov 23 01:40:58 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 23 01:40:58 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Nov 23 01:40:58 localhost kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Nov 23 01:40:58 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Nov 23 01:40:58 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 23 01:40:58 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Nov 23 01:40:58 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Nov 23 01:40:58 localhost kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Nov 23 01:40:58 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Nov 23 01:40:58 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 23 01:40:58 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Nov 23 01:40:58 localhost kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Nov 23 01:40:58 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 23 01:40:58 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Nov 23 01:40:58 localhost kernel: pci 0000:00:06.0: reg 0x10: [io 0xc120-0xc13f]
Nov 23 01:40:58 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 23 01:40:58 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 23 01:40:58 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 23 01:40:58 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 23 01:40:58 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 23 01:40:58 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 23 01:40:58 localhost kernel: iommu: Default domain type: Translated
Nov 23 01:40:58 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 23 01:40:58 localhost kernel: SCSI subsystem initialized
Nov 23 01:40:58 localhost kernel: ACPI: bus type USB registered
Nov 23 01:40:58 localhost kernel: usbcore: registered new interface driver usbfs
Nov 23 01:40:58 localhost kernel: usbcore: registered new interface driver hub
Nov 23 01:40:58 localhost kernel: usbcore: registered new device driver usb
Nov 23 01:40:58 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 23 01:40:58 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Nov 23 01:40:58 localhost kernel: PTP clock support registered
Nov 23 01:40:58 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 23 01:40:58 localhost kernel: NetLabel: Initializing
Nov 23 01:40:58 localhost kernel: NetLabel: domain hash size = 128
Nov 23 01:40:58 localhost kernel: NetLabel: protocols = UNLABELED CIPSOv4 CALIPSO
Nov 23 01:40:58 localhost kernel: NetLabel: unlabeled traffic allowed by default
Nov 23 01:40:58 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 23 01:40:58 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 23 01:40:58 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 23 01:40:58 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 23 01:40:58 localhost kernel: vgaarb: loaded
Nov 23 01:40:58 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 23 01:40:58 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 23 01:40:58 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 23 01:40:58 localhost kernel: pnp: PnP ACPI init
Nov 23 01:40:58 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 23 01:40:58 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 23 01:40:58 localhost kernel: NET: Registered PF_INET protocol family
Nov 23 01:40:58 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Nov 23 01:40:58 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Nov 23 01:40:58 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 23 01:40:58 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 23 01:40:58 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 23 01:40:58 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Nov 23 01:40:58 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Nov 23 01:40:58 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Nov 23 01:40:58 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Nov 23 01:40:58 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 23 01:40:58 localhost kernel: NET: Registered PF_XDP protocol family
Nov 23 01:40:58 localhost kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Nov 23 01:40:58 localhost kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Nov 23 01:40:58 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 23 01:40:58 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 23 01:40:58 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Nov 23 01:40:58 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 23 01:40:58 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 23 01:40:58 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 23 01:40:58 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 27411 usecs
Nov 23 01:40:58 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 23 01:40:58 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 23 01:40:58 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 23 01:40:58 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 23 01:40:58 localhost kernel: ACPI: bus type thunderbolt registered
Nov 23 01:40:58 localhost kernel: Initialise system trusted keyrings
Nov 23 01:40:58 localhost kernel: Key type blacklist registered
Nov 23 01:40:58 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Nov 23 01:40:58 localhost kernel: zbud: loaded
Nov 23 01:40:58 localhost kernel: integrity: Platform Keyring initialized
Nov 23 01:40:58 localhost kernel: NET: Registered PF_ALG protocol family
Nov 23 01:40:58 localhost kernel: xor: automatically using best checksumming function avx
Nov 23 01:40:58 localhost kernel: Key type asymmetric registered
Nov 23 01:40:58 localhost kernel: Asymmetric key parser 'x509' registered
Nov 23 01:40:58 localhost kernel: Running certificate verification selftests
Nov 23 01:40:58 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 23 01:40:58 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 23 01:40:58 localhost kernel: io scheduler mq-deadline registered
Nov 23 01:40:58 localhost kernel: io scheduler kyber registered
Nov 23 01:40:58 localhost kernel: io scheduler bfq registered
Nov 23 01:40:58 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 23 01:40:58 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 23 01:40:58 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 23 01:40:58 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 23 01:40:58 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 23 01:40:58 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 23 01:40:58 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 23 01:40:58 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 23 01:40:58 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 23 01:40:58 localhost kernel: Non-volatile memory driver v1.3
Nov 23 01:40:58 localhost kernel: rdac: device handler registered
Nov 23 01:40:58 localhost kernel: hp_sw: device handler registered
Nov 23 01:40:58 localhost kernel: emc: device handler registered
Nov 23 01:40:58 localhost kernel: alua: device handler registered
Nov 23 01:40:58 localhost kernel: libphy: Fixed MDIO Bus: probed
Nov 23 01:40:58 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Nov 23 01:40:58 localhost kernel: ehci-pci: EHCI PCI platform driver
Nov 23 01:40:58 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Nov 23 01:40:58 localhost kernel: ohci-pci: OHCI PCI platform driver
Nov 23 01:40:58 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Nov 23 01:40:58 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 23 01:40:58 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 23 01:40:58 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 23 01:40:58 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 23 01:40:58 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 23 01:40:58 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 23 01:40:58 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 23 01:40:58 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Nov 23 01:40:58 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 23 01:40:58 localhost kernel: hub 1-0:1.0: USB hub found
Nov 23 01:40:58 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 23 01:40:58 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 23 01:40:58 localhost kernel: usbserial: USB Serial support registered for generic
Nov 23 01:40:58 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 23 01:40:58 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 23 01:40:58 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 23 01:40:58 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 23 01:40:58 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 23 01:40:58 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 23 01:40:58 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 23 01:40:58 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-23T06:40:57 UTC (1763880057)
Nov 23 01:40:58 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 23 01:40:58 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 23 01:40:58 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 23 01:40:58 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 23 01:40:58 localhost kernel: usbcore: registered new interface driver usbhid
Nov 23 01:40:58 localhost kernel: usbhid: USB HID core driver
Nov 23 01:40:58 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 23 01:40:58 localhost kernel: Initializing XFRM netlink socket
Nov 23 01:40:58 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 23 01:40:58 localhost kernel: Segment Routing with IPv6
Nov 23 01:40:58 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 23 01:40:58 localhost kernel: mpls_gso: MPLS GSO support
Nov 23 01:40:58 localhost kernel: IPI shorthand broadcast: enabled
Nov 23 01:40:58 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 23 01:40:58 localhost kernel: AES CTR mode by8 optimization enabled
Nov 23 01:40:58 localhost kernel: sched_clock: Marking stable (745683194, 175432733)->(1040568262, -119452335)
Nov 23 01:40:58 localhost kernel: registered taskstats version 1
Nov 23 01:40:58 localhost kernel: Loading compiled-in X.509 certificates
Nov 23 01:40:58 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Nov 23 01:40:58 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 23 01:40:58 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 23 01:40:58 localhost kernel: zswap: loaded using pool lzo/zbud
Nov 23 01:40:58 localhost kernel: page_owner is disabled
Nov 23 01:40:58 localhost kernel: Key type big_key registered
Nov 23 01:40:58 localhost kernel: Freeing initrd memory: 74232K
Nov 23 01:40:58 localhost kernel: Key type encrypted registered
Nov 23 01:40:58 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 23 01:40:58 localhost kernel: Loading compiled-in module X.509 certificates
Nov 23 01:40:58 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Nov 23 01:40:58 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 23 01:40:58 localhost kernel: ima: No architecture policies found
Nov 23 01:40:58 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 23 01:40:58 localhost kernel: evm: Initialising EVM extended attributes:
Nov 23 01:40:58 localhost kernel: evm: security.selinux
Nov 23 01:40:58 localhost kernel: evm: security.SMACK64 (disabled)
Nov 23 01:40:58 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 23 01:40:58 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 23 01:40:58 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 23 01:40:58 localhost kernel: evm: security.apparmor (disabled)
Nov 23 01:40:58 localhost kernel: evm: security.ima
Nov 23 01:40:58 localhost kernel: evm: security.capability
Nov 23 01:40:58 localhost kernel: evm: HMAC attrs: 0x1
Nov 23 01:40:58 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 23 01:40:58 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 23 01:40:58 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 23 01:40:58 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 23 01:40:58 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 23 01:40:58 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 23 01:40:58 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 23 01:40:58 localhost kernel: Freeing unused decrypted memory: 2036K
Nov 23 01:40:58 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Nov 23 01:40:58 localhost kernel: Write protecting the kernel read-only data: 26624k
Nov 23 01:40:58 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Nov 23 01:40:58 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Nov 23 01:40:58 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 23 01:40:58 localhost kernel: Run /init as init process
Nov 23 01:40:58 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 23 01:40:58 localhost systemd[1]: Detected virtualization kvm.
Nov 23 01:40:58 localhost systemd[1]: Detected architecture x86-64.
Nov 23 01:40:58 localhost systemd[1]: Running in initrd.
Nov 23 01:40:58 localhost systemd[1]: No hostname configured, using default hostname.
Nov 23 01:40:58 localhost systemd[1]: Hostname set to .
Nov 23 01:40:58 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 23 01:40:58 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 23 01:40:58 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 23 01:40:58 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 23 01:40:58 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 23 01:40:58 localhost systemd[1]: Reached target Local File Systems.
Nov 23 01:40:58 localhost systemd[1]: Reached target Path Units.
Nov 23 01:40:58 localhost systemd[1]: Reached target Slice Units.
Nov 23 01:40:58 localhost systemd[1]: Reached target Swaps.
Nov 23 01:40:58 localhost systemd[1]: Reached target Timer Units.
Nov 23 01:40:58 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 23 01:40:58 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 23 01:40:58 localhost systemd[1]: Listening on Journal Socket.
Nov 23 01:40:58 localhost systemd[1]: Listening on udev Control Socket.
Nov 23 01:40:58 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 23 01:40:58 localhost systemd[1]: Reached target Socket Units.
Nov 23 01:40:58 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 23 01:40:58 localhost systemd[1]: Starting Journal Service...
Nov 23 01:40:58 localhost systemd[1]: Starting Load Kernel Modules...
Nov 23 01:40:58 localhost systemd[1]: Starting Create System Users...
Nov 23 01:40:58 localhost systemd[1]: Starting Setup Virtual Console...
Nov 23 01:40:58 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 23 01:40:58 localhost systemd[1]: Finished Load Kernel Modules.
Nov 23 01:40:58 localhost systemd-journald[284]: Journal started
Nov 23 01:40:58 localhost systemd-journald[284]: Runtime Journal (/run/log/journal/43895cafe6c247af84a56194e901da5c) is 8.0M, max 314.7M, 306.7M free.
Nov 23 01:40:58 localhost systemd-modules-load[285]: Module 'msr' is built in
Nov 23 01:40:58 localhost systemd[1]: Started Journal Service.
Nov 23 01:40:58 localhost systemd[1]: Finished Setup Virtual Console.
Nov 23 01:40:58 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 23 01:40:58 localhost systemd[1]: Starting dracut cmdline hook...
Nov 23 01:40:58 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 23 01:40:58 localhost systemd-sysusers[286]: Creating group 'sgx' with GID 997.
Nov 23 01:40:58 localhost systemd-sysusers[286]: Creating group 'users' with GID 100.
Nov 23 01:40:58 localhost systemd-sysusers[286]: Creating group 'dbus' with GID 81.
Nov 23 01:40:58 localhost systemd-sysusers[286]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 23 01:40:58 localhost systemd[1]: Finished Create System Users.
Nov 23 01:40:58 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 23 01:40:58 localhost dracut-cmdline[289]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Nov 23 01:40:58 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 23 01:40:58 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 23 01:40:58 localhost dracut-cmdline[289]: Using kernel command line parameters: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 23 01:40:58 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 23 01:40:58 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 23 01:40:58 localhost systemd[1]: Finished dracut cmdline hook.
Nov 23 01:40:58 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 23 01:40:58 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 23 01:40:58 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 23 01:40:58 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Nov 23 01:40:58 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 23 01:40:58 localhost kernel: RPC: Registered udp transport module.
Nov 23 01:40:58 localhost kernel: RPC: Registered tcp transport module.
Nov 23 01:40:58 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 23 01:40:58 localhost rpc.statd[408]: Version 2.5.4 starting
Nov 23 01:40:58 localhost rpc.statd[408]: Initializing NSM state
Nov 23 01:40:58 localhost rpc.idmapd[413]: Setting log level to 0
Nov 23 01:40:58 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 23 01:40:58 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 23 01:40:58 localhost systemd-udevd[426]: Using default interface naming scheme 'rhel-9.0'.
Nov 23 01:40:58 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 23 01:40:58 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 23 01:40:58 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 23 01:40:58 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 23 01:40:58 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 23 01:40:58 localhost systemd[1]: Reached target System Initialization.
Nov 23 01:40:58 localhost systemd[1]: Reached target Basic System.
Nov 23 01:40:58 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 23 01:40:58 localhost systemd[1]: Reached target Network.
Nov 23 01:40:58 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 23 01:40:58 localhost systemd[1]: Starting dracut initqueue hook...
Nov 23 01:40:58 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Nov 23 01:40:58 localhost kernel: scsi host0: ata_piix
Nov 23 01:40:58 localhost kernel: scsi host1: ata_piix
Nov 23 01:40:58 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Nov 23 01:40:58 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Nov 23 01:40:58 localhost systemd-udevd[438]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 01:40:58 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Nov 23 01:40:58 localhost kernel: GPT:20971519 != 838860799
Nov 23 01:40:58 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Nov 23 01:40:58 localhost kernel: GPT:20971519 != 838860799
Nov 23 01:40:58 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Nov 23 01:40:58 localhost kernel: vda: vda1 vda2 vda3 vda4
Nov 23 01:40:59 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Nov 23 01:40:59 localhost systemd[1]: Reached target Initrd Root Device.
Nov 23 01:40:59 localhost kernel: ata1: found unknown device (class 0)
Nov 23 01:40:59 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 23 01:40:59 localhost kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Nov 23 01:40:59 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 23 01:40:59 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 23 01:40:59 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 23 01:40:59 localhost systemd[1]: Finished dracut initqueue hook.
Nov 23 01:40:59 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 23 01:40:59 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 23 01:40:59 localhost systemd[1]: Reached target Remote File Systems.
Nov 23 01:40:59 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 23 01:40:59 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 23 01:40:59 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Nov 23 01:40:59 localhost systemd-fsck[512]: /usr/sbin/fsck.xfs: XFS file system.
Nov 23 01:40:59 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Nov 23 01:40:59 localhost systemd[1]: Mounting /sysroot...
Nov 23 01:40:59 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 23 01:40:59 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Nov 23 01:40:59 localhost kernel: XFS (vda4): Ending clean mount
Nov 23 01:40:59 localhost systemd[1]: Mounted /sysroot.
Nov 23 01:40:59 localhost systemd[1]: Reached target Initrd Root File System.
Nov 23 01:40:59 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 23 01:40:59 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 23 01:40:59 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 23 01:40:59 localhost systemd[1]: Reached target Initrd File Systems.
Nov 23 01:40:59 localhost systemd[1]: Reached target Initrd Default Target.
Nov 23 01:40:59 localhost systemd[1]: Starting dracut mount hook...
Nov 23 01:40:59 localhost systemd[1]: Finished dracut mount hook.
Nov 23 01:40:59 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 23 01:40:59 localhost rpc.idmapd[413]: exiting on signal 15
Nov 23 01:40:59 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 23 01:40:59 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 23 01:40:59 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 23 01:40:59 localhost systemd[1]: Stopped target Network.
Nov 23 01:40:59 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 23 01:40:59 localhost systemd[1]: Stopped target Timer Units.
Nov 23 01:40:59 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 23 01:40:59 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 23 01:40:59 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 23 01:40:59 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 23 01:40:59 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 23 01:40:59 localhost systemd[1]: Stopped target Basic System.
Nov 23 01:41:00 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 23 01:41:00 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 23 01:41:00 localhost systemd[1]: Stopped target Path Units.
Nov 23 01:41:00 localhost systemd[1]: Stopped target Remote File Systems.
Nov 23 01:41:00 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 23 01:41:00 localhost systemd[1]: Stopped target Slice Units.
Nov 23 01:41:00 localhost systemd[1]: Stopped target Socket Units.
Nov 23 01:41:00 localhost systemd[1]: Stopped target System Initialization.
Nov 23 01:41:00 localhost systemd[1]: Stopped target Local File Systems.
Nov 23 01:41:00 localhost systemd[1]: Stopped target Swaps.
Nov 23 01:41:00 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped dracut mount hook.
Nov 23 01:41:00 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 23 01:41:00 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 23 01:41:00 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 23 01:41:00 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 23 01:41:00 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 23 01:41:00 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped Load Kernel Modules.
Nov 23 01:41:00 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 23 01:41:00 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 23 01:41:00 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 23 01:41:00 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 23 01:41:00 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 23 01:41:00 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 23 01:41:00 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 23 01:41:00 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Closed udev Control Socket.
Nov 23 01:41:00 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Closed udev Kernel Socket.
Nov 23 01:41:00 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 23 01:41:00 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 23 01:41:00 localhost systemd[1]: Starting Cleanup udev Database...
Nov 23 01:41:00 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 23 01:41:00 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 23 01:41:00 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Stopped Create System Users.
Nov 23 01:41:00 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 23 01:41:00 localhost systemd[1]: Finished Cleanup udev Database.
Nov 23 01:41:00 localhost systemd[1]: Reached target Switch Root.
Nov 23 01:41:00 localhost systemd[1]: Starting Switch Root...
Nov 23 01:41:00 localhost systemd[1]: Switching root.
Nov 23 01:41:00 localhost systemd-journald[284]: Journal stopped
Nov 23 01:41:01 localhost systemd-journald[284]: Received SIGTERM from PID 1 (systemd).
Nov 23 01:41:01 localhost kernel: audit: type=1404 audit(1763880060.262:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 23 01:41:01 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 23 01:41:01 localhost kernel: SELinux: policy capability open_perms=1
Nov 23 01:41:01 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 23 01:41:01 localhost kernel: SELinux: policy capability always_check_network=0
Nov 23 01:41:01 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 23 01:41:01 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 23 01:41:01 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 23 01:41:01 localhost kernel: audit: type=1403 audit(1763880060.365:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 23 01:41:01 localhost systemd[1]: Successfully loaded SELinux policy in 107.169ms.
Nov 23 01:41:01 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 32.238ms.
Nov 23 01:41:01 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 23 01:41:01 localhost systemd[1]: Detected virtualization kvm.
Nov 23 01:41:01 localhost systemd[1]: Detected architecture x86-64.
Nov 23 01:41:01 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 01:41:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 01:41:01 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 23 01:41:01 localhost systemd[1]: Stopped Switch Root.
Nov 23 01:41:01 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 23 01:41:01 localhost systemd[1]: Created slice Slice /system/getty.
Nov 23 01:41:01 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 23 01:41:01 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 23 01:41:01 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 23 01:41:01 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Nov 23 01:41:01 localhost systemd[1]: Created slice User and Session Slice.
Nov 23 01:41:01 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 23 01:41:01 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 23 01:41:01 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 23 01:41:01 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 23 01:41:01 localhost systemd[1]: Stopped target Switch Root.
Nov 23 01:41:01 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 23 01:41:01 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 23 01:41:01 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 23 01:41:01 localhost systemd[1]: Reached target Path Units.
Nov 23 01:41:01 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 23 01:41:01 localhost systemd[1]: Reached target Slice Units.
Nov 23 01:41:01 localhost systemd[1]: Reached target Swaps.
Nov 23 01:41:01 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 23 01:41:01 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 23 01:41:01 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 23 01:41:01 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 23 01:41:01 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 23 01:41:01 localhost systemd[1]: Listening on udev Control Socket.
Nov 23 01:41:01 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 23 01:41:01 localhost systemd[1]: Mounting Huge Pages File System...
Nov 23 01:41:01 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 23 01:41:01 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 23 01:41:01 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 23 01:41:01 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 23 01:41:01 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 23 01:41:01 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 23 01:41:01 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 23 01:41:01 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 23 01:41:01 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 23 01:41:01 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 23 01:41:01 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 23 01:41:01 localhost systemd[1]: Stopped Journal Service.
Nov 23 01:41:01 localhost systemd[1]: Starting Journal Service...
Nov 23 01:41:01 localhost systemd[1]: Starting Load Kernel Modules...
Nov 23 01:41:01 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 23 01:41:01 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 23 01:41:01 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 23 01:41:01 localhost kernel: fuse: init (API version 7.36)
Nov 23 01:41:01 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 23 01:41:01 localhost systemd[1]: Mounted Huge Pages File System.
Nov 23 01:41:01 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 23 01:41:01 localhost systemd-journald[618]: Journal started
Nov 23 01:41:01 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/6e0090cd4cf296f54418e234b90f721c) is 8.0M, max 314.7M, 306.7M free.
Nov 23 01:41:01 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 23 01:41:01 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 23 01:41:01 localhost systemd-modules-load[619]: Module 'msr' is built in
Nov 23 01:41:01 localhost systemd[1]: Started Journal Service.
Nov 23 01:41:01 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 23 01:41:01 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 23 01:41:01 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 23 01:41:01 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 23 01:41:01 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 23 01:41:01 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 23 01:41:01 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 23 01:41:01 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 23 01:41:01 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 23 01:41:01 localhost kernel: ACPI: bus type drm_connector registered
Nov 23 01:41:01 localhost systemd[1]: Finished Load Kernel Modules.
Nov 23 01:41:01 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 23 01:41:01 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 23 01:41:01 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 23 01:41:01 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 23 01:41:01 localhost systemd[1]: Mounting FUSE Control File System...
Nov 23 01:41:01 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 23 01:41:01 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 23 01:41:01 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 23 01:41:01 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 23 01:41:01 localhost systemd[1]: Starting Load/Save Random Seed...
Nov 23 01:41:01 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 23 01:41:01 localhost systemd[1]: Starting Create System Users...
Nov 23 01:41:01 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/6e0090cd4cf296f54418e234b90f721c) is 8.0M, max 314.7M, 306.7M free.
Nov 23 01:41:01 localhost systemd-journald[618]: Received client request to flush runtime journal.
Nov 23 01:41:01 localhost systemd[1]: Mounted FUSE Control File System.
Nov 23 01:41:01 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 23 01:41:01 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 23 01:41:01 localhost systemd[1]: Finished Load/Save Random Seed.
Nov 23 01:41:01 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 23 01:41:01 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 23 01:41:01 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 23 01:41:01 localhost systemd-sysusers[630]: Creating group 'sgx' with GID 989.
Nov 23 01:41:01 localhost systemd-sysusers[630]: Creating group 'systemd-oom' with GID 988.
Nov 23 01:41:01 localhost systemd-sysusers[630]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Nov 23 01:41:01 localhost systemd[1]: Finished Create System Users.
Nov 23 01:41:01 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 23 01:41:01 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 23 01:41:01 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 23 01:41:01 localhost systemd[1]: Set up automount EFI System Partition Automount.
Nov 23 01:41:01 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 23 01:41:01 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 23 01:41:01 localhost systemd-udevd[635]: Using default interface naming scheme 'rhel-9.0'.
Nov 23 01:41:01 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 23 01:41:01 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 23 01:41:01 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 23 01:41:01 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 23 01:41:01 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 23 01:41:01 localhost systemd-udevd[647]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 01:41:01 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Nov 23 01:41:01 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 23 01:41:01 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Nov 23 01:41:01 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Nov 23 01:41:01 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 23 01:41:01 localhost systemd-fsck[685]: fsck.fat 4.2 (2021-01-31)
Nov 23 01:41:01 localhost systemd-fsck[685]: /dev/vda2: 12 files, 1782/51145 clusters
Nov 23 01:41:01 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Nov 23 01:41:01 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 23 01:41:01 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 23 01:41:01 localhost kernel: Console: switching to colour dummy device 80x25
Nov 23 01:41:01 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 23 01:41:01 localhost kernel: [drm] features: -context_init
Nov 23 01:41:01 localhost kernel: [drm] number of scanouts: 1
Nov 23 01:41:01 localhost kernel: [drm] number of cap sets: 0
Nov 23 01:41:01 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Nov 23 01:41:01 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Nov 23 01:41:01 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 23 01:41:01 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 23 01:41:01 localhost kernel: SVM: TSC scaling supported
Nov 23 01:41:01 localhost kernel: kvm: Nested Virtualization enabled
Nov 23 01:41:01 localhost kernel: SVM: kvm: Nested Paging enabled
Nov 23 01:41:01 localhost kernel: SVM: LBR virtualization supported
Nov 23 01:41:02 localhost systemd[1]: Mounting /boot...
Nov 23 01:41:02 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Nov 23 01:41:02 localhost kernel: XFS (vda3): Ending clean mount
Nov 23 01:41:02 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Nov 23 01:41:02 localhost systemd[1]: Mounted /boot.
Nov 23 01:41:02 localhost systemd[1]: Mounting /boot/efi...
Nov 23 01:41:02 localhost systemd[1]: Mounted /boot/efi.
Nov 23 01:41:02 localhost systemd[1]: Reached target Local File Systems.
Nov 23 01:41:02 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 23 01:41:02 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 23 01:41:02 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 23 01:41:02 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 23 01:41:02 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 23 01:41:02 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 23 01:41:02 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 23 01:41:02 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 718 (bootctl)
Nov 23 01:41:02 localhost systemd[1]: Starting File System Check on /dev/vda2...
Nov 23 01:41:02 localhost systemd[1]: Finished File System Check on /dev/vda2.
Nov 23 01:41:02 localhost systemd[1]: Mounting EFI System Partition Automount...
Nov 23 01:41:02 localhost systemd[1]: Mounted EFI System Partition Automount.
Nov 23 01:41:02 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 23 01:41:02 localhost systemd[1]: Finished Create Volatile Files and Directories. Nov 23 01:41:02 localhost systemd[1]: Starting Security Auditing Service... Nov 23 01:41:02 localhost systemd[1]: Starting RPC Bind... Nov 23 01:41:02 localhost systemd[1]: Starting Rebuild Journal Catalog... Nov 23 01:41:02 localhost auditd[727]: audit dispatcher initialized with q_depth=1200 and 1 active plugins Nov 23 01:41:02 localhost auditd[727]: Init complete, auditd 3.0.7 listening for events (startup state enable) Nov 23 01:41:02 localhost systemd[1]: Finished Rebuild Journal Catalog. Nov 23 01:41:02 localhost systemd[1]: Started RPC Bind. Nov 23 01:41:02 localhost augenrules[732]: /sbin/augenrules: No change Nov 23 01:41:02 localhost augenrules[742]: No rules Nov 23 01:41:02 localhost augenrules[742]: enabled 1 Nov 23 01:41:02 localhost augenrules[742]: failure 1 Nov 23 01:41:02 localhost augenrules[742]: pid 727 Nov 23 01:41:02 localhost augenrules[742]: rate_limit 0 Nov 23 01:41:02 localhost augenrules[742]: backlog_limit 8192 Nov 23 01:41:02 localhost augenrules[742]: lost 0 Nov 23 01:41:02 localhost augenrules[742]: backlog 0 Nov 23 01:41:02 localhost augenrules[742]: backlog_wait_time 60000 Nov 23 01:41:02 localhost augenrules[742]: backlog_wait_time_actual 0 
Nov 23 01:41:02 localhost systemd[1]: Started Security Auditing Service. Nov 23 01:41:02 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP... Nov 23 01:41:02 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP. Nov 23 01:41:02 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache. Nov 23 01:41:02 localhost systemd[1]: Starting Update is Completed... Nov 23 01:41:02 localhost systemd[1]: Finished Update is Completed. Nov 23 01:41:02 localhost systemd[1]: Reached target System Initialization. Nov 23 01:41:02 localhost systemd[1]: Started dnf makecache --timer. Nov 23 01:41:02 localhost systemd[1]: Started Daily rotation of log files. Nov 23 01:41:02 localhost systemd[1]: Started Daily Cleanup of Temporary Directories. Nov 23 01:41:02 localhost systemd[1]: Reached target Timer Units. Nov 23 01:41:02 localhost systemd[1]: Listening on D-Bus System Message Bus Socket. Nov 23 01:41:02 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket. Nov 23 01:41:02 localhost systemd[1]: Reached target Socket Units. Nov 23 01:41:02 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)... Nov 23 01:41:02 localhost systemd[1]: Starting D-Bus System Message Bus... Nov 23 01:41:02 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Nov 23 01:41:02 localhost systemd[1]: Started D-Bus System Message Bus. Nov 23 01:41:02 localhost systemd[1]: Reached target Basic System. Nov 23 01:41:02 localhost systemd[1]: Starting NTP client/server... 
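The augenrules entries above are a key/value dump of the kernel audit status (enabled, backlog_limit, and so on). A minimal sketch of folding such lines into a dict for inspection; the helper name `parse_audit_status` and the trimmed sample lines are illustrative, not part of the log or of auditd's tooling:

```python
import re

def parse_audit_status(lines):
    """Collect 'key value' pairs (e.g. 'backlog_limit 8192') from
    augenrules/auditctl status output into a dict of ints."""
    status = {}
    for line in lines:
        m = re.match(r"^(\w+)\s+(\d+)$", line.strip())
        if m:
            status[m.group(1)] = int(m.group(2))
    return status

# Sample reduced to the message text after the 'augenrules[742]:' prefix.
sample = ["enabled 1", "failure 1", "pid 727", "rate_limit 0",
          "backlog_limit 8192", "lost 0", "backlog 0",
          "backlog_wait_time 60000", "backlog_wait_time_actual 0"]
print(parse_audit_status(sample)["backlog_limit"])  # 8192
```

On a live system the same figures come from `auditctl -s`; parsing the boot log is only useful after the fact.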
Nov 23 01:41:02 localhost journal[752]: Ready Nov 23 01:41:02 localhost systemd[1]: Starting Restore /run/initramfs on shutdown... Nov 23 01:41:02 localhost systemd[1]: Started irqbalance daemon. Nov 23 01:41:02 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload). Nov 23 01:41:02 localhost systemd[1]: Starting System Logging Service... Nov 23 01:41:02 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 23 01:41:02 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 23 01:41:02 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 23 01:41:02 localhost systemd[1]: Reached target sshd-keygen.target. Nov 23 01:41:02 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met. Nov 23 01:41:02 localhost systemd[1]: Reached target User and Group Name Lookups. Nov 23 01:41:02 localhost systemd[1]: Starting User Login Management... Nov 23 01:41:02 localhost rsyslogd[760]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="760" x-info="https://www.rsyslog.com"] start Nov 23 01:41:02 localhost rsyslogd[760]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ] Nov 23 01:41:02 localhost systemd[1]: Started System Logging Service. Nov 23 01:41:02 localhost systemd[1]: Finished Restore /run/initramfs on shutdown. 
Nov 23 01:41:02 localhost chronyd[767]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG) Nov 23 01:41:02 localhost chronyd[767]: Using right/UTC timezone to obtain leap second data Nov 23 01:41:02 localhost chronyd[767]: Loaded seccomp filter (level 2) Nov 23 01:41:02 localhost systemd[1]: Started NTP client/server. Nov 23 01:41:02 localhost systemd-logind[761]: New seat seat0. Nov 23 01:41:02 localhost systemd-logind[761]: Watching system buttons on /dev/input/event0 (Power Button) Nov 23 01:41:02 localhost systemd-logind[761]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard) Nov 23 01:41:02 localhost systemd[1]: Started User Login Management. Nov 23 01:41:02 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 01:41:03 localhost cloud-init[771]: Cloud-init v. 22.1-9.el9 running 'init-local' at Sun, 23 Nov 2025 06:41:03 +0000. Up 6.37 seconds. Nov 23 01:41:03 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpqqzl606r.mount: Deactivated successfully. Nov 23 01:41:03 localhost systemd[1]: Starting Hostname Service... Nov 23 01:41:03 localhost systemd[1]: Started Hostname Service. Nov 23 01:41:03 localhost systemd-hostnamed[785]: Hostname set to (static) Nov 23 01:41:03 localhost systemd[1]: Finished Initial cloud-init job (pre-networking). Nov 23 01:41:03 localhost systemd[1]: Reached target Preparation for Network. Nov 23 01:41:03 localhost systemd[1]: Starting Network Manager... Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.7770] NetworkManager (version 1.42.2-1.el9) is starting... (boot:2e694857-c83c-42a3-a300-fcad2ba2b06e) Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.7775] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf) Nov 23 01:41:03 localhost systemd[1]: Started Network Manager. 
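The entries above interleave wall-clock syslog timestamps with cloud-init's own uptime counter ("Up 6.37 seconds"). To measure gaps between log entries directly, the leading timestamp can be parsed; note that classic syslog timestamps omit the year, so it has to be supplied externally (2025 here, taken from the cloud-init line). The helper `syslog_ts` is a hypothetical convenience, not an existing tool:

```python
from datetime import datetime

def syslog_ts(line, year=2025):
    """Parse the leading 'Mon DD HH:MM:SS' timestamp of a syslog line.
    The year is not present in the log and must be supplied."""
    ts = datetime.strptime(line[:15], "%b %d %H:%M:%S")
    return ts.replace(year=year)

t0 = syslog_ts("Nov 23 01:41:01 localhost systemd[1]: Started ...")
t1 = syslog_ts("Nov 23 01:41:08 localhost sshd[1134]: main: ...")
print((t1 - t0).total_seconds())  # 7.0
```

Second-granularity timestamps only bound the gap; for finer resolution, `journalctl -o short-precise` on the original journal is the better source.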
Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.7823] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager" Nov 23 01:41:03 localhost systemd[1]: Reached target Network. Nov 23 01:41:03 localhost systemd[1]: Starting Network Manager Wait Online... Nov 23 01:41:03 localhost systemd[1]: Starting GSSAPI Proxy Daemon... Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.7906] manager[0x5622fc1ef020]: monitoring kernel firmware directory '/lib/firmware'. Nov 23 01:41:03 localhost systemd[1]: Starting Enable periodic update of entitlement certificates.... Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.7977] hostname: hostname: using hostnamed Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.7978] hostname: static hostname changed from (none) to "np0005532585.novalocal" Nov 23 01:41:03 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.7994] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto) Nov 23 01:41:03 localhost systemd[1]: Started Enable periodic update of entitlement certificates.. Nov 23 01:41:03 localhost systemd[1]: Started GSSAPI Proxy Daemon. Nov 23 01:41:03 localhost systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab). Nov 23 01:41:03 localhost systemd[1]: Reached target NFS client services. Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8123] manager[0x5622fc1ef020]: rfkill: Wi-Fi hardware radio set enabled Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8124] manager[0x5622fc1ef020]: rfkill: WWAN hardware radio set enabled Nov 23 01:41:03 localhost systemd[1]: Reached target Preparation for Remote File Systems. Nov 23 01:41:03 localhost systemd[1]: Reached target Remote File Systems. 
Nov 23 01:41:03 localhost systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8212] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so) Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8213] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8222] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8223] manager: Networking is enabled by state file Nov 23 01:41:03 localhost systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch. Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8270] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so") Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8271] settings: Loaded settings plugin: keyfile (internal) Nov 23 01:41:03 localhost systemd[1]: Starting Network Manager Script Dispatcher Service... 
Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8329] dhcp: init: Using DHCP client 'internal' Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8335] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1) Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8373] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8389] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external') Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8405] device (lo): Activation: starting connection 'lo' (75f95e5d-82c2-442b-91f8-cfa8260985bf) Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8422] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2) Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8434] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external') Nov 23 01:41:03 localhost systemd[1]: Started Network Manager Script Dispatcher Service. 
Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8500] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external') Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8519] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external') Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8520] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external') Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8522] device (eth0): carrier: link connected Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8539] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external') Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8542] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed') Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8547] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8552] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8553] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed') Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8555] manager: NetworkManager state is now CONNECTING Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8556] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed') Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8561] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external') Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8564] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed') Nov 23 01:41:03 
localhost NetworkManager[790]: [1763880063.8566] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds) Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8571] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external') Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8575] device (lo): Activation: successful, device activated. Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8616] dhcp4 (eth0): state changed new lease, address=38.102.83.198 Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8619] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8637] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed') Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8650] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed') Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8651] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed') Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8654] manager: NetworkManager state is now CONNECTED_SITE Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8657] device (eth0): Activation: successful, device activated. Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8660] manager: NetworkManager state is now CONNECTED_GLOBAL Nov 23 01:41:03 localhost NetworkManager[790]: [1763880063.8664] manager: startup complete Nov 23 01:41:03 localhost systemd[1]: Finished Network Manager Wait Online. Nov 23 01:41:03 localhost systemd[1]: Starting Initial cloud-init job (metadata service crawler)... Nov 23 01:41:04 localhost cloud-init[944]: Cloud-init v. 22.1-9.el9 running 'init' at Sun, 23 Nov 2025 06:41:04 +0000. Up 7.28 seconds. 
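The NetworkManager entries above trace each device through its activation states (unmanaged, unavailable, disconnected, prepare, config, ip-config, ip-check, secondaries, activated). A small sketch for pulling those transitions out of such a log, assuming the message format shown above; the `transitions` helper is illustrative:

```python
import re

STATE_RE = re.compile(
    r"device \((\w+)\): state change: ([\w-]+) -> ([\w-]+)")

def transitions(lines, device):
    """Return (from_state, to_state) pairs NetworkManager logged
    for the given device, in log order."""
    return [(m.group(2), m.group(3))
            for m in (STATE_RE.search(line) for line in lines)
            if m and m.group(1) == device]

log = [
    "device (eth0): state change: ip-config -> ip-check (reason 'none')",
    "device (eth0): state change: ip-check -> secondaries (reason 'none')",
    "device (lo): state change: secondaries -> activated (reason 'none')",
]
print(transitions(log, "eth0"))
# [('ip-config', 'ip-check'), ('ip-check', 'secondaries')]
```

A stalled activation shows up as a device whose last transition never reaches 'activated', which makes this a quick first check when boot-time networking misbehaves.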
Nov 23 01:41:04 localhost cloud-init[944]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++ Nov 23 01:41:04 localhost cloud-init[944]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+ Nov 23 01:41:04 localhost cloud-init[944]: ci-info: | Device | Up | Address | Mask | Scope | Hw-Address | Nov 23 01:41:04 localhost cloud-init[944]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+ Nov 23 01:41:04 localhost cloud-init[944]: ci-info: | eth0 | True | 38.102.83.198 | 255.255.255.0 | global | fa:16:3e:72:a3:51 | Nov 23 01:41:04 localhost cloud-init[944]: ci-info: | eth0 | True | fe80::f816:3eff:fe72:a351/64 | . | link | fa:16:3e:72:a3:51 | Nov 23 01:41:04 localhost cloud-init[944]: ci-info: | lo | True | 127.0.0.1 | 255.0.0.0 | host | . | Nov 23 01:41:04 localhost cloud-init[944]: ci-info: | lo | True | ::1/128 | . | host | . | Nov 23 01:41:04 localhost cloud-init[944]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+ Nov 23 01:41:04 localhost cloud-init[944]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++ Nov 23 01:41:04 localhost cloud-init[944]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+ Nov 23 01:41:04 localhost cloud-init[944]: ci-info: | Route | Destination | Gateway | Genmask | Interface | Flags | Nov 23 01:41:04 localhost cloud-init[944]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+ Nov 23 01:41:04 localhost cloud-init[944]: ci-info: | 0 | 0.0.0.0 | 38.102.83.1 | 0.0.0.0 | eth0 | UG | Nov 23 01:41:04 localhost cloud-init[944]: ci-info: | 1 | 38.102.83.0 | 0.0.0.0 | 255.255.255.0 | eth0 | U | Nov 23 01:41:04 localhost cloud-init[944]: ci-info: | 2 | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 | eth0 | UGH | Nov 
23 01:41:04 localhost cloud-init[944]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+ Nov 23 01:41:04 localhost cloud-init[944]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++ Nov 23 01:41:04 localhost cloud-init[944]: ci-info: +-------+-------------+---------+-----------+-------+ Nov 23 01:41:04 localhost cloud-init[944]: ci-info: | Route | Destination | Gateway | Interface | Flags | Nov 23 01:41:04 localhost cloud-init[944]: ci-info: +-------+-------------+---------+-----------+-------+ Nov 23 01:41:04 localhost cloud-init[944]: ci-info: | 1 | fe80::/64 | :: | eth0 | U | Nov 23 01:41:04 localhost cloud-init[944]: ci-info: | 3 | multicast | :: | eth0 | U | Nov 23 01:41:04 localhost cloud-init[944]: ci-info: +-------+-------------+---------+-----------+-------+ Nov 23 01:41:04 localhost systemd[1]: Starting Authorization Manager... Nov 23 01:41:04 localhost systemd[1]: Started Dynamic System Tuning Daemon. Nov 23 01:41:04 localhost polkitd[1037]: Started polkitd version 0.117 Nov 23 01:41:04 localhost systemd[1]: Started Authorization Manager. Nov 23 01:41:08 localhost cloud-init[944]: Generating public/private rsa key pair. Nov 23 01:41:08 localhost cloud-init[944]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key Nov 23 01:41:08 localhost cloud-init[944]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub Nov 23 01:41:08 localhost cloud-init[944]: The key fingerprint is: Nov 23 01:41:08 localhost cloud-init[944]: SHA256:ZQrZ/pYycRFaGUbuepttPoytYaqMQuJ642qdOqq+xXI root@np0005532585.novalocal Nov 23 01:41:08 localhost cloud-init[944]: The key's randomart image is: Nov 23 01:41:08 localhost cloud-init[944]: +---[RSA 3072]----+ Nov 23 01:41:08 localhost cloud-init[944]: | .*o | Nov 23 01:41:08 localhost cloud-init[944]: | o =.. | Nov 23 01:41:08 localhost cloud-init[944]: | o o = | Nov 23 01:41:08 localhost cloud-init[944]: | o = . 
| Nov 23 01:41:08 localhost cloud-init[944]: | S o | Nov 23 01:41:08 localhost cloud-init[944]: |. o = . | Nov 23 01:41:08 localhost cloud-init[944]: |.+.E. + B+ | Nov 23 01:41:08 localhost cloud-init[944]: | +Bo o B.== | Nov 23 01:41:08 localhost cloud-init[944]: |@B=o. o.. ++o. | Nov 23 01:41:08 localhost cloud-init[944]: +----[SHA256]-----+ Nov 23 01:41:08 localhost cloud-init[944]: Generating public/private ecdsa key pair. Nov 23 01:41:08 localhost cloud-init[944]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key Nov 23 01:41:08 localhost cloud-init[944]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub Nov 23 01:41:08 localhost cloud-init[944]: The key fingerprint is: Nov 23 01:41:08 localhost cloud-init[944]: SHA256:uLBD0n17Ltuhg1Ary+3sqo/DK6blrJs9Mbvc6BVR/xw root@np0005532585.novalocal Nov 23 01:41:08 localhost cloud-init[944]: The key's randomart image is: Nov 23 01:41:08 localhost cloud-init[944]: +---[ECDSA 256]---+ Nov 23 01:41:08 localhost cloud-init[944]: | . | Nov 23 01:41:08 localhost cloud-init[944]: | . . | Nov 23 01:41:08 localhost cloud-init[944]: | . . E | Nov 23 01:41:08 localhost cloud-init[944]: | . o.. o . | Nov 23 01:41:08 localhost cloud-init[944]: | . =.o.S o | Nov 23 01:41:08 localhost cloud-init[944]: | ooo+.o . | Nov 23 01:41:08 localhost cloud-init[944]: | ..=+=... o | Nov 23 01:41:08 localhost cloud-init[944]: |.O==ooo o= . | Nov 23 01:41:08 localhost cloud-init[944]: |B=%B+++ o+o | Nov 23 01:41:08 localhost cloud-init[944]: +----[SHA256]-----+ Nov 23 01:41:08 localhost cloud-init[944]: Generating public/private ed25519 key pair. 
Nov 23 01:41:08 localhost cloud-init[944]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key Nov 23 01:41:08 localhost cloud-init[944]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub Nov 23 01:41:08 localhost cloud-init[944]: The key fingerprint is: Nov 23 01:41:08 localhost cloud-init[944]: SHA256:tYIAbgiV17JZ+ywyFV1S7S0smdEHVQupK5wep/i3maw root@np0005532585.novalocal Nov 23 01:41:08 localhost cloud-init[944]: The key's randomart image is: Nov 23 01:41:08 localhost cloud-init[944]: +--[ED25519 256]--+ Nov 23 01:41:08 localhost cloud-init[944]: |..o. . ..ooo.+o..| Nov 23 01:41:08 localhost cloud-init[944]: |.o..o + ... o.o .| Nov 23 01:41:08 localhost cloud-init[944]: |. o..= o .*.o . | Nov 23 01:41:08 localhost cloud-init[944]: | . o.o. .+o+ . | Nov 23 01:41:08 localhost cloud-init[944]: | ..oS o... | Nov 23 01:41:08 localhost cloud-init[944]: | o . o* o | Nov 23 01:41:08 localhost cloud-init[944]: | o .o = | Nov 23 01:41:08 localhost cloud-init[944]: | . o..o | Nov 23 01:41:08 localhost cloud-init[944]: | .Eo=. | Nov 23 01:41:08 localhost cloud-init[944]: +----[SHA256]-----+ Nov 23 01:41:08 localhost sm-notify[1133]: Version 2.5.4 starting Nov 23 01:41:08 localhost systemd[1]: Finished Initial cloud-init job (metadata service crawler). Nov 23 01:41:08 localhost systemd[1]: Reached target Cloud-config availability. Nov 23 01:41:08 localhost systemd[1]: Reached target Network is Online. Nov 23 01:41:08 localhost systemd[1]: Starting Apply the settings specified in cloud-config... Nov 23 01:41:08 localhost sshd[1134]: main: sshd: ssh-rsa algorithm is disabled Nov 23 01:41:08 localhost systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot). Nov 23 01:41:08 localhost systemd[1]: Starting Crash recovery kernel arming... Nov 23 01:41:08 localhost systemd[1]: Starting Notify NFS peers of a restart... 
Nov 23 01:41:08 localhost sshd[1145]: main: sshd: ssh-rsa algorithm is disabled Nov 23 01:41:08 localhost systemd[1]: Starting OpenSSH server daemon... Nov 23 01:41:08 localhost sshd[1156]: main: sshd: ssh-rsa algorithm is disabled Nov 23 01:41:08 localhost systemd[1]: Starting Permit User Sessions... Nov 23 01:41:08 localhost systemd[1]: Started Notify NFS peers of a restart. Nov 23 01:41:08 localhost sshd[1167]: main: sshd: ssh-rsa algorithm is disabled Nov 23 01:41:08 localhost systemd[1]: Finished Permit User Sessions. Nov 23 01:41:08 localhost systemd[1]: Started Command Scheduler. Nov 23 01:41:08 localhost systemd[1]: Started Getty on tty1. Nov 23 01:41:08 localhost systemd[1]: Started Serial Getty on ttyS0. Nov 23 01:41:08 localhost systemd[1]: Reached target Login Prompts. Nov 23 01:41:08 localhost systemd[1]: Started OpenSSH server daemon. Nov 23 01:41:08 localhost systemd[1]: Reached target Multi-User System. Nov 23 01:41:08 localhost systemd[1]: Starting Record Runlevel Change in UTMP... Nov 23 01:41:08 localhost systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully. Nov 23 01:41:08 localhost systemd[1]: Finished Record Runlevel Change in UTMP. Nov 23 01:41:08 localhost sshd[1174]: main: sshd: ssh-rsa algorithm is disabled Nov 23 01:41:08 localhost sshd[1182]: main: sshd: ssh-rsa algorithm is disabled Nov 23 01:41:08 localhost kdumpctl[1137]: kdump: No kdump initial ramdisk found. Nov 23 01:41:08 localhost kdumpctl[1137]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img Nov 23 01:41:08 localhost sshd[1199]: main: sshd: ssh-rsa algorithm is disabled Nov 23 01:41:08 localhost sshd[1212]: main: sshd: ssh-rsa algorithm is disabled Nov 23 01:41:08 localhost cloud-init[1263]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Sun, 23 Nov 2025 06:41:08 +0000. Up 11.65 seconds. 
Nov 23 01:41:08 localhost sshd[1262]: main: sshd: ssh-rsa algorithm is disabled Nov 23 01:41:08 localhost sshd[1279]: main: sshd: ssh-rsa algorithm is disabled Nov 23 01:41:08 localhost systemd[1]: Finished Apply the settings specified in cloud-config. Nov 23 01:41:08 localhost systemd[1]: Starting Execute cloud user/final scripts... Nov 23 01:41:08 localhost chronyd[767]: Selected source 167.160.187.12 (2.rhel.pool.ntp.org) Nov 23 01:41:08 localhost chronyd[767]: System clock TAI offset set to 37 seconds Nov 23 01:41:08 localhost dracut[1436]: dracut-057-21.git20230214.el9 Nov 23 01:41:08 localhost cloud-init[1456]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Sun, 23 Nov 2025 06:41:08 +0000. Up 12.05 seconds. Nov 23 01:41:08 localhost dracut[1438]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64 Nov 23 01:41:09 localhost cloud-init[1513]: ############################################################# Nov 23 01:41:09 localhost cloud-init[1518]: -----BEGIN SSH HOST KEY FINGERPRINTS----- Nov 23 01:41:09 localhost cloud-init[1532]: 256 SHA256:uLBD0n17Ltuhg1Ary+3sqo/DK6blrJs9Mbvc6BVR/xw root@np0005532585.novalocal (ECDSA) Nov 23 01:41:09 localhost cloud-init[1540]: 256 SHA256:tYIAbgiV17JZ+ywyFV1S7S0smdEHVQupK5wep/i3maw root@np0005532585.novalocal (ED25519) Nov 23 01:41:09 localhost cloud-init[1546]: 3072 SHA256:ZQrZ/pYycRFaGUbuepttPoytYaqMQuJ642qdOqq+xXI root@np0005532585.novalocal (RSA) Nov 23 01:41:09 localhost cloud-init[1549]: -----END SSH HOST KEY FINGERPRINTS----- Nov 23 01:41:09 localhost cloud-init[1552]: 
############################################################# Nov 23 01:41:09 localhost cloud-init[1456]: Cloud-init v. 22.1-9.el9 finished at Sun, 23 Nov 2025 06:41:09 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0]. Up 12.30 seconds Nov 23 01:41:09 localhost dracut[1438]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found! Nov 23 01:41:09 localhost dracut[1438]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found! Nov 23 01:41:09 localhost dracut[1438]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found! Nov 23 01:41:09 localhost dracut[1438]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found! Nov 23 01:41:09 localhost dracut[1438]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found! Nov 23 01:41:09 localhost dracut[1438]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found! Nov 23 01:41:09 localhost dracut[1438]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found! Nov 23 01:41:09 localhost dracut[1438]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found! Nov 23 01:41:09 localhost dracut[1438]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found! Nov 23 01:41:09 localhost dracut[1438]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found! Nov 23 01:41:09 localhost dracut[1438]: dracut module 'connman' will not be installed, because command 'connmand' could not be found! 
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 23 01:41:09 localhost systemd[1]: Reloading Network Manager...
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 23 01:41:09 localhost NetworkManager[790]: [1763880069.2572] audit: op="reload" arg="0" pid=1629 uid=0 result="success"
Nov 23 01:41:09 localhost NetworkManager[790]: [1763880069.2580] config: signal: SIGHUP (no changes from disk)
Nov 23 01:41:09 localhost dracut[1438]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 23 01:41:09 localhost systemd[1]: Reloaded Network Manager.
Nov 23 01:41:09 localhost systemd[1]: Finished Execute cloud user/final scripts.
Nov 23 01:41:09 localhost systemd[1]: Reached target Cloud-init target.
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: memstrack is not available
Nov 23 01:41:09 localhost dracut[1438]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 23 01:41:09 localhost dracut[1438]: memstrack is not available
Nov 23 01:41:09 localhost dracut[1438]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 23 01:41:10 localhost dracut[1438]: *** Including module: systemd ***
Nov 23 01:41:10 localhost dracut[1438]: *** Including module: systemd-initrd ***
Nov 23 01:41:10 localhost dracut[1438]: *** Including module: i18n ***
Nov 23 01:41:10 localhost dracut[1438]: No KEYMAP configured.
Nov 23 01:41:10 localhost dracut[1438]: *** Including module: drm ***
Nov 23 01:41:10 localhost dracut[1438]: *** Including module: prefixdevname ***
Nov 23 01:41:10 localhost dracut[1438]: *** Including module: kernel-modules ***
Nov 23 01:41:11 localhost dracut[1438]: *** Including module: kernel-modules-extra ***
Nov 23 01:41:11 localhost dracut[1438]: *** Including module: qemu ***
Nov 23 01:41:11 localhost dracut[1438]: *** Including module: fstab-sys ***
Nov 23 01:41:11 localhost dracut[1438]: *** Including module: rootfs-block ***
Nov 23 01:41:11 localhost dracut[1438]: *** Including module: terminfo ***
Nov 23 01:41:11 localhost dracut[1438]: *** Including module: udev-rules ***
Nov 23 01:41:12 localhost dracut[1438]: Skipping udev rule: 91-permissions.rules
Nov 23 01:41:12 localhost dracut[1438]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 23 01:41:12 localhost dracut[1438]: *** Including module: virtiofs ***
Nov 23 01:41:12 localhost dracut[1438]: *** Including module: dracut-systemd ***
Nov 23 01:41:12 localhost dracut[1438]: *** Including module: usrmount ***
Nov 23 01:41:12 localhost dracut[1438]: *** Including module: base ***
Nov 23 01:41:12 localhost dracut[1438]: *** Including module: fs-lib ***
Nov 23 01:41:12 localhost dracut[1438]: *** Including module: kdumpbase ***
Nov 23 01:41:12 localhost dracut[1438]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl module: mangling fw_dir
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: configuration "intel" is ignored
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 23 01:41:12 localhost dracut[1438]: microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 23 01:41:13 localhost dracut[1438]: microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Nov 23 01:41:13 localhost dracut[1438]: *** Including module: shutdown ***
Nov 23 01:41:13 localhost dracut[1438]: *** Including module: squash ***
Nov 23 01:41:13 localhost dracut[1438]: *** Including modules done ***
Nov 23 01:41:13 localhost dracut[1438]: *** Installing kernel module dependencies ***
Nov 23 01:41:13 localhost dracut[1438]: *** Installing kernel module dependencies done ***
Nov 23 01:41:13 localhost dracut[1438]: *** Resolving executable dependencies ***
Nov 23 01:41:13 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 01:41:14 localhost dracut[1438]: *** Resolving executable dependencies done ***
Nov 23 01:41:14 localhost dracut[1438]: *** Hardlinking files ***
Nov 23 01:41:15 localhost dracut[1438]: Mode: real
Nov 23 01:41:15 localhost dracut[1438]: Files: 1099
Nov 23 01:41:15 localhost dracut[1438]: Linked: 3 files
Nov 23 01:41:15 localhost dracut[1438]: Compared: 0 xattrs
Nov 23 01:41:15 localhost dracut[1438]: Compared: 373 files
Nov 23 01:41:15 localhost dracut[1438]: Saved: 61.04 KiB
Nov 23 01:41:15 localhost dracut[1438]: Duration: 0.048811 seconds
Nov 23 01:41:15 localhost dracut[1438]: *** Hardlinking files done ***
Nov 23 01:41:15 localhost dracut[1438]: Could not find 'strip'. Not stripping the initramfs.
Nov 23 01:41:15 localhost dracut[1438]: *** Generating early-microcode cpio image ***
Nov 23 01:41:15 localhost dracut[1438]: *** Constructing AuthenticAMD.bin ***
Nov 23 01:41:15 localhost dracut[1438]: *** Store current command line parameters ***
Nov 23 01:41:15 localhost dracut[1438]: Stored kernel commandline:
Nov 23 01:41:15 localhost dracut[1438]: No dracut internal kernel commandline stored in the initramfs
Nov 23 01:41:15 localhost dracut[1438]: *** Install squash loader ***
Nov 23 01:41:15 localhost dracut[1438]: *** Squashing the files inside the initramfs ***
Nov 23 01:41:17 localhost dracut[1438]: *** Squashing the files inside the initramfs done ***
Nov 23 01:41:17 localhost dracut[1438]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Nov 23 01:41:17 localhost dracut[1438]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Nov 23 01:41:17 localhost kdumpctl[1137]: kdump: kexec: loaded kdump kernel
Nov 23 01:41:17 localhost kdumpctl[1137]: kdump: Starting kdump: [OK]
Nov 23 01:41:17 localhost systemd[1]: Finished Crash recovery kernel arming.
Nov 23 01:41:17 localhost systemd[1]: Startup finished in 1.232s (kernel) + 2.196s (initrd) + 17.645s (userspace) = 21.075s.
Nov 23 01:41:33 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 23 01:41:43 localhost sshd[4177]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:41:43 localhost systemd[1]: Created slice User Slice of UID 1000.
Nov 23 01:41:43 localhost systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 23 01:41:43 localhost systemd-logind[761]: New session 1 of user zuul.
Nov 23 01:41:43 localhost systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 23 01:41:43 localhost systemd[1]: Starting User Manager for UID 1000...
Nov 23 01:41:43 localhost systemd[4181]: Queued start job for default target Main User Target.
Nov 23 01:41:43 localhost systemd[4181]: Created slice User Application Slice.
Nov 23 01:41:43 localhost systemd[4181]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 23 01:41:43 localhost systemd[4181]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 01:41:43 localhost systemd[4181]: Reached target Paths.
Nov 23 01:41:43 localhost systemd[4181]: Reached target Timers.
Nov 23 01:41:43 localhost systemd[4181]: Starting D-Bus User Message Bus Socket...
Nov 23 01:41:43 localhost systemd[4181]: Starting Create User's Volatile Files and Directories...
Nov 23 01:41:43 localhost systemd[4181]: Finished Create User's Volatile Files and Directories.
Nov 23 01:41:43 localhost systemd[4181]: Listening on D-Bus User Message Bus Socket.
Nov 23 01:41:43 localhost systemd[4181]: Reached target Sockets.
Nov 23 01:41:43 localhost systemd[4181]: Reached target Basic System.
Nov 23 01:41:43 localhost systemd[4181]: Reached target Main User Target.
Nov 23 01:41:43 localhost systemd[4181]: Startup finished in 135ms.
Nov 23 01:41:43 localhost systemd[1]: Started User Manager for UID 1000.
Nov 23 01:41:43 localhost systemd[1]: Started Session 1 of User zuul.
Nov 23 01:41:43 localhost python3[4233]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 01:41:53 localhost python3[4251]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 01:42:00 localhost python3[4304]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 01:42:01 localhost python3[4334]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 23 01:42:04 localhost python3[4350]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUOGMsJ/AVVgY0xZSxHG1Oo8GuV12NJwOZUmjQnvkVRDmS001A30AXXR0N8dst9ZQnVZ7t0kBrbCnuEI8SNNpeziiyScPCPK73L2zyV8Js+qKXswkPpolOCRy92kOph1OZYuXdhUodUSdBoJ4mwf2s5nJhAJmH6XvfiqHUaCRd9Gp9NgU9FvjG41eO7BwkjRpKTg2jZAy21PGLDeWxRI5qxEpDgdXeW0riuuVHj1FGKKfC1wAe7xB5wykXcRkuog4VlSx2/V+mPpSMDZ1POsAxKOAMYkxfj+qoDIBfDc0R1cbxFehgmCHc8a4z+IjP5eiUvX3HjeV7ZBTR5hkYKHAJfeU6Cj5HQsTwwJrc+oHuosokgJ/ct0+WpvqhalUoL8dpoLUY6PQq+5CeOJrpZeLzXZTIPLWTA4jbbkHa/SwmAk07+hpxpFz3NhSfpT4GfOgKnowPfo+3mJMDAetTMZpizTdfPfc13gl7Zyqb9cB8lgx1IVzN6ZrxPyvPqj05uPk= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:04 localhost python3[4364]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:06 localhost python3[4423]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 01:42:06 localhost python3[4464]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763880125.9268944-391-182221978638100/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=4c10693c6dd746ababb601eafce6af01_id_rsa follow=False checksum=4877a9422cfa308f85d093f4f170aa8e2f5129bc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:07 localhost python3[4537]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 01:42:08 localhost python3[4578]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763880127.6385386-491-67572255985701/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=4c10693c6dd746ababb601eafce6af01_id_rsa.pub follow=False checksum=9e6358c9dcdfe108c4f779a3b698bd3c9d97da46 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:10 localhost python3[4606]: ansible-ping Invoked with data=pong
Nov 23 01:42:12 localhost python3[4620]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 01:42:15 localhost python3[4673]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 23 01:42:18 localhost python3[4695]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:18 localhost python3[4709]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:18 localhost python3[4723]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:20 localhost python3[4737]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:20 localhost python3[4751]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:20 localhost python3[4765]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:23 localhost python3[4782]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:24 localhost python3[4830]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 01:42:25 localhost python3[4873]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763880144.6079686-102-52469507161254/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:32 localhost python3[4901]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:33 localhost python3[4915]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:33 localhost python3[4929]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:33 localhost python3[4943]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:33 localhost python3[4957]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:34 localhost python3[4971]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:34 localhost python3[4985]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:34 localhost python3[4999]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:34 localhost python3[5013]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:35 localhost python3[5027]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:35 localhost python3[5041]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:35 localhost python3[5055]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:36 localhost python3[5069]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:36 localhost python3[5083]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:36 localhost python3[5097]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:36 localhost python3[5111]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:37 localhost python3[5125]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:37 localhost python3[5139]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:37 localhost python3[5153]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:37 localhost python3[5167]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:38 localhost python3[5181]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:38 localhost python3[5195]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:38 localhost python3[5209]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:42:39 localhost python3[5223]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1
vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 23 01:42:39 localhost python3[5237]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 23 01:42:39 localhost python3[5251]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 23 01:42:40 localhost python3[5267]: ansible-community.general.timezone Invoked with name=UTC hwclock=None Nov 23 01:42:40 localhost systemd[1]: Starting Time & Date Service... Nov 23 01:42:41 localhost systemd[1]: Started Time & Date Service. Nov 23 01:42:41 localhost systemd-timedated[5269]: Changed time zone to 'UTC' (UTC). 
Nov 23 01:42:42 localhost python3[5288]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:43 localhost python3[5334]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 01:42:43 localhost python3[5375]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1763880163.269609-494-154951226261599/source _original_basename=tmp0pv_ioa6 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:45 localhost python3[5435]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 01:42:45 localhost python3[5476]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763880164.7656982-583-138643582818316/source _original_basename=tmp1tdcb_wm follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:46 localhost sshd[5518]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:42:47 localhost python3[5539]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 01:42:47 localhost python3[5583]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763880166.8770776-728-49137748679588/source _original_basename=tmpdm6587bx follow=False checksum=fd315655f47fc5fc6a83eb387067284e4325b6ee backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:48 localhost python3[5611]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 01:42:48 localhost python3[5627]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 01:42:50 localhost python3[5677]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 01:42:50 localhost python3[5720]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1763880169.7945817-853-200842931884732/source _original_basename=tmpolctoykt follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:42:51 localhost python3[5751]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-656c-65aa-000000000023-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 01:42:52 localhost python3[5769]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-656c-65aa-000000000024-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 23 01:42:54 localhost python3[5787]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:43:11 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 23 01:43:14 localhost python3[5806]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:44:14 localhost systemd-logind[761]: Session 1 logged out. Waiting for processes to exit.
Nov 23 01:44:36 localhost systemd[4181]: Starting Mark boot as successful...
Nov 23 01:44:36 localhost systemd[4181]: Finished Mark boot as successful.
Nov 23 01:45:03 localhost systemd[1]: Unmounting EFI System Partition Automount...
Nov 23 01:45:03 localhost systemd[1]: efi.mount: Deactivated successfully.
Nov 23 01:45:03 localhost systemd[1]: Unmounted EFI System Partition Automount.
Nov 23 01:46:13 localhost sshd[5812]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:46:13 localhost sshd[5813]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:46:42 localhost sshd[5817]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:46:54 localhost kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Nov 23 01:46:54 localhost kernel: pci 0000:00:07.0: reg 0x10: [io 0x0000-0x003f]
Nov 23 01:46:54 localhost kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Nov 23 01:46:54 localhost kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Nov 23 01:46:54 localhost kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Nov 23 01:46:54 localhost kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Nov 23 01:46:54 localhost kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Nov 23 01:46:54 localhost kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Nov 23 01:46:55 localhost kernel: pci 0000:00:07.0: BAR 0: assigned [io 0x1000-0x103f]
Nov 23 01:46:55 localhost kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 23 01:46:55 localhost NetworkManager[790]: [1763880415.0353] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 23 01:46:55 localhost systemd-udevd[5819]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 01:46:55 localhost NetworkManager[790]: [1763880415.0536] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Nov 23 01:46:55 localhost systemd[4181]: Created slice User Background Tasks Slice.
Nov 23 01:46:55 localhost NetworkManager[790]: [1763880415.0578] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 23 01:46:55 localhost NetworkManager[790]: [1763880415.0583] device (eth1): carrier: link connected
Nov 23 01:46:55 localhost NetworkManager[790]: [1763880415.0586] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Nov 23 01:46:55 localhost NetworkManager[790]: [1763880415.0592] policy: auto-activating connection 'Wired connection 1' (4b3cc72b-307e-3971-a752-6e4e0d8a9ddd)
Nov 23 01:46:55 localhost NetworkManager[790]: [1763880415.0599] device (eth1): Activation: starting connection 'Wired connection 1' (4b3cc72b-307e-3971-a752-6e4e0d8a9ddd)
Nov 23 01:46:55 localhost NetworkManager[790]: [1763880415.0601] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Nov 23 01:46:55 localhost NetworkManager[790]: [1763880415.0605] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Nov 23 01:46:55 localhost systemd[4181]: Starting Cleanup of User's Temporary Files and Directories...
Nov 23 01:46:55 localhost NetworkManager[790]: [1763880415.0611] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Nov 23 01:46:55 localhost NetworkManager[790]: [1763880415.0616] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 23 01:46:55 localhost systemd[4181]: Finished Cleanup of User's Temporary Files and Directories.
Nov 23 01:46:55 localhost sshd[5823]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:46:56 localhost systemd-logind[761]: New session 3 of user zuul.
Nov 23 01:46:56 localhost systemd[1]: Started Session 3 of User zuul.
Nov 23 01:46:56 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Nov 23 01:46:56 localhost python3[5840]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-7c02-8043-000000000408-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 01:47:09 localhost python3[5890]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 01:47:10 localhost python3[5933]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763880429.3437893-486-217729593099398/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=985a126fed576251819cd9adb198f0955a128dcd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:47:10 localhost python3[5963]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 01:47:10 localhost systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 23 01:47:10 localhost systemd[1]: Stopped Network Manager Wait Online.
Nov 23 01:47:10 localhost systemd[1]: Stopping Network Manager Wait Online...
Nov 23 01:47:10 localhost NetworkManager[790]: [1763880430.7113] caught SIGTERM, shutting down normally.
Nov 23 01:47:10 localhost systemd[1]: Stopping Network Manager...
Nov 23 01:47:10 localhost NetworkManager[790]: [1763880430.7219] dhcp4 (eth0): canceled DHCP transaction
Nov 23 01:47:10 localhost NetworkManager[790]: [1763880430.7221] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 23 01:47:10 localhost NetworkManager[790]: [1763880430.7221] dhcp4 (eth0): state changed no lease
Nov 23 01:47:10 localhost NetworkManager[790]: [1763880430.7226] manager: NetworkManager state is now CONNECTING
Nov 23 01:47:10 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 23 01:47:10 localhost NetworkManager[790]: [1763880430.7320] dhcp4 (eth1): canceled DHCP transaction
Nov 23 01:47:10 localhost NetworkManager[790]: [1763880430.7321] dhcp4 (eth1): state changed no lease
Nov 23 01:47:10 localhost NetworkManager[790]: [1763880430.7387] exiting (success)
Nov 23 01:47:10 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 23 01:47:10 localhost systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 23 01:47:10 localhost systemd[1]: Stopped Network Manager.
Nov 23 01:47:10 localhost systemd[1]: NetworkManager.service: Consumed 2.331s CPU time.
Nov 23 01:47:10 localhost systemd[1]: Starting Network Manager...
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.7919] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:2e694857-c83c-42a3-a300-fcad2ba2b06e)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.7923] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.7947] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 23 01:47:10 localhost systemd[1]: Started Network Manager.
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.8015] manager[0x55bcf6d8a090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 23 01:47:10 localhost systemd[1]: Starting Network Manager Wait Online...
Nov 23 01:47:10 localhost systemd[1]: Starting Hostname Service...
Nov 23 01:47:10 localhost systemd[1]: Started Hostname Service.
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.8868] hostname: hostname: using hostnamed
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.8869] hostname: static hostname changed from (none) to "np0005532585.novalocal"
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.8875] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.8882] manager[0x55bcf6d8a090]: rfkill: Wi-Fi hardware radio set enabled
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.8882] manager[0x55bcf6d8a090]: rfkill: WWAN hardware radio set enabled
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.8921] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.8922] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.8923] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.8923] manager: Networking is enabled by state file
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.8930] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.8931] settings: Loaded settings plugin: keyfile (internal)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.8984] dhcp: init: Using DHCP client 'internal'
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.8989] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9000] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9009] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9022] device (lo): Activation: starting connection 'lo' (75f95e5d-82c2-442b-91f8-cfa8260985bf)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9033] device (eth0): carrier: link connected
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9042] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9057] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9058] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9072] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9084] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9093] device (eth1): carrier: link connected
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9100] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9108] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (4b3cc72b-307e-3971-a752-6e4e0d8a9ddd) (indicated)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9108] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9116] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9127] device (eth1): Activation: starting connection 'Wired connection 1' (4b3cc72b-307e-3971-a752-6e4e0d8a9ddd)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9156] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9164] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9170] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9174] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9179] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9183] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9187] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9192] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9201] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9206] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9225] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9231] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9256] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9291] dhcp4 (eth0): state changed new lease, address=38.102.83.198
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9302] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9311] device (lo): Activation: successful, device activated.
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9320] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9418] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9487] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9491] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9496] manager: NetworkManager state is now CONNECTED_SITE
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9501] device (eth0): Activation: successful, device activated.
Nov 23 01:47:10 localhost NetworkManager[5975]: [1763880430.9507] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 23 01:47:11 localhost python3[6036]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-7c02-8043-00000000012b-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 01:47:21 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 01:47:40 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 23 01:47:55 localhost NetworkManager[5975]: [1763880475.8228] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Nov 23 01:47:55 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 23 01:47:55 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 23 01:47:55 localhost NetworkManager[5975]: [1763880475.8443] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Nov 23 01:47:55 localhost NetworkManager[5975]: [1763880475.8449] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Nov 23 01:47:55 localhost NetworkManager[5975]: [1763880475.8461] device (eth1): Activation: successful, device activated.
Nov 23 01:47:55 localhost NetworkManager[5975]: [1763880475.8470] manager: startup complete
Nov 23 01:47:55 localhost systemd[1]: Finished Network Manager Wait Online.
Nov 23 01:48:05 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 01:48:11 localhost systemd[1]: session-3.scope: Deactivated successfully.
Nov 23 01:48:11 localhost systemd[1]: session-3.scope: Consumed 1.492s CPU time.
Nov 23 01:48:11 localhost systemd-logind[761]: Session 3 logged out. Waiting for processes to exit.
Nov 23 01:48:11 localhost systemd-logind[761]: Removed session 3.
Nov 23 01:48:57 localhost sshd[6063]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:48:57 localhost systemd-logind[761]: New session 4 of user zuul.
Nov 23 01:48:57 localhost systemd[1]: Started Session 4 of User zuul.
Nov 23 01:48:58 localhost python3[6114]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 01:48:58 localhost python3[6157]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763880537.7609437-628-269896251414129/source _original_basename=tmpe0spzpgd follow=False checksum=492625bb7c06d655281f511b293f3f3edc954e6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:49:01 localhost systemd[1]: session-4.scope: Deactivated successfully.
Nov 23 01:49:01 localhost systemd-logind[761]: Session 4 logged out. Waiting for processes to exit.
Nov 23 01:49:01 localhost systemd-logind[761]: Removed session 4.
Nov 23 01:49:59 localhost sshd[6173]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:52:49 localhost sshd[6177]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:53:07 localhost sshd[6179]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:54:43 localhost sshd[6181]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:54:45 localhost sshd[6184]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:54:45 localhost systemd-logind[761]: New session 5 of user zuul.
Nov 23 01:54:45 localhost systemd[1]: Started Session 5 of User zuul.
Nov 23 01:54:45 localhost python3[6203]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-9f63-9a03-000000001cfc-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 01:54:47 localhost python3[6222]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:54:47 localhost python3[6238]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:54:47 localhost python3[6254]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:54:48 localhost python3[6270]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:54:48 localhost python3[6286]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:54:50 localhost python3[6334]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 01:54:50 localhost python3[6377]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763880889.8740711-642-221984898280675/source _original_basename=tmppfvd9hk7 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 01:54:52 localhost python3[6407]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 01:54:52 localhost systemd[1]: Reloading.
Nov 23 01:54:52 localhost systemd-rc-local-generator[6427]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 01:54:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 01:54:53 localhost python3[6453]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 23 01:54:54 localhost python3[6469]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 01:54:55 localhost python3[6487]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 01:54:55 localhost python3[6505]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 01:54:55 localhost python3[6523]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 01:54:57 localhost python3[6540]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init"; cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system"; cat /sys/fs/cgroup/system.slice/io.max; echo "user"; cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-9f63-9a03-000000001d03-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 01:54:57 localhost python3[6560]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 01:55:00 localhost systemd[1]: session-5.scope: Deactivated successfully.
Nov 23 01:55:00 localhost systemd[1]: session-5.scope: Consumed 3.969s CPU time.
Nov 23 01:55:00 localhost systemd-logind[761]: Session 5 logged out. Waiting for processes to exit.
Nov 23 01:55:00 localhost systemd-logind[761]: Removed session 5.
Nov 23 01:55:47 localhost sshd[6566]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:56:14 localhost sshd[6569]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:56:14 localhost systemd[1]: Starting Cleanup of Temporary Directories...
Nov 23 01:56:14 localhost systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 23 01:56:14 localhost systemd[1]: Finished Cleanup of Temporary Directories.
Nov 23 01:56:14 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 23 01:56:14 localhost systemd-logind[761]: New session 6 of user zuul.
Nov 23 01:56:14 localhost systemd[1]: Started Session 6 of User zuul.
Nov 23 01:56:15 localhost systemd[1]: Starting RHSM dbus service...
Nov 23 01:56:15 localhost systemd[1]: Started RHSM dbus service.
Nov 23 01:56:15 localhost rhsm-service[6595]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 23 01:56:15 localhost rhsm-service[6595]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 23 01:56:15 localhost rhsm-service[6595]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 23 01:56:15 localhost rhsm-service[6595]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 23 01:56:16 localhost rhsm-service[6595]: INFO [subscription_manager.managerlib:90] Consumer created: np0005532585.novalocal (805f8986-28c6-49b9-9c1d-56700caa6ca2)
Nov 23 01:56:16 localhost subscription-manager[6595]: Registered system with identity: 805f8986-28c6-49b9-9c1d-56700caa6ca2
Nov 23 01:56:16 localhost rhsm-service[6595]: INFO [subscription_manager.entcertlib:131] certs updated:
Nov 23 01:56:16 localhost rhsm-service[6595]: Total updates: 1
Nov 23 01:56:16 localhost rhsm-service[6595]: Found (local) serial# []
Nov 23 01:56:16 localhost rhsm-service[6595]: Expected (UEP) serial# [5107503348184781599]
Nov 23 01:56:16 localhost rhsm-service[6595]: Added (new)
Nov 23 01:56:16 localhost rhsm-service[6595]: [sn:5107503348184781599 ( Content Access,) @ /etc/pki/entitlement/5107503348184781599.pem]
Nov 23 01:56:16 localhost rhsm-service[6595]: Deleted (rogue):
Nov 23 01:56:16 localhost rhsm-service[6595]:
Nov 23 01:56:16 localhost subscription-manager[6595]: Added subscription for 'Content Access' contract 'None'
Nov 23 01:56:16 localhost subscription-manager[6595]: Added subscription for product ' Content Access'
Nov 23 01:56:17 localhost rhsm-service[6595]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 23 01:56:17 localhost rhsm-service[6595]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Nov 23 01:56:18 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 01:56:18 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 01:56:18 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 01:56:18 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 01:56:18 localhost sshd[6667]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:56:18 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 01:56:26 localhost python3[6688]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-d6e0-fafb-00000000000d-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 01:56:27 localhost python3[6707]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 01:56:34 localhost sshd[6714]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:56:58 localhost setsebool[6784]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 23 01:56:58 localhost setsebool[6784]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 23 01:57:07 localhost kernel: SELinux: Converting 406 SID table entries...
Nov 23 01:57:07 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 23 01:57:07 localhost kernel: SELinux: policy capability open_perms=1
Nov 23 01:57:07 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 23 01:57:07 localhost kernel: SELinux: policy capability always_check_network=0
Nov 23 01:57:07 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 23 01:57:07 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 23 01:57:07 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 23 01:57:20 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=3 res=1
Nov 23 01:57:21 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 01:57:21 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 23 01:57:21 localhost systemd[1]: Reloading.
Nov 23 01:57:21 localhost systemd-rc-local-generator[7619]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 01:57:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 01:57:21 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 01:57:22 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 01:57:22 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 01:57:30 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 01:57:30 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 23 01:57:30 localhost systemd[1]: man-db-cache-update.service: Consumed 10.932s CPU time.
Nov 23 01:57:30 localhost systemd[1]: run-r33d149de15094182a0e3920cebbfb370.service: Deactivated successfully.
Nov 23 01:57:34 localhost sshd[18363]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:57:56 localhost sshd[18365]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:58:00 localhost sshd[18367]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:58:07 localhost sshd[18369]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:58:14 localhost systemd[1]: var-lib-containers-storage-overlay-compat3400401224-merged.mount: Deactivated successfully.
Nov 23 01:58:14 localhost systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1223709566-merged.mount: Deactivated successfully.
Nov 23 01:58:14 localhost podman[18386]: 2025-11-23 06:58:14.36612783 +0000 UTC m=+0.110736362 system refresh
Nov 23 01:58:15 localhost systemd[4181]: Starting D-Bus User Message Bus...
Nov 23 01:58:15 localhost dbus-broker-launch[18442]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 23 01:58:15 localhost dbus-broker-launch[18442]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 23 01:58:15 localhost systemd[4181]: Started D-Bus User Message Bus.
Nov 23 01:58:15 localhost journal[18442]: Ready
Nov 23 01:58:15 localhost systemd[4181]: selinux: avc: op=load_policy lsm=selinux seqno=3 res=1
Nov 23 01:58:15 localhost systemd[4181]: Created slice Slice /user.
Nov 23 01:58:15 localhost systemd[4181]: podman-18426.scope: unit configures an IP firewall, but not running as root.
Nov 23 01:58:15 localhost systemd[4181]: (This warning is only shown for the first unit using IP firewalling.)
Nov 23 01:58:15 localhost systemd[4181]: Started podman-18426.scope.
Nov 23 01:58:15 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 01:58:15 localhost systemd[4181]: Started podman-pause-1ee92a94.scope.
Nov 23 01:58:17 localhost systemd[1]: session-6.scope: Deactivated successfully.
Nov 23 01:58:17 localhost systemd[1]: session-6.scope: Consumed 51.522s CPU time.
Nov 23 01:58:17 localhost systemd-logind[761]: Session 6 logged out. Waiting for processes to exit.
Nov 23 01:58:17 localhost systemd-logind[761]: Removed session 6.
Nov 23 01:58:33 localhost sshd[18446]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:58:33 localhost sshd[18449]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:58:33 localhost sshd[18447]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:58:33 localhost sshd[18450]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:58:33 localhost sshd[18448]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:58:38 localhost sshd[18456]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:58:38 localhost systemd-logind[761]: New session 7 of user zuul.
Nov 23 01:58:38 localhost systemd[1]: Started Session 7 of User zuul.
Nov 23 01:58:39 localhost python3[18473]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFp5ffFMuM9GICal8e/QjT+yRcsIbGaMWRt/HA7rb1TB5YKChpgkSmzIFogHU4gX8uce12LB+CRf7ndL6kzcKrg= zuul@np0005532578.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:58:39 localhost python3[18489]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFp5ffFMuM9GICal8e/QjT+yRcsIbGaMWRt/HA7rb1TB5YKChpgkSmzIFogHU4gX8uce12LB+CRf7ndL6kzcKrg= zuul@np0005532578.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 01:58:41 localhost systemd[1]: session-7.scope: Deactivated successfully.
Nov 23 01:58:41 localhost systemd-logind[761]: Session 7 logged out. Waiting for processes to exit.
Nov 23 01:58:41 localhost systemd-logind[761]: Removed session 7.
Nov 23 01:59:17 localhost sshd[18490]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 01:59:34 localhost sshd[18492]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:00:08 localhost sshd[18495]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:00:08 localhost systemd-logind[761]: New session 8 of user zuul.
Nov 23 02:00:08 localhost systemd[1]: Started Session 8 of User zuul.
Nov 23 02:00:08 localhost python3[18514]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUOGMsJ/AVVgY0xZSxHG1Oo8GuV12NJwOZUmjQnvkVRDmS001A30AXXR0N8dst9ZQnVZ7t0kBrbCnuEI8SNNpeziiyScPCPK73L2zyV8Js+qKXswkPpolOCRy92kOph1OZYuXdhUodUSdBoJ4mwf2s5nJhAJmH6XvfiqHUaCRd9Gp9NgU9FvjG41eO7BwkjRpKTg2jZAy21PGLDeWxRI5qxEpDgdXeW0riuuVHj1FGKKfC1wAe7xB5wykXcRkuog4VlSx2/V+mPpSMDZ1POsAxKOAMYkxfj+qoDIBfDc0R1cbxFehgmCHc8a4z+IjP5eiUvX3HjeV7ZBTR5hkYKHAJfeU6Cj5HQsTwwJrc+oHuosokgJ/ct0+WpvqhalUoL8dpoLUY6PQq+5CeOJrpZeLzXZTIPLWTA4jbbkHa/SwmAk07+hpxpFz3NhSfpT4GfOgKnowPfo+3mJMDAetTMZpizTdfPfc13gl7Zyqb9cB8lgx1IVzN6ZrxPyvPqj05uPk= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 02:00:09 localhost python3[18530]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532585.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 23 02:00:11 localhost python3[18580]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:00:11 localhost python3[18623]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763881210.7678437-135-250098084774401/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=4c10693c6dd746ababb601eafce6af01_id_rsa follow=False checksum=4877a9422cfa308f85d093f4f170aa8e2f5129bc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:00:12 localhost python3[18685]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:00:12 localhost python3[18728]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763881212.3622243-224-85225766420296/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=4c10693c6dd746ababb601eafce6af01_id_rsa.pub follow=False checksum=9e6358c9dcdfe108c4f779a3b698bd3c9d97da46 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:00:14 localhost python3[18758]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:00:15 localhost python3[18804]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:00:16 localhost python3[18820]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmpxjho5avh recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:00:17 localhost python3[18880]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:00:17 localhost python3[18896]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpiq_mqcca recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:00:19 localhost python3[18956]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:00:19 localhost python3[18972]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmp4ipv2jsb recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:00:19 localhost systemd[1]: session-8.scope: Deactivated successfully.
Nov 23 02:00:19 localhost systemd[1]: session-8.scope: Consumed 3.400s CPU time.
Nov 23 02:00:19 localhost systemd-logind[761]: Session 8 logged out. Waiting for processes to exit.
Nov 23 02:00:19 localhost systemd-logind[761]: Removed session 8.
Nov 23 02:00:21 localhost sshd[18988]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:00:55 localhost sshd[18990]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:01:10 localhost sshd[19008]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:02:26 localhost sshd[19010]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:02:32 localhost sshd[19012]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:02:32 localhost systemd-logind[761]: New session 9 of user zuul.
Nov 23 02:02:32 localhost systemd[1]: Started Session 9 of User zuul.
Nov 23 02:02:33 localhost python3[19058]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:02:41 localhost sshd[19060]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:04:03 localhost sshd[19063]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:04:15 localhost sshd[19066]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:04:35 localhost sshd[19068]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:05:38 localhost sshd[19070]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:05:52 localhost sshd[19072]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:06:56 localhost sshd[19074]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:07:09 localhost sshd[19078]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:07:24 localhost sshd[19080]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:07:33 localhost systemd[1]: session-9.scope: Deactivated successfully.
Nov 23 02:07:33 localhost systemd-logind[761]: Session 9 logged out. Waiting for processes to exit.
Nov 23 02:07:33 localhost systemd-logind[761]: Removed session 9.
Nov 23 02:08:39 localhost sshd[19084]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:09:00 localhost sshd[19086]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:10:15 localhost sshd[19088]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:10:41 localhost sshd[19090]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:11:54 localhost sshd[19092]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:12:23 localhost sshd[19095]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:13:38 localhost sshd[19098]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:13:38 localhost sshd[19100]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:13:38 localhost systemd-logind[761]: New session 10 of user zuul.
Nov 23 02:13:38 localhost systemd[1]: Started Session 10 of User zuul.
Nov 23 02:13:38 localhost python3[19117]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-1b67-9ac8-00000000000c-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:13:40 localhost python3[19137]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-1b67-9ac8-00000000000d-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:13:45 localhost python3[19156]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Nov 23 02:13:48 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:13:48 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:13:59 localhost sshd[19291]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:14:07 localhost sshd[19301]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:14:42 localhost python3[19317]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Nov 23 02:14:45 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:14:45 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:14:53 localhost sshd[19443]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:14:53 localhost python3[19459]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Nov 23 02:14:56 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:14:57 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:15:02 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:15:02 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:15:12 localhost sshd[19839]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:15:25 localhost systemd[1]: Starting dnf makecache...
Nov 23 02:15:25 localhost python3[19857]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Nov 23 02:15:25 localhost dnf[19856]: Updating Subscription Management repositories.
Nov 23 02:15:27 localhost dnf[19856]: Failed determining last makecache time.
Nov 23 02:15:27 localhost dnf[19856]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 44 MB/s | 24 MB 00:00
Nov 23 02:15:28 localhost dnf[19856]: Red Hat Enterprise Linux 9 for x86_64 - High Av 6.0 MB/s | 2.5 MB 00:00
Nov 23 02:15:28 localhost dnf[19856]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 62 MB/s | 44 MB 00:00
Nov 23 02:15:29 localhost dnf[19856]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 27 MB/s | 14 MB 00:00
Nov 23 02:15:29 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:15:29 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:15:30 localhost dnf[19856]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 57 MB/s | 42 MB 00:00
Nov 23 02:15:30 localhost dnf[19856]: Metadata cache created.
Nov 23 02:15:30 localhost systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 23 02:15:30 localhost systemd[1]: Finished dnf makecache.
Nov 23 02:15:30 localhost systemd[1]: dnf-makecache.service: Consumed 3.822s CPU time.
Nov 23 02:15:34 localhost sshd[20006]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:15:34 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:15:35 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:15:56 localhost python3[20216]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Nov 23 02:15:59 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:15:59 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:16:04 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:16:04 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:16:30 localhost python3[20495]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-1b67-9ac8-000000000013-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:16:35 localhost python3[20514]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 02:16:49 localhost sshd[20589]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:16:53 localhost kernel: SELinux: Converting 489 SID table entries...
Nov 23 02:16:53 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 23 02:16:53 localhost kernel: SELinux: policy capability open_perms=1
Nov 23 02:16:53 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 23 02:16:53 localhost kernel: SELinux: policy capability always_check_network=0
Nov 23 02:16:53 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 23 02:16:53 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 23 02:16:53 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 23 02:16:53 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=4 res=1
Nov 23 02:16:53 localhost systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 23 02:16:57 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 02:16:57 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 23 02:16:57 localhost systemd[1]: Reloading.
Nov 23 02:16:57 localhost systemd-sysv-generator[21177]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:16:57 localhost systemd-rc-local-generator[21174]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:16:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:16:57 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 02:16:58 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 02:16:58 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 23 02:16:58 localhost systemd[1]: run-rbcb47c8f41e24d0192b20b7757a07e71.service: Deactivated successfully.
Nov 23 02:16:58 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:16:58 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:17:13 localhost sshd[21811]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:17:27 localhost python3[21829]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-1b67-9ac8-000000000015-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:17:55 localhost python3[21849]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:17:55 localhost python3[21897]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:17:56 localhost python3[21940]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763882275.5478146-292-75817046235764/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=9333f42ac4b9baf349a5c32f7bcba3335b5912e0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:17:57 localhost python3[21970]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 23 02:17:57 localhost systemd-journald[618]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Nov 23 02:17:57 localhost systemd-journald[618]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 02:17:57 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 02:17:57 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 02:17:57 localhost python3[21991]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 23 02:17:59 localhost python3[22011]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15
hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None Nov 23 02:17:59 localhost python3[22031]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None 
mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None Nov 23 02:17:59 localhost python3[22051]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None Nov 23 02:18:01 localhost python3[22071]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False 
daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 02:18:01 localhost systemd[1]: Starting LSB: Bring up/down networking... Nov 23 02:18:01 localhost network[22074]: WARN : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 23 02:18:01 localhost network[22085]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 23 02:18:01 localhost network[22074]: WARN : [network] 'network-scripts' will be removed from distribution in near future. Nov 23 02:18:01 localhost network[22086]: 'network-scripts' will be removed from distribution in near future. Nov 23 02:18:01 localhost network[22074]: WARN : [network] It is advised to switch to 'NetworkManager' instead for network management. Nov 23 02:18:01 localhost network[22087]: It is advised to switch to 'NetworkManager' instead for network management. Nov 23 02:18:01 localhost NetworkManager[5975]: [1763882281.3081] audit: op="connections-reload" pid=22115 uid=0 result="success" Nov 23 02:18:01 localhost network[22074]: Bringing up loopback interface: [ OK ] Nov 23 02:18:01 localhost NetworkManager[5975]: [1763882281.5033] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22203 uid=0 result="success" Nov 23 02:18:01 localhost network[22074]: Bringing up interface eth0: [ OK ] Nov 23 02:18:01 localhost systemd[1]: Started LSB: Bring up/down networking. Nov 23 02:18:01 localhost python3[22244]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 02:18:02 localhost systemd[1]: Starting Open vSwitch Database Unit... Nov 23 02:18:02 localhost chown[22248]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory Nov 23 02:18:02 localhost ovs-ctl[22253]: /etc/openvswitch/conf.db does not exist ... (warning). 
Nov 23 02:18:02 localhost ovs-ctl[22253]: Creating empty database /etc/openvswitch/conf.db [ OK ] Nov 23 02:18:02 localhost ovs-ctl[22253]: Starting ovsdb-server [ OK ] Nov 23 02:18:02 localhost ovs-vsctl[22302]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1 Nov 23 02:18:02 localhost ovs-vsctl[22322]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"26f986a7-6ac7-4ec2-887b-8da6da04a661\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\"" Nov 23 02:18:02 localhost ovs-ctl[22253]: Configuring Open vSwitch system IDs [ OK ] Nov 23 02:18:02 localhost ovs-vsctl[22328]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005532585.novalocal Nov 23 02:18:02 localhost ovs-ctl[22253]: Enabling remote OVSDB managers [ OK ] Nov 23 02:18:02 localhost systemd[1]: Started Open vSwitch Database Unit. Nov 23 02:18:02 localhost systemd[1]: Starting Open vSwitch Delete Transient Ports... Nov 23 02:18:02 localhost systemd[1]: Finished Open vSwitch Delete Transient Ports. Nov 23 02:18:02 localhost systemd[1]: Starting Open vSwitch Forwarding Unit... Nov 23 02:18:02 localhost kernel: openvswitch: Open vSwitch switching datapath Nov 23 02:18:02 localhost ovs-ctl[22372]: Inserting openvswitch module [ OK ] Nov 23 02:18:02 localhost ovs-ctl[22341]: Starting ovs-vswitchd [ OK ] Nov 23 02:18:02 localhost ovs-vsctl[22391]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005532585.novalocal Nov 23 02:18:02 localhost ovs-ctl[22341]: Enabling remote OVSDB managers [ OK ] Nov 23 02:18:02 localhost systemd[1]: Started Open vSwitch Forwarding Unit. Nov 23 02:18:02 localhost systemd[1]: Starting Open vSwitch... Nov 23 02:18:02 localhost systemd[1]: Finished Open vSwitch. 
Nov 23 02:18:05 localhost python3[22409]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-1b67-9ac8-00000000001a-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:18:06 localhost NetworkManager[5975]: [1763882286.1673] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22567 uid=0 result="success" Nov 23 02:18:06 localhost ifup[22568]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Nov 23 02:18:06 localhost ifup[22569]: 'network-scripts' will be removed from distribution in near future. Nov 23 02:18:06 localhost ifup[22570]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Nov 23 02:18:06 localhost NetworkManager[5975]: [1763882286.2017] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22576 uid=0 result="success" Nov 23 02:18:06 localhost ovs-vsctl[22578]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:15:fb:34 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex Nov 23 02:18:06 localhost kernel: device ovs-system entered promiscuous mode Nov 23 02:18:06 localhost NetworkManager[5975]: [1763882286.2299] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4) Nov 23 02:18:06 localhost systemd-udevd[22580]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 02:18:06 localhost kernel: Timeout policy base is empty Nov 23 02:18:06 localhost kernel: Failed to associated timeout policy `ovs_test_tp' Nov 23 02:18:06 localhost kernel: device br-ex entered promiscuous mode Nov 23 02:18:06 localhost NetworkManager[5975]: [1763882286.2742] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5) Nov 23 02:18:06 localhost NetworkManager[5975]: [1763882286.2996] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22606 uid=0 result="success" Nov 23 02:18:06 localhost NetworkManager[5975]: [1763882286.3199] device (br-ex): carrier: link connected Nov 23 02:18:09 localhost NetworkManager[5975]: [1763882289.3735] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22635 uid=0 result="success" Nov 23 02:18:09 localhost NetworkManager[5975]: [1763882289.4191] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22650 uid=0 result="success" Nov 23 02:18:09 localhost NET[22675]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf Nov 23 02:18:09 localhost NetworkManager[5975]: [1763882289.5084] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed') Nov 23 02:18:09 localhost NetworkManager[5975]: [1763882289.5289] dhcp4 (eth1): canceled DHCP transaction Nov 23 02:18:09 localhost NetworkManager[5975]: [1763882289.5289] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds) Nov 23 02:18:09 localhost NetworkManager[5975]: [1763882289.5290] dhcp4 (eth1): state changed no lease Nov 23 02:18:09 localhost NetworkManager[5975]: [1763882289.5331] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22684 uid=0 result="success" Nov 23 02:18:09 localhost ifup[22685]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. 
Nov 23 02:18:09 localhost systemd[1]: Starting Network Manager Script Dispatcher Service... Nov 23 02:18:09 localhost ifup[22686]: 'network-scripts' will be removed from distribution in near future. Nov 23 02:18:09 localhost ifup[22688]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Nov 23 02:18:09 localhost systemd[1]: Started Network Manager Script Dispatcher Service. Nov 23 02:18:09 localhost NetworkManager[5975]: [1763882289.5702] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22701 uid=0 result="success" Nov 23 02:18:09 localhost NetworkManager[5975]: [1763882289.6157] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22712 uid=0 result="success" Nov 23 02:18:09 localhost NetworkManager[5975]: [1763882289.6227] device (eth1): carrier: link connected Nov 23 02:18:09 localhost NetworkManager[5975]: [1763882289.6450] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22721 uid=0 result="success" Nov 23 02:18:09 localhost ipv6_wait_tentative[22733]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state Nov 23 02:18:10 localhost ipv6_wait_tentative[22738]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state Nov 23 02:18:11 localhost NetworkManager[5975]: [1763882291.7151] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22747 uid=0 result="success" Nov 23 02:18:11 localhost ovs-vsctl[22762]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1 Nov 23 02:18:11 localhost kernel: device eth1 entered promiscuous mode Nov 23 02:18:11 localhost NetworkManager[5975]: [1763882291.7846] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22770 uid=0 result="success" Nov 23 02:18:11 localhost ifup[22771]: You are using 'ifup' script provided by 'network-scripts', which 
are now deprecated. Nov 23 02:18:11 localhost ifup[22772]: 'network-scripts' will be removed from distribution in near future. Nov 23 02:18:11 localhost ifup[22773]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Nov 23 02:18:11 localhost NetworkManager[5975]: [1763882291.8151] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22779 uid=0 result="success" Nov 23 02:18:11 localhost NetworkManager[5975]: [1763882291.8554] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22789 uid=0 result="success" Nov 23 02:18:11 localhost ifup[22790]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Nov 23 02:18:11 localhost ifup[22791]: 'network-scripts' will be removed from distribution in near future. Nov 23 02:18:11 localhost ifup[22792]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Nov 23 02:18:11 localhost NetworkManager[5975]: [1763882291.8850] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22798 uid=0 result="success" Nov 23 02:18:11 localhost ovs-vsctl[22801]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal Nov 23 02:18:11 localhost kernel: device vlan20 entered promiscuous mode Nov 23 02:18:11 localhost NetworkManager[5975]: [1763882291.9226] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/6) Nov 23 02:18:11 localhost systemd-udevd[22803]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 02:18:11 localhost NetworkManager[5975]: [1763882291.9480] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22812 uid=0 result="success" Nov 23 02:18:11 localhost NetworkManager[5975]: [1763882291.9694] device (vlan20): carrier: link connected Nov 23 02:18:15 localhost NetworkManager[5975]: [1763882295.0147] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22841 uid=0 result="success" Nov 23 02:18:15 localhost NetworkManager[5975]: [1763882295.0544] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22856 uid=0 result="success" Nov 23 02:18:15 localhost NetworkManager[5975]: [1763882295.1096] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22877 uid=0 result="success" Nov 23 02:18:15 localhost ifup[22878]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Nov 23 02:18:15 localhost ifup[22879]: 'network-scripts' will be removed from distribution in near future. Nov 23 02:18:15 localhost ifup[22880]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Nov 23 02:18:15 localhost NetworkManager[5975]: [1763882295.1395] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22886 uid=0 result="success" Nov 23 02:18:15 localhost ovs-vsctl[22889]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal Nov 23 02:18:15 localhost kernel: device vlan44 entered promiscuous mode Nov 23 02:18:15 localhost systemd-udevd[22891]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 02:18:15 localhost NetworkManager[5975]: [1763882295.1805] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/7) Nov 23 02:18:15 localhost NetworkManager[5975]: [1763882295.2087] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22901 uid=0 result="success" Nov 23 02:18:15 localhost NetworkManager[5975]: [1763882295.2298] device (vlan44): carrier: link connected Nov 23 02:18:18 localhost NetworkManager[5975]: [1763882298.2785] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22931 uid=0 result="success" Nov 23 02:18:18 localhost NetworkManager[5975]: [1763882298.3270] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22946 uid=0 result="success" Nov 23 02:18:18 localhost NetworkManager[5975]: [1763882298.3807] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22967 uid=0 result="success" Nov 23 02:18:18 localhost ifup[22968]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Nov 23 02:18:18 localhost ifup[22969]: 'network-scripts' will be removed from distribution in near future. Nov 23 02:18:18 localhost ifup[22970]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Nov 23 02:18:18 localhost NetworkManager[5975]: [1763882298.4124] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22976 uid=0 result="success" Nov 23 02:18:18 localhost ovs-vsctl[22979]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal Nov 23 02:18:18 localhost systemd-udevd[22981]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 02:18:18 localhost kernel: device vlan22 entered promiscuous mode Nov 23 02:18:18 localhost NetworkManager[5975]: [1763882298.4481] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/8) Nov 23 02:18:18 localhost NetworkManager[5975]: [1763882298.4727] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22991 uid=0 result="success" Nov 23 02:18:18 localhost NetworkManager[5975]: [1763882298.4887] device (vlan22): carrier: link connected Nov 23 02:18:19 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. Nov 23 02:18:21 localhost NetworkManager[5975]: [1763882301.5420] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23021 uid=0 result="success" Nov 23 02:18:21 localhost NetworkManager[5975]: [1763882301.5885] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23036 uid=0 result="success" Nov 23 02:18:21 localhost NetworkManager[5975]: [1763882301.6381] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23057 uid=0 result="success" Nov 23 02:18:21 localhost ifup[23058]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Nov 23 02:18:21 localhost ifup[23059]: 'network-scripts' will be removed from distribution in near future. Nov 23 02:18:21 localhost ifup[23060]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Nov 23 02:18:21 localhost NetworkManager[5975]: [1763882301.6655] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23066 uid=0 result="success" Nov 23 02:18:21 localhost ovs-vsctl[23069]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal Nov 23 02:18:21 localhost kernel: device vlan23 entered promiscuous mode Nov 23 02:18:21 localhost NetworkManager[5975]: [1763882301.7311] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/9) Nov 23 02:18:21 localhost systemd-udevd[23071]: Network interface NamePolicy= disabled on kernel command line. Nov 23 02:18:21 localhost NetworkManager[5975]: [1763882301.7546] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23081 uid=0 result="success" Nov 23 02:18:21 localhost NetworkManager[5975]: [1763882301.7738] device (vlan23): carrier: link connected Nov 23 02:18:24 localhost NetworkManager[5975]: [1763882304.8230] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23111 uid=0 result="success" Nov 23 02:18:24 localhost NetworkManager[5975]: [1763882304.8706] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23126 uid=0 result="success" Nov 23 02:18:24 localhost NetworkManager[5975]: [1763882304.9294] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23147 uid=0 result="success" Nov 23 02:18:24 localhost ifup[23148]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Nov 23 02:18:24 localhost ifup[23149]: 'network-scripts' will be removed from distribution in near future. Nov 23 02:18:24 localhost ifup[23150]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Nov 23 02:18:24 localhost NetworkManager[5975]: [1763882304.9614] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23156 uid=0 result="success" Nov 23 02:18:24 localhost ovs-vsctl[23159]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal Nov 23 02:18:25 localhost kernel: device vlan21 entered promiscuous mode Nov 23 02:18:25 localhost systemd-udevd[23161]: Network interface NamePolicy= disabled on kernel command line. Nov 23 02:18:25 localhost NetworkManager[5975]: [1763882305.0011] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/10) Nov 23 02:18:25 localhost NetworkManager[5975]: [1763882305.0270] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23171 uid=0 result="success" Nov 23 02:18:25 localhost NetworkManager[5975]: [1763882305.0481] device (vlan21): carrier: link connected Nov 23 02:18:28 localhost NetworkManager[5975]: [1763882308.1031] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23201 uid=0 result="success" Nov 23 02:18:28 localhost NetworkManager[5975]: [1763882308.1430] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23216 uid=0 result="success" Nov 23 02:18:28 localhost NetworkManager[5975]: [1763882308.1909] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23237 uid=0 result="success" Nov 23 02:18:28 localhost ifup[23238]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Nov 23 02:18:28 localhost ifup[23239]: 'network-scripts' will be removed from distribution in near future. Nov 23 02:18:28 localhost ifup[23240]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Nov 23 02:18:28 localhost NetworkManager[5975]: [1763882308.2153] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23246 uid=0 result="success" Nov 23 02:18:28 localhost ovs-vsctl[23249]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal Nov 23 02:18:28 localhost NetworkManager[5975]: [1763882308.2955] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23256 uid=0 result="success" Nov 23 02:18:29 localhost sshd[23274]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:18:29 localhost NetworkManager[5975]: [1763882309.3565] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23285 uid=0 result="success" Nov 23 02:18:29 localhost NetworkManager[5975]: [1763882309.4025] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23300 uid=0 result="success" Nov 23 02:18:29 localhost NetworkManager[5975]: [1763882309.4593] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23321 uid=0 result="success" Nov 23 02:18:29 localhost ifup[23322]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Nov 23 02:18:29 localhost ifup[23323]: 'network-scripts' will be removed from distribution in near future. Nov 23 02:18:29 localhost ifup[23324]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Nov 23 02:18:29 localhost NetworkManager[5975]: [1763882309.4912] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23330 uid=0 result="success" Nov 23 02:18:29 localhost ovs-vsctl[23333]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal Nov 23 02:18:29 localhost NetworkManager[5975]: [1763882309.5494] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23340 uid=0 result="success" Nov 23 02:18:30 localhost NetworkManager[5975]: [1763882310.6114] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23368 uid=0 result="success" Nov 23 02:18:30 localhost NetworkManager[5975]: [1763882310.6611] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23383 uid=0 result="success" Nov 23 02:18:30 localhost NetworkManager[5975]: [1763882310.7207] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23404 uid=0 result="success" Nov 23 02:18:30 localhost ifup[23405]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Nov 23 02:18:30 localhost ifup[23406]: 'network-scripts' will be removed from distribution in near future. Nov 23 02:18:30 localhost ifup[23407]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Nov 23 02:18:30 localhost NetworkManager[5975]: [1763882310.7540] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23413 uid=0 result="success" Nov 23 02:18:30 localhost ovs-vsctl[23416]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal Nov 23 02:18:30 localhost NetworkManager[5975]: [1763882310.8120] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23423 uid=0 result="success" Nov 23 02:18:31 localhost NetworkManager[5975]: [1763882311.8707] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23451 uid=0 result="success" Nov 23 02:18:31 localhost NetworkManager[5975]: [1763882311.9164] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23466 uid=0 result="success" Nov 23 02:18:31 localhost NetworkManager[5975]: [1763882311.9742] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23487 uid=0 result="success" Nov 23 02:18:31 localhost ifup[23488]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Nov 23 02:18:31 localhost ifup[23489]: 'network-scripts' will be removed from distribution in near future. Nov 23 02:18:31 localhost ifup[23490]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Nov 23 02:18:32 localhost NetworkManager[5975]: [1763882312.0064] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23496 uid=0 result="success"
Nov 23 02:18:32 localhost ovs-vsctl[23499]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Nov 23 02:18:32 localhost NetworkManager[5975]: [1763882312.0986] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23506 uid=0 result="success"
Nov 23 02:18:33 localhost NetworkManager[5975]: [1763882313.1557] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23534 uid=0 result="success"
Nov 23 02:18:33 localhost NetworkManager[5975]: [1763882313.2009] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23549 uid=0 result="success"
Nov 23 02:18:33 localhost NetworkManager[5975]: [1763882313.2567] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23570 uid=0 result="success"
Nov 23 02:18:33 localhost ifup[23571]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 23 02:18:33 localhost ifup[23572]: 'network-scripts' will be removed from distribution in near future.
Nov 23 02:18:33 localhost ifup[23573]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
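[Editor's note: the ovs-vsctl calls logged above all follow the same idempotent pattern — "--if-exists del-port" first, then "add-port ... tag=N" and "set Interface ... type=internal" in one transaction — so re-running an ifup for the same VLAN never fails on an already-present port. A minimal sketch of a helper that builds that exact command line (bridge name br-ex and the 10 s timeout taken from the log; actually executing it requires a host with Open vSwitch):]

```python
# Build the idempotent ovs-vsctl invocation seen in the log for an
# internal VLAN access port on an OVS bridge. All sub-commands run as
# a single ovs-vsctl transaction, so the del-port/add-port pair is atomic.
def ovs_vlan_port_cmd(bridge: str, vlan: int, timeout: int = 10) -> list:
    port = "vlan%d" % vlan
    return [
        "ovs-vsctl", "-t", str(timeout),
        "--", "--if-exists", "del-port", bridge, port,
        "--", "add-port", bridge, port, "tag=%d" % vlan,
        "--", "set", "Interface", port, "type=internal",
    ]

if __name__ == "__main__":
    # Reconstructs the exact command logged for vlan20.
    print(" ".join(ovs_vlan_port_cmd("br-ex", 20)))
```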
Nov 23 02:18:33 localhost NetworkManager[5975]: [1763882313.2875] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23579 uid=0 result="success"
Nov 23 02:18:33 localhost ovs-vsctl[23582]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Nov 23 02:18:33 localhost NetworkManager[5975]: [1763882313.3785] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23589 uid=0 result="success"
Nov 23 02:18:34 localhost NetworkManager[5975]: [1763882314.4336] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23617 uid=0 result="success"
Nov 23 02:18:34 localhost NetworkManager[5975]: [1763882314.4813] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23632 uid=0 result="success"
Nov 23 02:18:49 localhost sshd[23650]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:19:27 localhost python3[23667]: ansible-ansible.legacy.command Invoked with _raw_params=ip a#012ping -c 2 -W 2 192.168.122.10#012ping -c 2 -W 2 192.168.122.11#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-1b67-9ac8-00000000001b-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:19:33 localhost python3[23686]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUOGMsJ/AVVgY0xZSxHG1Oo8GuV12NJwOZUmjQnvkVRDmS001A30AXXR0N8dst9ZQnVZ7t0kBrbCnuEI8SNNpeziiyScPCPK73L2zyV8Js+qKXswkPpolOCRy92kOph1OZYuXdhUodUSdBoJ4mwf2s5nJhAJmH6XvfiqHUaCRd9Gp9NgU9FvjG41eO7BwkjRpKTg2jZAy21PGLDeWxRI5qxEpDgdXeW0riuuVHj1FGKKfC1wAe7xB5wykXcRkuog4VlSx2/V+mPpSMDZ1POsAxKOAMYkxfj+qoDIBfDc0R1cbxFehgmCHc8a4z+IjP5eiUvX3HjeV7ZBTR5hkYKHAJfeU6Cj5HQsTwwJrc+oHuosokgJ/ct0+WpvqhalUoL8dpoLUY6PQq+5CeOJrpZeLzXZTIPLWTA4jbbkHa/SwmAk07+hpxpFz3NhSfpT4GfOgKnowPfo+3mJMDAetTMZpizTdfPfc13gl7Zyqb9cB8lgx1IVzN6ZrxPyvPqj05uPk= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 02:19:33 localhost python3[23702]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUOGMsJ/AVVgY0xZSxHG1Oo8GuV12NJwOZUmjQnvkVRDmS001A30AXXR0N8dst9ZQnVZ7t0kBrbCnuEI8SNNpeziiyScPCPK73L2zyV8Js+qKXswkPpolOCRy92kOph1OZYuXdhUodUSdBoJ4mwf2s5nJhAJmH6XvfiqHUaCRd9Gp9NgU9FvjG41eO7BwkjRpKTg2jZAy21PGLDeWxRI5qxEpDgdXeW0riuuVHj1FGKKfC1wAe7xB5wykXcRkuog4VlSx2/V+mPpSMDZ1POsAxKOAMYkxfj+qoDIBfDc0R1cbxFehgmCHc8a4z+IjP5eiUvX3HjeV7ZBTR5hkYKHAJfeU6Cj5HQsTwwJrc+oHuosokgJ/ct0+WpvqhalUoL8dpoLUY6PQq+5CeOJrpZeLzXZTIPLWTA4jbbkHa/SwmAk07+hpxpFz3NhSfpT4GfOgKnowPfo+3mJMDAetTMZpizTdfPfc13gl7Zyqb9cB8lgx1IVzN6ZrxPyvPqj05uPk= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 02:19:35 localhost python3[23716]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUOGMsJ/AVVgY0xZSxHG1Oo8GuV12NJwOZUmjQnvkVRDmS001A30AXXR0N8dst9ZQnVZ7t0kBrbCnuEI8SNNpeziiyScPCPK73L2zyV8Js+qKXswkPpolOCRy92kOph1OZYuXdhUodUSdBoJ4mwf2s5nJhAJmH6XvfiqHUaCRd9Gp9NgU9FvjG41eO7BwkjRpKTg2jZAy21PGLDeWxRI5qxEpDgdXeW0riuuVHj1FGKKfC1wAe7xB5wykXcRkuog4VlSx2/V+mPpSMDZ1POsAxKOAMYkxfj+qoDIBfDc0R1cbxFehgmCHc8a4z+IjP5eiUvX3HjeV7ZBTR5hkYKHAJfeU6Cj5HQsTwwJrc+oHuosokgJ/ct0+WpvqhalUoL8dpoLUY6PQq+5CeOJrpZeLzXZTIPLWTA4jbbkHa/SwmAk07+hpxpFz3NhSfpT4GfOgKnowPfo+3mJMDAetTMZpizTdfPfc13gl7Zyqb9cB8lgx1IVzN6ZrxPyvPqj05uPk= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 02:19:35 localhost python3[23732]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUOGMsJ/AVVgY0xZSxHG1Oo8GuV12NJwOZUmjQnvkVRDmS001A30AXXR0N8dst9ZQnVZ7t0kBrbCnuEI8SNNpeziiyScPCPK73L2zyV8Js+qKXswkPpolOCRy92kOph1OZYuXdhUodUSdBoJ4mwf2s5nJhAJmH6XvfiqHUaCRd9Gp9NgU9FvjG41eO7BwkjRpKTg2jZAy21PGLDeWxRI5qxEpDgdXeW0riuuVHj1FGKKfC1wAe7xB5wykXcRkuog4VlSx2/V+mPpSMDZ1POsAxKOAMYkxfj+qoDIBfDc0R1cbxFehgmCHc8a4z+IjP5eiUvX3HjeV7ZBTR5hkYKHAJfeU6Cj5HQsTwwJrc+oHuosokgJ/ct0+WpvqhalUoL8dpoLUY6PQq+5CeOJrpZeLzXZTIPLWTA4jbbkHa/SwmAk07+hpxpFz3NhSfpT4GfOgKnowPfo+3mJMDAetTMZpizTdfPfc13gl7Zyqb9cB8lgx1IVzN6ZrxPyvPqj05uPk= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 02:19:36 localhost python3[23746]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Nov 23 02:19:37 localhost python3[23761]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005532585.novalocal"#012hostname_str_array=(${hostname//./ })#012echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-1b67-9ac8-000000000022-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:19:38 localhost python3[23781]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)#012hostnamectl hostname "$hostname.localdomain"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-1b67-9ac8-000000000023-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:19:38 localhost systemd[1]: Starting Hostname Service...
Nov 23 02:19:38 localhost systemd[1]: Started Hostname Service.
Nov 23 02:19:38 localhost systemd-hostnamed[23785]: Hostname set to (static)
Nov 23 02:19:38 localhost NetworkManager[5975]: [1763882378.6039] hostname: static hostname changed from "np0005532585.novalocal" to "np0005532585.localdomain"
Nov 23 02:19:38 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 23 02:19:38 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 23 02:19:40 localhost systemd-logind[761]: Session 10 logged out. Waiting for processes to exit.
Nov 23 02:19:40 localhost systemd[1]: session-10.scope: Deactivated successfully.
Nov 23 02:19:40 localhost systemd[1]: session-10.scope: Consumed 1min 43.396s CPU time.
Nov 23 02:19:40 localhost systemd-logind[761]: Removed session 10.
Nov 23 02:19:42 localhost sshd[23796]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:19:43 localhost systemd-logind[761]: New session 11 of user zuul.
Nov 23 02:19:43 localhost systemd[1]: Started Session 11 of User zuul.
Nov 23 02:19:43 localhost python3[23813]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Nov 23 02:19:45 localhost systemd[1]: session-11.scope: Deactivated successfully.
Nov 23 02:19:45 localhost systemd-logind[761]: Session 11 logged out. Waiting for processes to exit.
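[Editor's note: the shell fragment logged above derives the short hostname with `${hostname//./ }` (replace dots with spaces, take the first array element) before `hostnamectl hostname "$hostname.localdomain"` re-qualifies it. A sketch of the same label extraction in Python, for clarity:]

```python
# What the logged shell fragment does with ${hostname//./ }: split the
# FQDN on dots and keep only the first label; the play later appends
# ".localdomain" to it when setting the static hostname.
def short_hostname(fqdn: str) -> str:
    return fqdn.split(".", 1)[0]

if __name__ == "__main__":
    print(short_hostname("np0005532585.novalocal"))  # np0005532585
```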
Nov 23 02:19:45 localhost systemd-logind[761]: Removed session 11.
Nov 23 02:19:48 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 02:20:08 localhost sshd[23815]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:20:08 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 23 02:20:23 localhost sshd[23820]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:20:31 localhost sshd[23822]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:20:32 localhost systemd-logind[761]: New session 12 of user zuul.
Nov 23 02:20:32 localhost systemd[1]: Started Session 12 of User zuul.
Nov 23 02:20:32 localhost python3[23841]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 02:20:36 localhost systemd[1]: Reloading.
Nov 23 02:20:36 localhost systemd-rc-local-generator[23879]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:20:36 localhost systemd-sysv-generator[23884]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:20:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:20:36 localhost systemd[1]: Listening on Device-mapper event daemon FIFOs.
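[Editor's note: the recurring "ssh-rsa algorithm is disabled" notices come from OpenSSH on RHEL 9, whose DEFAULT crypto policy rejects SHA-1-signed ssh-rsa; RSA keys such as the zuul-build-sshkey above still authenticate via rsa-sha2-256/512, so these messages are informational. Only if a legacy client genuinely required the old ssh-rsa signature would a drop-in like the following (hypothetical filename) be needed:]

```
# /etc/ssh/sshd_config.d/50-legacy-rsa.conf -- hypothetical drop-in; only
# needed for clients that can sign solely with SHA-1 ssh-rsa. Modern
# clients negotiate rsa-sha2-256/512 for the same RSA keys without this.
PubkeyAcceptedAlgorithms +ssh-rsa
HostKeyAlgorithms +ssh-rsa
```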
Nov 23 02:20:36 localhost systemd[1]: Reloading.
Nov 23 02:20:36 localhost systemd-rc-local-generator[23923]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:20:36 localhost systemd-sysv-generator[23926]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:20:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:20:36 localhost systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 23 02:20:36 localhost systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 23 02:20:36 localhost systemd[1]: Reloading.
Nov 23 02:20:36 localhost systemd-sysv-generator[23969]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:20:36 localhost systemd-rc-local-generator[23966]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:20:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:20:37 localhost systemd[1]: Listening on LVM2 poll daemon socket.
Nov 23 02:20:37 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 02:20:37 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 23 02:20:37 localhost systemd[1]: Reloading.
Nov 23 02:20:37 localhost systemd-sysv-generator[24015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:20:37 localhost systemd-rc-local-generator[24012]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:20:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:20:37 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 02:20:37 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 02:20:37 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 02:20:37 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 23 02:20:37 localhost systemd[1]: run-rf59b1884ed1f41a0bec64077d7b2621b.service: Deactivated successfully.
Nov 23 02:20:37 localhost systemd[1]: run-r97bfd8361b184554866928a6d999f465.service: Deactivated successfully.
Nov 23 02:21:38 localhost systemd[1]: session-12.scope: Deactivated successfully.
Nov 23 02:21:38 localhost systemd[1]: session-12.scope: Consumed 4.652s CPU time.
Nov 23 02:21:38 localhost systemd-logind[761]: Session 12 logged out. Waiting for processes to exit.
Nov 23 02:21:38 localhost systemd-logind[761]: Removed session 12.
Nov 23 02:21:49 localhost sshd[24615]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:21:58 localhost sshd[24617]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:23:29 localhost sshd[24619]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:23:37 localhost sshd[24621]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:25:13 localhost sshd[24624]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:25:14 localhost sshd[24626]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:25:30 localhost sshd[24628]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:26:50 localhost sshd[24630]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:26:55 localhost sshd[24632]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:28:23 localhost sshd[24636]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:28:40 localhost sshd[24638]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:30:02 localhost sshd[24642]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:30:24 localhost sshd[24645]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:31:14 localhost sshd[24647]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:31:41 localhost sshd[24649]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:32:06 localhost sshd[24651]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:33:15 localhost sshd[24653]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:33:39 localhost sshd[24655]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:34:49 localhost sshd[24657]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:35:12 localhost sshd[24661]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:36:24 localhost sshd[24664]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:36:38 localhost sshd[24666]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:36:50 localhost sshd[24668]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:37:15 localhost sshd[24670]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:37:15 localhost systemd-logind[761]: New session 13 of user zuul.
Nov 23 02:37:15 localhost systemd[1]: Started Session 13 of User zuul.
Nov 23 02:37:15 localhost python3[24718]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 02:37:17 localhost python3[24805]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 02:37:20 localhost python3[24822]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 02:37:21 localhost python3[24838]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:37:21 localhost kernel: loop: module loaded
Nov 23 02:37:21 localhost kernel: loop3: detected capacity change from 0 to 14680064
Nov 23 02:37:21 localhost python3[24863]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:37:21 localhost lvm[24866]: PV /dev/loop3 not used.
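[Editor's note: the two ansible-ansible.legacy.command tasks above prepare a loop-backed OSD: a sparse 7 GiB image (dd with count=0 plus seek allocates no data), a loop device over it, then PV, VG, and LV creation. A sketch that reconstructs that command sequence; image, VG, and LV names follow the log, and the loop device is a parameter because this run happened to get /dev/loop3 and /dev/loop4. Running the commands for real requires root:]

```python
# Reconstruct the loop-backed OSD preparation sequence from the log:
# sparse image -> loop device -> pvcreate -> vgcreate -> lvcreate.
def osd_backing_cmds(osd: int, loopdev: str, size: str = "7G") -> list:
    img = "/var/lib/ceph-osd-%d.img" % osd
    return [
        # count=0 with seek=<size> creates a sparse file: no blocks written.
        "dd if=/dev/zero of=%s bs=1 count=0 seek=%s" % (img, size),
        "losetup %s %s" % (loopdev, img),
        "pvcreate %s" % loopdev,
        "vgcreate ceph_vg%d %s" % (osd, loopdev),
        # -l +100%FREE gives the single LV the whole VG.
        "lvcreate -n ceph_lv%d -l +100%%FREE ceph_vg%d" % (osd, osd),
    ]

if __name__ == "__main__":
    for cmd in osd_backing_cmds(0, "/dev/loop3"):
        print(cmd)
```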
Nov 23 02:37:21 localhost lvm[24868]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 02:37:21 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Nov 23 02:37:21 localhost lvm[24878]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 02:37:21 localhost lvm[24878]: VG ceph_vg0 finished
Nov 23 02:37:21 localhost lvm[24877]: 1 logical volume(s) in volume group "ceph_vg0" now active
Nov 23 02:37:22 localhost systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Nov 23 02:37:22 localhost python3[24926]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:37:23 localhost python3[24969]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763883442.2949338-55245-19214359527875/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:37:23 localhost python3[24999]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 02:37:23 localhost systemd[1]: Reloading.
Nov 23 02:37:24 localhost systemd-rc-local-generator[25023]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:37:24 localhost systemd-sysv-generator[25027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:37:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:37:24 localhost systemd[1]: Starting Ceph OSD losetup...
Nov 23 02:37:24 localhost bash[25039]: /dev/loop3: [64516]:8400144 (/var/lib/ceph-osd-0.img)
Nov 23 02:37:24 localhost systemd[1]: Finished Ceph OSD losetup.
Nov 23 02:37:24 localhost lvm[25040]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 02:37:24 localhost lvm[25040]: VG ceph_vg0 finished
Nov 23 02:37:24 localhost python3[25057]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 02:37:27 localhost python3[25074]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 02:37:28 localhost python3[25090]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G#012losetup /dev/loop4 /var/lib/ceph-osd-1.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:37:28 localhost kernel: loop4: detected capacity change from 0 to 14680064
Nov 23 02:37:28 localhost python3[25112]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4#012vgcreate ceph_vg1 /dev/loop4#012lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:37:28 localhost lvm[25115]: PV /dev/loop4 not used.
Nov 23 02:37:29 localhost lvm[25125]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 23 02:37:29 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Nov 23 02:37:29 localhost lvm[25127]: 1 logical volume(s) in volume group "ceph_vg1" now active
Nov 23 02:37:29 localhost systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Nov 23 02:37:29 localhost python3[25175]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:37:30 localhost python3[25218]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763883449.3683252-55449-238753956326735/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:37:30 localhost python3[25248]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 02:37:30 localhost systemd[1]: Reloading.
Nov 23 02:37:30 localhost systemd-sysv-generator[25274]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:37:30 localhost systemd-rc-local-generator[25271]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:37:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:37:31 localhost systemd[1]: Starting Ceph OSD losetup...
Nov 23 02:37:31 localhost bash[25289]: /dev/loop4: [64516]:8399529 (/var/lib/ceph-osd-1.img)
Nov 23 02:37:31 localhost systemd[1]: Finished Ceph OSD losetup.
Nov 23 02:37:31 localhost lvm[25290]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 23 02:37:31 localhost lvm[25290]: VG ceph_vg1 finished
Nov 23 02:37:40 localhost python3[25335]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 02:37:41 localhost python3[25355]: ansible-hostname Invoked with name=np0005532585.localdomain use=None
Nov 23 02:37:41 localhost systemd[1]: Starting Hostname Service...
Nov 23 02:37:42 localhost systemd[1]: Started Hostname Service.
Nov 23 02:37:44 localhost python3[25378]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Nov 23 02:37:44 localhost python3[25426]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.kt75qvnutmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:37:45 localhost python3[25456]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.kt75qvnutmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:37:45 localhost python3[25472]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.kt75qvnutmphosts insertbefore=BOF block=192.168.122.106 np0005532584.localdomain np0005532584#012192.168.122.106 np0005532584.ctlplane.localdomain np0005532584.ctlplane#012192.168.122.107 np0005532585.localdomain np0005532585#012192.168.122.107 np0005532585.ctlplane.localdomain np0005532585.ctlplane#012192.168.122.108 np0005532586.localdomain np0005532586#012192.168.122.108 np0005532586.ctlplane.localdomain np0005532586.ctlplane#012192.168.122.103 np0005532581.localdomain np0005532581#012192.168.122.103 np0005532581.ctlplane.localdomain np0005532581.ctlplane#012192.168.122.104 np0005532582.localdomain np0005532582#012192.168.122.104 np0005532582.ctlplane.localdomain np0005532582.ctlplane#012192.168.122.105 np0005532583.localdomain np0005532583#012192.168.122.105 np0005532583.ctlplane.localdomain np0005532583.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:37:46 localhost python3[25488]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.kt75qvnutmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:37:46 localhost python3[25505]: ansible-file Invoked with path=/tmp/ansible.kt75qvnutmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:37:48 localhost python3[25521]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:37:50 localhost python3[25539]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 02:37:54 localhost python3[25588]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:37:54 localhost python3[25633]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763883474.032404-56296-69872870508002/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:37:56 localhost python3[25663]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 02:37:56 localhost python3[25681]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 02:37:56 localhost chronyd[767]: chronyd exiting
Nov 23 02:37:56 localhost systemd[1]: Stopping NTP client/server...
Nov 23 02:37:56 localhost systemd[1]: chronyd.service: Deactivated successfully.
Nov 23 02:37:56 localhost systemd[1]: Stopped NTP client/server.
Nov 23 02:37:56 localhost systemd[1]: chronyd.service: Consumed 116ms CPU time, read 1.9M from disk, written 0B to disk.
Nov 23 02:37:56 localhost systemd[1]: Starting NTP client/server...
Nov 23 02:37:56 localhost chronyd[25688]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 23 02:37:56 localhost chronyd[25688]: Frequency -30.271 +/- 0.118 ppm read from /var/lib/chrony/drift
Nov 23 02:37:56 localhost chronyd[25688]: Loaded seccomp filter (level 2)
Nov 23 02:37:56 localhost systemd[1]: Started NTP client/server.
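[Editor's note: the ansible-blockinfile tasks above edit a temporary copy of /etc/hosts by replacing everything between "# START_HOST_ENTRIES_FOR_STACK: overcloud" and "# END_HOST_ENTRIES_FOR_STACK: overcloud" markers, inserting at the top of the file (insertbefore=BOF). A minimal sketch of that marker-delimited rewrite, operating on a string rather than the real file; marker text is taken from the log:]

```python
# Replace (or create) a marker-delimited managed block at the top of a
# hosts-file-style text, the way ansible-blockinfile does above.
BEGIN = "# START_HOST_ENTRIES_FOR_STACK: overcloud"
END = "# END_HOST_ENTRIES_FOR_STACK: overcloud"

def set_marked_block(text: str, block: str) -> str:
    lines = text.splitlines()
    # Drop a previous copy of the managed block, if one exists.
    if BEGIN in lines and END in lines:
        i, j = lines.index(BEGIN), lines.index(END)
        del lines[i:j + 1]
    # insertbefore=BOF in the log: the fresh block goes first.
    return "\n".join([BEGIN, block, END] + lines) + "\n"

if __name__ == "__main__":
    hosts = "127.0.0.1 localhost\n"
    entry = "192.168.122.107 np0005532585.localdomain np0005532585"
    print(set_marked_block(hosts, entry))
```

Because the old block is removed before the new one is inserted, re-running the edit with the same content is idempotent, which matches the repeated, convergent ansible runs in the log.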
Nov 23 02:37:57 localhost python3[25737]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:37:58 localhost python3[25780]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763883477.4283235-56485-280233936923020/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:37:58 localhost python3[25810]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 02:37:58 localhost systemd[1]: Reloading.
Nov 23 02:37:58 localhost systemd-rc-local-generator[25834]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:37:58 localhost systemd-sysv-generator[25839]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:37:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:37:58 localhost systemd[1]: Reloading.
Nov 23 02:37:59 localhost systemd-sysv-generator[25877]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:37:59 localhost systemd-rc-local-generator[25874]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:37:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:37:59 localhost systemd[1]: Starting chronyd online sources service...
Nov 23 02:37:59 localhost chronyc[25887]: 200 OK
Nov 23 02:37:59 localhost systemd[1]: chrony-online.service: Deactivated successfully.
Nov 23 02:37:59 localhost systemd[1]: Finished chronyd online sources service.
Nov 23 02:37:59 localhost python3[25903]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:37:59 localhost chronyd[25688]: System clock was stepped by 0.000000 seconds
Nov 23 02:38:00 localhost python3[25920]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:38:01 localhost chronyd[25688]: Selected source 23.133.168.245 (pool.ntp.org)
Nov 23 02:38:10 localhost python3[25937]: ansible-timezone Invoked with name=UTC hwclock=None
Nov 23 02:38:10 localhost systemd[1]: Starting Time & Date Service...
Nov 23 02:38:10 localhost systemd[1]: Started Time & Date Service.
Nov 23 02:38:12 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 23 02:38:12 localhost python3[25960]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 02:38:12 localhost chronyd[25688]: chronyd exiting
Nov 23 02:38:12 localhost systemd[1]: Stopping NTP client/server...
Nov 23 02:38:12 localhost systemd[1]: chronyd.service: Deactivated successfully.
Nov 23 02:38:12 localhost systemd[1]: Stopped NTP client/server.
Nov 23 02:38:12 localhost systemd[1]: Starting NTP client/server...
Nov 23 02:38:12 localhost chronyd[25967]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 23 02:38:12 localhost chronyd[25967]: Frequency -30.271 +/- 0.125 ppm read from /var/lib/chrony/drift
Nov 23 02:38:12 localhost chronyd[25967]: Loaded seccomp filter (level 2)
Nov 23 02:38:12 localhost systemd[1]: Started NTP client/server.
Nov 23 02:38:17 localhost chronyd[25967]: Selected source 23.133.168.245 (pool.ntp.org)
Nov 23 02:38:23 localhost sshd[25969]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:38:25 localhost sshd[25971]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:38:25 localhost sshd[25973]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:38:40 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 23 02:39:56 localhost sshd[26170]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:40:04 localhost sshd[26172]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:40:14 localhost sshd[26174]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:40:14 localhost systemd-logind[761]: New session 14 of user ceph-admin.
Nov 23 02:40:14 localhost systemd[1]: Created slice User Slice of UID 1002.
Nov 23 02:40:14 localhost systemd[1]: Starting User Runtime Directory /run/user/1002...
Nov 23 02:40:14 localhost systemd[1]: Finished User Runtime Directory /run/user/1002.
Nov 23 02:40:14 localhost systemd[1]: Starting User Manager for UID 1002...
Nov 23 02:40:14 localhost sshd[26191]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:40:14 localhost systemd[26178]: Queued start job for default target Main User Target.
Nov 23 02:40:14 localhost systemd[26178]: Created slice User Application Slice.
Nov 23 02:40:14 localhost systemd[26178]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 23 02:40:14 localhost systemd[26178]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 02:40:14 localhost systemd[26178]: Reached target Paths.
Nov 23 02:40:14 localhost systemd[26178]: Reached target Timers.
Nov 23 02:40:14 localhost systemd[26178]: Starting D-Bus User Message Bus Socket...
Nov 23 02:40:14 localhost systemd[26178]: Starting Create User's Volatile Files and Directories...
Nov 23 02:40:14 localhost systemd[26178]: Finished Create User's Volatile Files and Directories.
Nov 23 02:40:14 localhost systemd[26178]: Listening on D-Bus User Message Bus Socket.
Nov 23 02:40:14 localhost systemd[26178]: Reached target Sockets.
Nov 23 02:40:14 localhost systemd[26178]: Reached target Basic System.
Nov 23 02:40:14 localhost systemd[26178]: Reached target Main User Target.
Nov 23 02:40:14 localhost systemd[26178]: Startup finished in 105ms.
Nov 23 02:40:14 localhost systemd[1]: Started User Manager for UID 1002.
Nov 23 02:40:14 localhost systemd[1]: Started Session 14 of User ceph-admin.
Nov 23 02:40:14 localhost systemd-logind[761]: New session 16 of user ceph-admin.
Nov 23 02:40:14 localhost systemd[1]: Started Session 16 of User ceph-admin.
Nov 23 02:40:15 localhost sshd[26213]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:40:15 localhost systemd-logind[761]: New session 17 of user ceph-admin.
Nov 23 02:40:15 localhost systemd[1]: Started Session 17 of User ceph-admin.
Nov 23 02:40:15 localhost sshd[26232]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:40:15 localhost systemd-logind[761]: New session 18 of user ceph-admin.
Nov 23 02:40:15 localhost systemd[1]: Started Session 18 of User ceph-admin.
Nov 23 02:40:15 localhost sshd[26251]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:40:15 localhost systemd-logind[761]: New session 19 of user ceph-admin.
Nov 23 02:40:15 localhost systemd[1]: Started Session 19 of User ceph-admin.
Nov 23 02:40:16 localhost sshd[26270]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:40:16 localhost systemd-logind[761]: New session 20 of user ceph-admin.
Nov 23 02:40:16 localhost systemd[1]: Started Session 20 of User ceph-admin.
Nov 23 02:40:16 localhost sshd[26289]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:40:16 localhost systemd-logind[761]: New session 21 of user ceph-admin.
Nov 23 02:40:16 localhost systemd[1]: Started Session 21 of User ceph-admin.
Nov 23 02:40:16 localhost sshd[26308]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:40:17 localhost systemd-logind[761]: New session 22 of user ceph-admin.
Nov 23 02:40:17 localhost systemd[1]: Started Session 22 of User ceph-admin.
Nov 23 02:40:17 localhost sshd[26327]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:40:17 localhost systemd-logind[761]: New session 23 of user ceph-admin.
Nov 23 02:40:17 localhost systemd[1]: Started Session 23 of User ceph-admin.
Nov 23 02:40:17 localhost sshd[26346]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:40:17 localhost systemd-logind[761]: New session 24 of user ceph-admin.
Nov 23 02:40:17 localhost systemd[1]: Started Session 24 of User ceph-admin.
Nov 23 02:40:18 localhost sshd[26363]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:40:18 localhost systemd-logind[761]: New session 25 of user ceph-admin.
Nov 23 02:40:18 localhost systemd[1]: Started Session 25 of User ceph-admin.
Nov 23 02:40:18 localhost sshd[26382]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:40:18 localhost systemd-logind[761]: New session 26 of user ceph-admin.
Nov 23 02:40:18 localhost systemd[1]: Started Session 26 of User ceph-admin.
Nov 23 02:40:19 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 02:40:43 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 02:40:43 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 02:40:44 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 02:40:44 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 02:40:44 localhost systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 26596 (sysctl)
Nov 23 02:40:44 localhost systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 23 02:40:44 localhost systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 23 02:40:45 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 02:40:45 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 02:40:45 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 02:40:49 localhost kernel: VFS: idmapped mount is not enabled.
Nov 23 02:41:07 localhost podman[26736]:
Nov 23 02:41:07 localhost podman[26736]: 2025-11-23 07:41:07.82143409 +0000 UTC m=+22.107074887 container create 3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_ishizaka, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-type=git, RELEASE=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.expose-services=, release=553, ceph=True)
Nov 23 02:41:07 localhost systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck12222527-merged.mount: Deactivated successfully.
Nov 23 02:41:07 localhost podman[26736]: 2025-11-23 07:40:45.756348559 +0000 UTC m=+0.041989386 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 02:41:07 localhost systemd[1]: Created slice Slice /machine.
Nov 23 02:41:07 localhost systemd[1]: Started libpod-conmon-3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126.scope.
Nov 23 02:41:07 localhost systemd[1]: Started libcrun container.
Nov 23 02:41:07 localhost podman[26736]: 2025-11-23 07:41:07.926179671 +0000 UTC m=+22.211820488 container init 3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_ishizaka, com.redhat.component=rhceph-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, release=553, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 02:41:07 localhost podman[26736]: 2025-11-23 07:41:07.936262796 +0000 UTC m=+22.221903593 container start 3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_ishizaka, name=rhceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.33.12, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhceph ceph)
Nov 23 02:41:07 localhost podman[26736]: 2025-11-23 07:41:07.936687629 +0000 UTC m=+22.222328506 container attach 3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_ishizaka, release=553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, version=7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=)
Nov 23 02:41:07 localhost nifty_ishizaka[26881]: 167 167
Nov 23 02:41:07 localhost systemd[1]: libpod-3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126.scope: Deactivated successfully.
Nov 23 02:41:07 localhost podman[26736]: 2025-11-23 07:41:07.94098737 +0000 UTC m=+22.226628217 container died 3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_ishizaka, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=553, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.openshift.tags=rhceph ceph, ceph=True, io.openshift.expose-services=, io.buildah.version=1.33.12, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Nov 23 02:41:08 localhost podman[26886]: 2025-11-23 07:41:08.022960059 +0000 UTC m=+0.073125441 container remove 3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_ishizaka, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=553, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, version=7, architecture=x86_64)
Nov 23 02:41:08 localhost systemd[1]: libpod-conmon-3b92ea526d77a9da7dd8522f00105294991cc7fd3b4fb6800147cd0cdc8f1126.scope: Deactivated successfully.
Nov 23 02:41:08 localhost podman[26908]:
Nov 23 02:41:08 localhost podman[26908]: 2025-11-23 07:41:08.252101376 +0000 UTC m=+0.062417656 container create a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_faraday, version=7, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-type=git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12)
Nov 23 02:41:08 localhost systemd[1]: Started libpod-conmon-a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647.scope.
Nov 23 02:41:08 localhost systemd[1]: Started libcrun container.
Nov 23 02:41:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d501d3ca55a33bbd6dcd98428057573660dc63e7dd8e73144372362de04e8119/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 02:41:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d501d3ca55a33bbd6dcd98428057573660dc63e7dd8e73144372362de04e8119/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 02:41:08 localhost podman[26908]: 2025-11-23 07:41:08.324971078 +0000 UTC m=+0.135287348 container init a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_faraday, version=7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.buildah.version=1.33.12, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, distribution-scope=public, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553)
Nov 23 02:41:08 localhost podman[26908]: 2025-11-23 07:41:08.231344896 +0000 UTC m=+0.041661176 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 02:41:08 localhost podman[26908]: 2025-11-23 07:41:08.33162318 +0000 UTC m=+0.141939450 container start a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_faraday, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, release=553, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public)
Nov 23 02:41:08 localhost podman[26908]: 2025-11-23 07:41:08.331791705 +0000 UTC m=+0.142108035 container attach a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_faraday, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, release=553, GIT_CLEAN=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, architecture=x86_64, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public)
Nov 23 02:41:08 localhost systemd[1]: tmp-crun.sbBVmZ.mount: Deactivated successfully.
Nov 23 02:41:08 localhost systemd[1]: var-lib-containers-storage-overlay-e138ded2b6606e6b477798be8b9f6e3b9a918a0b2e76585a28268ec882051d3f-merged.mount: Deactivated successfully.
Nov 23 02:41:09 localhost dreamy_faraday[26930]: [
Nov 23 02:41:09 localhost dreamy_faraday[26930]: {
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "available": false,
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "ceph_device": false,
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "device_id": "QEMU_DVD-ROM_QM00001",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "lsm_data": {},
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "lvs": [],
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "path": "/dev/sr0",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "rejected_reasons": [
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "Has a FileSystem",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "Insufficient space (<5GB)"
Nov 23 02:41:09 localhost dreamy_faraday[26930]: ],
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "sys_api": {
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "actuators": null,
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "device_nodes": "sr0",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "human_readable_size": "482.00 KB",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "id_bus": "ata",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "model": "QEMU DVD-ROM",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "nr_requests": "2",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "partitions": {},
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "path": "/dev/sr0",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "removable": "1",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "rev": "2.5+",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "ro": "0",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "rotational": "1",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "sas_address": "",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "sas_device_handle": "",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "scheduler_mode": "mq-deadline",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "sectors": 0,
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "sectorsize": "2048",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "size": 493568.0,
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "support_discard": "0",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "type": "disk",
Nov 23 02:41:09 localhost dreamy_faraday[26930]: "vendor": "QEMU"
Nov 23 02:41:09 localhost dreamy_faraday[26930]: }
Nov 23 02:41:09 localhost dreamy_faraday[26930]: }
Nov 23 02:41:09 localhost dreamy_faraday[26930]: ]
Nov 23 02:41:09 localhost systemd[1]: libpod-a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647.scope: Deactivated successfully.
Nov 23 02:41:09 localhost podman[26908]: 2025-11-23 07:41:09.155846585 +0000 UTC m=+0.966162895 container died a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_faraday, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-09-24T08:57:55, release=553, ceph=True, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, architecture=x86_64)
Nov 23 02:41:09 localhost systemd[1]: var-lib-containers-storage-overlay-d501d3ca55a33bbd6dcd98428057573660dc63e7dd8e73144372362de04e8119-merged.mount: Deactivated successfully.
Nov 23 02:41:09 localhost podman[28381]: 2025-11-23 07:41:09.248088636 +0000 UTC m=+0.079034252 container remove a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_faraday, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, ceph=True, build-date=2025-09-24T08:57:55, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, vcs-type=git, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph)
Nov 23 02:41:09 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 02:41:09 localhost systemd[1]: libpod-conmon-a311259fa434ecf3f50cf7dccfd97c0973f45108e434d9db04a0b0a063115647.scope: Deactivated successfully.
Nov 23 02:41:09 localhost systemd[1]: systemd-coredump.socket: Deactivated successfully.
Nov 23 02:41:09 localhost systemd[1]: Closed Process Core Dump Socket.
Nov 23 02:41:09 localhost systemd[1]: Stopping Process Core Dump Socket...
Nov 23 02:41:09 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 23 02:41:09 localhost systemd[1]: Reloading.
Nov 23 02:41:09 localhost systemd-sysv-generator[28488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:41:09 localhost systemd-rc-local-generator[28485]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:41:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:41:10 localhost systemd[1]: Reloading.
Nov 23 02:41:10 localhost systemd-rc-local-generator[28526]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:41:10 localhost systemd-sysv-generator[28531]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:41:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:41:11 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 23 02:41:33 localhost sshd[28794]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:41:42 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 02:41:42 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 02:41:43 localhost sshd[28873]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 02:41:43 localhost podman[28866]:
Nov 23 02:41:43 localhost podman[28866]: 2025-11-23 07:41:43.059477627 +0000 UTC m=+0.062543223 container create 88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_austin, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, RELEASE=main, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=553, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-type=git)
Nov 23 02:41:43 localhost systemd[1]: Started libpod-conmon-88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6.scope.
Nov 23 02:41:43 localhost podman[28866]: 2025-11-23 07:41:43.028339115 +0000 UTC m=+0.031404691 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 02:41:43 localhost systemd[1]: Started libcrun container.
Nov 23 02:41:43 localhost podman[28866]: 2025-11-23 07:41:43.143700861 +0000 UTC m=+0.146766417 container init 88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_austin, ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, CEPH_POINT_RELEASE=, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 02:41:43 localhost podman[28866]: 2025-11-23 07:41:43.153487082 +0000 UTC m=+0.156552638 container start 88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_austin, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, vendor=Red Hat, Inc., release=553, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph 
Storage 7, com.redhat.component=rhceph-container, version=7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True) Nov 23 02:41:43 localhost podman[28866]: 2025-11-23 07:41:43.153615056 +0000 UTC m=+0.156680622 container attach 88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_austin, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_BRANCH=main, vendor=Red Hat, Inc., release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, version=7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, name=rhceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64) Nov 23 02:41:43 localhost wonderful_austin[28882]: 167 167 Nov 23 02:41:43 localhost systemd[1]: libpod-88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6.scope: 
Deactivated successfully. Nov 23 02:41:43 localhost podman[28866]: 2025-11-23 07:41:43.157265079 +0000 UTC m=+0.160330655 container died 88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_austin, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, ceph=True, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 02:41:43 localhost podman[28887]: 2025-11-23 07:41:43.216906762 +0000 UTC m=+0.049829893 container remove 88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_austin, GIT_BRANCH=main, ceph=True, vendor=Red Hat, Inc., RELEASE=main, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, architecture=x86_64, name=rhceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, 
io.openshift.tags=rhceph ceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 23 02:41:43 localhost systemd[1]: libpod-conmon-88da6a3f7972083b32127ec06144bab9fa8746b6d15103a5cc3815986575d7e6.scope: Deactivated successfully. Nov 23 02:41:43 localhost systemd[1]: Reloading. Nov 23 02:41:43 localhost systemd-sysv-generator[28933]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:41:43 localhost systemd-rc-local-generator[28927]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:41:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:41:43 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Nov 23 02:41:43 localhost systemd[1]: Reloading. Nov 23 02:41:43 localhost systemd-rc-local-generator[28964]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:41:43 localhost systemd-sysv-generator[28969]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 02:41:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:41:43 localhost sshd[28976]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:41:43 localhost systemd[1]: Reached target All Ceph clusters and services. Nov 23 02:41:43 localhost systemd[1]: Reloading. Nov 23 02:41:43 localhost systemd-rc-local-generator[29005]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:41:43 localhost systemd-sysv-generator[29010]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:41:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:41:43 localhost systemd[1]: Reached target Ceph cluster 46550e70-79cb-5f55-bf6d-1204b97e083b. Nov 23 02:41:43 localhost systemd[1]: Reloading. Nov 23 02:41:44 localhost systemd-rc-local-generator[29045]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:41:44 localhost systemd-sysv-generator[29049]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:41:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:41:44 localhost systemd[1]: Reloading. Nov 23 02:41:44 localhost systemd-rc-local-generator[29086]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 23 02:41:44 localhost systemd-sysv-generator[29090]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:41:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:41:44 localhost systemd[1]: Created slice Slice /system/ceph-46550e70-79cb-5f55-bf6d-1204b97e083b. Nov 23 02:41:44 localhost systemd[1]: Reached target System Time Set. Nov 23 02:41:44 localhost systemd[1]: Reached target System Time Synchronized. Nov 23 02:41:44 localhost systemd[1]: Starting Ceph crash.np0005532585 for 46550e70-79cb-5f55-bf6d-1204b97e083b... Nov 23 02:41:44 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Nov 23 02:41:44 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. 
Nov 23 02:41:44 localhost podman[29146]: Nov 23 02:41:44 localhost podman[29146]: 2025-11-23 07:41:44.743624465 +0000 UTC m=+0.058457115 container create 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, release=553, architecture=x86_64, io.buildah.version=1.33.12, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=) Nov 23 02:41:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54336affea3174046a26e5cc0b745b5cc6b278f9d4d0515655329cf08f000053/merged/etc/ceph/ceph.client.crash.np0005532585.keyring supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54336affea3174046a26e5cc0b745b5cc6b278f9d4d0515655329cf08f000053/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:44 localhost podman[29146]: 2025-11-23 07:41:44.713804658 +0000 UTC m=+0.028637328 image pull 
registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 02:41:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54336affea3174046a26e5cc0b745b5cc6b278f9d4d0515655329cf08f000053/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:44 localhost podman[29146]: 2025-11-23 07:41:44.832225737 +0000 UTC m=+0.147058387 container init 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, architecture=x86_64, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, ceph=True) Nov 23 02:41:44 localhost podman[29146]: 2025-11-23 07:41:44.842580717 +0000 UTC m=+0.157413397 container start 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, 
io.k8s.description=Red Hat Ceph Storage 7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, vcs-type=git, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_BRANCH=main, name=rhceph, distribution-scope=public, io.buildah.version=1.33.12, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , release=553, CEPH_POINT_RELEASE=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 02:41:44 localhost bash[29146]: 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee Nov 23 02:41:44 localhost systemd[1]: Started Ceph crash.np0005532585 for 46550e70-79cb-5f55-bf6d-1204b97e083b. 
Nov 23 02:41:44 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: INFO:ceph-crash:pinging cluster to exercise our key Nov 23 02:41:45 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.005+0000 7fba80027640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory Nov 23 02:41:45 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.005+0000 7fba80027640 -1 AuthRegistry(0x7fba780680d0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx Nov 23 02:41:45 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.006+0000 7fba80027640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory Nov 23 02:41:45 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.006+0000 7fba80027640 -1 AuthRegistry(0x7fba80026000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx Nov 23 02:41:45 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.015+0000 7fba7d59b640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1] Nov 23 02:41:45 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.016+0000 7fba7e59d640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1] Nov 23 02:41:45 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.017+0000 7fba7dd9c640 -1 monclient(hunting): 
handle_auth_bad_method server allowed_methods [2] but i only support [1] Nov 23 02:41:45 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: 2025-11-23T07:41:45.017+0000 7fba80027640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication Nov 23 02:41:45 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: [errno 13] RADOS permission denied (error connecting to the cluster) Nov 23 02:41:45 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585[29160]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s Nov 23 02:41:45 localhost podman[29247]: Nov 23 02:41:45 localhost podman[29247]: 2025-11-23 07:41:45.628418041 +0000 UTC m=+0.074950942 container create baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_brahmagupta, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, release=553, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.expose-services=) Nov 23 02:41:45 localhost systemd[1]: Started 
libpod-conmon-baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb.scope. Nov 23 02:41:45 localhost systemd[1]: Started libcrun container. Nov 23 02:41:45 localhost podman[29247]: 2025-11-23 07:41:45.597223248 +0000 UTC m=+0.043756079 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 02:41:45 localhost podman[29247]: 2025-11-23 07:41:45.70093409 +0000 UTC m=+0.147466951 container init baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_brahmagupta, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, RELEASE=main, io.buildah.version=1.33.12, GIT_BRANCH=main, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7) Nov 23 02:41:45 localhost podman[29247]: 2025-11-23 07:41:45.711691253 +0000 UTC m=+0.158224084 container start baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_brahmagupta, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , ceph=True, version=7, 
io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.component=rhceph-container, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 23 02:41:45 localhost podman[29247]: 2025-11-23 07:41:45.711978503 +0000 UTC m=+0.158511334 container attach baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_brahmagupta, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_CLEAN=True, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
architecture=x86_64, vcs-type=git, distribution-scope=public, release=553, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=) Nov 23 02:41:45 localhost priceless_brahmagupta[29261]: 167 167 Nov 23 02:41:45 localhost systemd[1]: libpod-baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb.scope: Deactivated successfully. Nov 23 02:41:45 localhost podman[29247]: 2025-11-23 07:41:45.716634331 +0000 UTC m=+0.163167162 container died baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_brahmagupta, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.openshift.expose-services=, release=553, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 23 02:41:45 localhost podman[29266]: 2025-11-23 07:41:45.793411263 +0000 UTC m=+0.068931949 container remove baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_brahmagupta, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , io.openshift.expose-services=, com.redhat.component=rhceph-container, RELEASE=main, GIT_BRANCH=main, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, version=7) Nov 23 02:41:45 localhost systemd[1]: libpod-conmon-baa5f4e2f835d6f9c1149bfb031d3273bc1170c9288f966b977199fb4cd8cfcb.scope: Deactivated successfully. Nov 23 02:41:45 localhost systemd[1]: var-lib-containers-storage-overlay-c77b4887970b0551a3b6368b29506d225a364e531320f0a81721dce2e9ab5323-merged.mount: Deactivated successfully. 
Nov 23 02:41:45 localhost podman[29285]: Nov 23 02:41:45 localhost podman[29285]: 2025-11-23 07:41:45.992852347 +0000 UTC m=+0.068121491 container create 404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_elgamal, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, ceph=True, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 02:41:46 localhost systemd[1]: Started libpod-conmon-404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa.scope. Nov 23 02:41:46 localhost systemd[1]: Started libcrun container. 
Nov 23 02:41:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48b35e8acbad48239b8fb1ddd624c47be6f2e72f11f65ba4615a86a4cdc7df6a/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:46 localhost podman[29285]: 2025-11-23 07:41:45.964728897 +0000 UTC m=+0.039998011 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 02:41:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48b35e8acbad48239b8fb1ddd624c47be6f2e72f11f65ba4615a86a4cdc7df6a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48b35e8acbad48239b8fb1ddd624c47be6f2e72f11f65ba4615a86a4cdc7df6a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48b35e8acbad48239b8fb1ddd624c47be6f2e72f11f65ba4615a86a4cdc7df6a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48b35e8acbad48239b8fb1ddd624c47be6f2e72f11f65ba4615a86a4cdc7df6a/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:46 localhost podman[29285]: 2025-11-23 07:41:46.101304729 +0000 UTC m=+0.176573813 container init 404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_elgamal, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, release=553, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest 
Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-type=git, version=7, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux ) Nov 23 02:41:46 localhost podman[29285]: 2025-11-23 07:41:46.11166964 +0000 UTC m=+0.186938774 container start 404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_elgamal, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, ceph=True, version=7, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux ) Nov 23 02:41:46 localhost podman[29285]: 2025-11-23 07:41:46.11258614 +0000 UTC m=+0.187855244 
container attach 404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_elgamal, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, RELEASE=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, com.redhat.component=rhceph-container, version=7, release=553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, name=rhceph) Nov 23 02:41:46 localhost focused_elgamal[29301]: --> passed data devices: 0 physical, 2 LVM Nov 23 02:41:46 localhost focused_elgamal[29301]: --> relative data size: 1.0 Nov 23 02:41:46 localhost focused_elgamal[29301]: Running command: /usr/bin/ceph-authtool --gen-print-key Nov 23 02:41:46 localhost focused_elgamal[29301]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new f9b87c6d-5b03-42fe-8ffc-4e3a3ac47b90 Nov 23 02:41:47 localhost focused_elgamal[29301]: Running command: /usr/bin/ceph-authtool --gen-print-key Nov 23 02:41:47 localhost lvm[29355]: PV /dev/loop3 online, VG ceph_vg0 is complete. 
Nov 23 02:41:47 localhost lvm[29355]: VG ceph_vg0 finished Nov 23 02:41:47 localhost focused_elgamal[29301]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0 Nov 23 02:41:47 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0 Nov 23 02:41:47 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Nov 23 02:41:47 localhost focused_elgamal[29301]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block Nov 23 02:41:47 localhost focused_elgamal[29301]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap Nov 23 02:41:47 localhost focused_elgamal[29301]: stderr: got monmap epoch 3 Nov 23 02:41:47 localhost focused_elgamal[29301]: --> Creating keyring file for osd.0 Nov 23 02:41:47 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring Nov 23 02:41:47 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/ Nov 23 02:41:47 localhost focused_elgamal[29301]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid f9b87c6d-5b03-42fe-8ffc-4e3a3ac47b90 --setuser ceph --setgroup ceph Nov 23 02:41:49 localhost focused_elgamal[29301]: stderr: 2025-11-23T07:41:47.752+0000 7ff72052aa80 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3] Nov 23 02:41:49 localhost focused_elgamal[29301]: stderr: 2025-11-23T07:41:47.752+0000 7ff72052aa80 -1 
bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid Nov 23 02:41:49 localhost focused_elgamal[29301]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0 Nov 23 02:41:50 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Nov 23 02:41:50 localhost focused_elgamal[29301]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config Nov 23 02:41:50 localhost focused_elgamal[29301]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block Nov 23 02:41:50 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block Nov 23 02:41:50 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Nov 23 02:41:50 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Nov 23 02:41:50 localhost focused_elgamal[29301]: --> ceph-volume lvm activate successful for osd ID: 0 Nov 23 02:41:50 localhost focused_elgamal[29301]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0 Nov 23 02:41:50 localhost focused_elgamal[29301]: Running command: /usr/bin/ceph-authtool --gen-print-key Nov 23 02:41:50 localhost focused_elgamal[29301]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 445e9929-9fbe-437e-be2a-5f2d52ad535b Nov 23 02:41:50 localhost lvm[30303]: PV /dev/loop4 online, VG ceph_vg1 is complete. 
Nov 23 02:41:50 localhost lvm[30303]: VG ceph_vg1 finished Nov 23 02:41:50 localhost focused_elgamal[29301]: Running command: /usr/bin/ceph-authtool --gen-print-key Nov 23 02:41:50 localhost focused_elgamal[29301]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-3 Nov 23 02:41:50 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1 Nov 23 02:41:50 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Nov 23 02:41:50 localhost focused_elgamal[29301]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-3/block Nov 23 02:41:50 localhost focused_elgamal[29301]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-3/activate.monmap Nov 23 02:41:51 localhost focused_elgamal[29301]: stderr: got monmap epoch 3 Nov 23 02:41:51 localhost focused_elgamal[29301]: --> Creating keyring file for osd.3 Nov 23 02:41:51 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3/keyring Nov 23 02:41:51 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3/ Nov 23 02:41:51 localhost focused_elgamal[29301]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 3 --monmap /var/lib/ceph/osd/ceph-3/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-3/ --osd-uuid 445e9929-9fbe-437e-be2a-5f2d52ad535b --setuser ceph --setgroup ceph Nov 23 02:41:53 localhost focused_elgamal[29301]: stderr: 2025-11-23T07:41:51.225+0000 7f84b7679a80 -1 bluestore(/var/lib/ceph/osd/ceph-3//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3] Nov 23 
02:41:53 localhost focused_elgamal[29301]: stderr: 2025-11-23T07:41:51.227+0000 7f84b7679a80 -1 bluestore(/var/lib/ceph/osd/ceph-3/) _read_fsid unparsable uuid Nov 23 02:41:53 localhost focused_elgamal[29301]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1 Nov 23 02:41:53 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Nov 23 02:41:53 localhost focused_elgamal[29301]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-3 --no-mon-config Nov 23 02:41:53 localhost focused_elgamal[29301]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-3/block Nov 23 02:41:53 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block Nov 23 02:41:53 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Nov 23 02:41:53 localhost focused_elgamal[29301]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Nov 23 02:41:53 localhost focused_elgamal[29301]: --> ceph-volume lvm activate successful for osd ID: 3 Nov 23 02:41:53 localhost focused_elgamal[29301]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1 Nov 23 02:41:53 localhost systemd[1]: libpod-404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa.scope: Deactivated successfully. Nov 23 02:41:53 localhost systemd[1]: libpod-404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa.scope: Consumed 3.604s CPU time. 
Nov 23 02:41:53 localhost podman[31218]: 2025-11-23 07:41:53.639371065 +0000 UTC m=+0.043710907 container died 404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_elgamal, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, release=553, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_CLEAN=True, name=rhceph, GIT_BRANCH=main, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 23 02:41:53 localhost systemd[1]: var-lib-containers-storage-overlay-48b35e8acbad48239b8fb1ddd624c47be6f2e72f11f65ba4615a86a4cdc7df6a-merged.mount: Deactivated successfully. 
Nov 23 02:41:53 localhost podman[31218]: 2025-11-23 07:41:53.674160519 +0000 UTC m=+0.078500311 container remove 404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_elgamal, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, GIT_CLEAN=True, architecture=x86_64, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux ) Nov 23 02:41:53 localhost systemd[1]: libpod-conmon-404b282678b23f8afd0a4f03dd58d68d9a0780a47f0ece3b27bf100ca3c61dfa.scope: Deactivated successfully. 
Nov 23 02:41:54 localhost podman[31299]: Nov 23 02:41:54 localhost podman[31299]: 2025-11-23 07:41:54.391393248 +0000 UTC m=+0.073133681 container create 368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_cori, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, name=rhceph, version=7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.33.12, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, RELEASE=main, build-date=2025-09-24T08:57:55, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph) Nov 23 02:41:54 localhost systemd[1]: Started libpod-conmon-368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77.scope. Nov 23 02:41:54 localhost systemd[1]: Started libcrun container. 
Nov 23 02:41:54 localhost podman[31299]: 2025-11-23 07:41:54.361132866 +0000 UTC m=+0.042873289 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 02:41:54 localhost podman[31299]: 2025-11-23 07:41:54.473246921 +0000 UTC m=+0.154987344 container init 368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_cori, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, ceph=True, architecture=x86_64, version=7, name=rhceph, release=553, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_CLEAN=True, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 02:41:54 localhost podman[31299]: 2025-11-23 07:41:54.483929832 +0000 UTC m=+0.165670255 container start 368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_cori, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
io.buildah.version=1.33.12, vendor=Red Hat, Inc., ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55) Nov 23 02:41:54 localhost podman[31299]: 2025-11-23 07:41:54.484240443 +0000 UTC m=+0.165980876 container attach 368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_cori, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_CLEAN=True, version=7, release=553) Nov 23 02:41:54 localhost 
jolly_cori[31314]: 167 167 Nov 23 02:41:54 localhost systemd[1]: libpod-368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77.scope: Deactivated successfully. Nov 23 02:41:54 localhost podman[31299]: 2025-11-23 07:41:54.488861529 +0000 UTC m=+0.170601982 container died 368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_cori, RELEASE=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , version=7, GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, release=553, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git) Nov 23 02:41:54 localhost podman[31319]: 2025-11-23 07:41:54.578057811 +0000 UTC m=+0.075767929 container remove 368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_cori, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_BRANCH=main, vcs-type=git, architecture=x86_64, 
RELEASE=main, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 23 02:41:54 localhost systemd[1]: libpod-conmon-368e51673ea578dc77b1a1ff77b50d2c0ad4c3a589c676be4ee287724f7a0f77.scope: Deactivated successfully. Nov 23 02:41:54 localhost systemd[1]: var-lib-containers-storage-overlay-3682c77af88429b49254555f73f1b71d36e838694a62cdab207aa7c40ab69b3c-merged.mount: Deactivated successfully. 
Nov 23 02:41:54 localhost podman[31339]: Nov 23 02:41:54 localhost podman[31339]: 2025-11-23 07:41:54.794185879 +0000 UTC m=+0.084232025 container create c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_fermi, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , architecture=x86_64, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 02:41:54 localhost systemd[1]: Started libpod-conmon-c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6.scope. Nov 23 02:41:54 localhost systemd[1]: Started libcrun container. 
Nov 23 02:41:54 localhost podman[31339]: 2025-11-23 07:41:54.765728418 +0000 UTC m=+0.055774564 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 02:41:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad50d2a6c88a805a2e28f4361c542a29610a059cbc4c86c2313ff268dce8e7df/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad50d2a6c88a805a2e28f4361c542a29610a059cbc4c86c2313ff268dce8e7df/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad50d2a6c88a805a2e28f4361c542a29610a059cbc4c86c2313ff268dce8e7df/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:54 localhost podman[31339]: 2025-11-23 07:41:54.898114978 +0000 UTC m=+0.188161124 container init c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_fermi, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, ceph=True, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux ) Nov 23 02:41:54 localhost podman[31339]: 2025-11-23 07:41:54.909320807 +0000 UTC m=+0.199366953 container start c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_fermi, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vcs-type=git, description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=553, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph) Nov 23 02:41:54 localhost podman[31339]: 2025-11-23 07:41:54.909536144 +0000 UTC m=+0.199582300 container attach c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_fermi, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , distribution-scope=public, CEPH_POINT_RELEASE=, 
io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., release=553, ceph=True) Nov 23 02:41:55 localhost lucid_fermi[31355]: { Nov 23 02:41:55 localhost lucid_fermi[31355]: "0": [ Nov 23 02:41:55 localhost lucid_fermi[31355]: { Nov 23 02:41:55 localhost lucid_fermi[31355]: "devices": [ Nov 23 02:41:55 localhost lucid_fermi[31355]: "/dev/loop3" Nov 23 02:41:55 localhost lucid_fermi[31355]: ], Nov 23 02:41:55 localhost lucid_fermi[31355]: "lv_name": "ceph_lv0", Nov 23 02:41:55 localhost lucid_fermi[31355]: "lv_path": "/dev/ceph_vg0/ceph_lv0", Nov 23 02:41:55 localhost lucid_fermi[31355]: "lv_size": "7511998464", Nov 23 02:41:55 localhost lucid_fermi[31355]: "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=2tCPCp-iGLo-2Hfx-dJfg-tPJt-XdH9-0kGXBa,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=46550e70-79cb-5f55-bf6d-1204b97e083b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=f9b87c6d-5b03-42fe-8ffc-4e3a3ac47b90,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0", Nov 23 02:41:55 localhost lucid_fermi[31355]: "lv_uuid": "2tCPCp-iGLo-2Hfx-dJfg-tPJt-XdH9-0kGXBa", Nov 23 02:41:55 localhost lucid_fermi[31355]: "name": "ceph_lv0", Nov 23 02:41:55 localhost lucid_fermi[31355]: "path": "/dev/ceph_vg0/ceph_lv0", Nov 23 02:41:55 localhost lucid_fermi[31355]: "tags": { Nov 23 
02:41:55 localhost lucid_fermi[31355]: "ceph.block_device": "/dev/ceph_vg0/ceph_lv0", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.block_uuid": "2tCPCp-iGLo-2Hfx-dJfg-tPJt-XdH9-0kGXBa", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.cephx_lockbox_secret": "", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.cluster_fsid": "46550e70-79cb-5f55-bf6d-1204b97e083b", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.cluster_name": "ceph", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.crush_device_class": "", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.encrypted": "0", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.osd_fsid": "f9b87c6d-5b03-42fe-8ffc-4e3a3ac47b90", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.osd_id": "0", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.osdspec_affinity": "default_drive_group", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.type": "block", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.vdo": "0" Nov 23 02:41:55 localhost lucid_fermi[31355]: }, Nov 23 02:41:55 localhost lucid_fermi[31355]: "type": "block", Nov 23 02:41:55 localhost lucid_fermi[31355]: "vg_name": "ceph_vg0" Nov 23 02:41:55 localhost lucid_fermi[31355]: } Nov 23 02:41:55 localhost lucid_fermi[31355]: ], Nov 23 02:41:55 localhost lucid_fermi[31355]: "3": [ Nov 23 02:41:55 localhost lucid_fermi[31355]: { Nov 23 02:41:55 localhost lucid_fermi[31355]: "devices": [ Nov 23 02:41:55 localhost lucid_fermi[31355]: "/dev/loop4" Nov 23 02:41:55 localhost lucid_fermi[31355]: ], Nov 23 02:41:55 localhost lucid_fermi[31355]: "lv_name": "ceph_lv1", Nov 23 02:41:55 localhost lucid_fermi[31355]: "lv_path": "/dev/ceph_vg1/ceph_lv1", Nov 23 02:41:55 localhost lucid_fermi[31355]: "lv_size": "7511998464", Nov 23 02:41:55 localhost lucid_fermi[31355]: "lv_tags": 
"ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=2JMRQ6-FbUg-ZwHu-8fpn-2v4n-Eves-jpm5ot,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=46550e70-79cb-5f55-bf6d-1204b97e083b,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=445e9929-9fbe-437e-be2a-5f2d52ad535b,ceph.osd_id=3,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0", Nov 23 02:41:55 localhost lucid_fermi[31355]: "lv_uuid": "2JMRQ6-FbUg-ZwHu-8fpn-2v4n-Eves-jpm5ot", Nov 23 02:41:55 localhost lucid_fermi[31355]: "name": "ceph_lv1", Nov 23 02:41:55 localhost lucid_fermi[31355]: "path": "/dev/ceph_vg1/ceph_lv1", Nov 23 02:41:55 localhost lucid_fermi[31355]: "tags": { Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.block_device": "/dev/ceph_vg1/ceph_lv1", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.block_uuid": "2JMRQ6-FbUg-ZwHu-8fpn-2v4n-Eves-jpm5ot", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.cephx_lockbox_secret": "", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.cluster_fsid": "46550e70-79cb-5f55-bf6d-1204b97e083b", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.cluster_name": "ceph", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.crush_device_class": "", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.encrypted": "0", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.osd_fsid": "445e9929-9fbe-437e-be2a-5f2d52ad535b", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.osd_id": "3", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.osdspec_affinity": "default_drive_group", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.type": "block", Nov 23 02:41:55 localhost lucid_fermi[31355]: "ceph.vdo": "0" Nov 23 02:41:55 localhost lucid_fermi[31355]: }, Nov 23 02:41:55 localhost lucid_fermi[31355]: "type": "block", Nov 23 02:41:55 localhost lucid_fermi[31355]: "vg_name": "ceph_vg1" Nov 23 02:41:55 localhost lucid_fermi[31355]: } Nov 23 02:41:55 localhost lucid_fermi[31355]: ] Nov 23 02:41:55 localhost 
lucid_fermi[31355]: } Nov 23 02:41:55 localhost systemd[1]: libpod-c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6.scope: Deactivated successfully. Nov 23 02:41:55 localhost podman[31339]: 2025-11-23 07:41:55.287950061 +0000 UTC m=+0.577996257 container died c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_fermi, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, architecture=x86_64, io.buildah.version=1.33.12, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.openshift.expose-services=) Nov 23 02:41:55 localhost podman[31364]: 2025-11-23 07:41:55.371067188 +0000 UTC m=+0.071841037 container remove c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_fermi, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vendor=Red Hat, Inc., architecture=x86_64, version=7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=553, 
com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55) Nov 23 02:41:55 localhost systemd[1]: libpod-conmon-c636e457b570acd61428d44ea37366fa349f9c77a9741771187e776f756fdbd6.scope: Deactivated successfully. Nov 23 02:41:55 localhost systemd[1]: var-lib-containers-storage-overlay-ad50d2a6c88a805a2e28f4361c542a29610a059cbc4c86c2313ff268dce8e7df-merged.mount: Deactivated successfully. 
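The ceph-volume output above ends a JSON listing of the OSD logical volume and its LVM tags. A minimal Python sketch, using a pared-down reconstruction of that record (values copied from the journal entries above; the real output lists one such record per LV), shows how the tag block maps to the fields an operator usually needs:

```python
import json

# Pared-down reconstruction of the LV record logged by ceph-volume above.
lv_record = json.loads("""
{
  "lv_uuid": "2JMRQ6-FbUg-ZwHu-8fpn-2v4n-Eves-jpm5ot",
  "name": "ceph_lv1",
  "path": "/dev/ceph_vg1/ceph_lv1",
  "tags": {
    "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
    "ceph.cluster_fsid": "46550e70-79cb-5f55-bf6d-1204b97e083b",
    "ceph.cluster_name": "ceph",
    "ceph.encrypted": "0",
    "ceph.osd_fsid": "445e9929-9fbe-437e-be2a-5f2d52ad535b",
    "ceph.osd_id": "3",
    "ceph.osdspec_affinity": "default_drive_group",
    "ceph.type": "block",
    "ceph.vdo": "0"
  },
  "type": "block",
  "vg_name": "ceph_vg1"
}
""")

def osd_mapping(record):
    """Extract the OSD identity fields from an LV's ceph.* tags."""
    tags = record["tags"]
    return {
        "osd_id": int(tags["ceph.osd_id"]),
        "osd_fsid": tags["ceph.osd_fsid"],
        "device": tags["ceph.block_device"],
        "encrypted": tags["ceph.encrypted"] == "1",
    }

print(osd_mapping(lv_record))
```

Note that all tag values, including `ceph.osd_id` and the boolean-like flags, are stored as strings in the LVM metadata, so the sketch converts them explicitly.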
Nov 23 02:41:56 localhost podman[31449]: Nov 23 02:41:56 localhost podman[31449]: 2025-11-23 07:41:56.120210194 +0000 UTC m=+0.071457593 container create 7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_mcclintock, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, name=rhceph, CEPH_POINT_RELEASE=, version=7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., release=553, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, RELEASE=main, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 23 02:41:56 localhost systemd[1]: Started libpod-conmon-7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46.scope. Nov 23 02:41:56 localhost systemd[1]: Started libcrun container. 
Nov 23 02:41:56 localhost podman[31449]: 2025-11-23 07:41:56.160297308 +0000 UTC m=+0.111544717 container init 7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_mcclintock, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , distribution-scope=public, name=rhceph, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 02:41:56 localhost upbeat_mcclintock[31465]: 167 167 Nov 23 02:41:56 localhost systemd[1]: libpod-7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46.scope: Deactivated successfully. 
Nov 23 02:41:56 localhost podman[31449]: 2025-11-23 07:41:56.1831729 +0000 UTC m=+0.134420309 container start 7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_mcclintock, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph) Nov 23 02:41:56 localhost podman[31449]: 2025-11-23 07:41:56.183379417 +0000 UTC m=+0.134626846 container attach 7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_mcclintock, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., release=553, 
io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.buildah.version=1.33.12, GIT_BRANCH=main, architecture=x86_64, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 02:41:56 localhost podman[31449]: 2025-11-23 07:41:56.18435981 +0000 UTC m=+0.135607219 container died 7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_mcclintock, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_CLEAN=True, vendor=Red Hat, Inc., ceph=True, name=rhceph, vcs-type=git, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, maintainer=Guillaume Abrioux ) Nov 23 02:41:56 localhost podman[31449]: 2025-11-23 07:41:56.087930574 +0000 UTC m=+0.039178043 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 02:41:56 
localhost podman[31470]: 2025-11-23 07:41:56.21661768 +0000 UTC m=+0.044448702 container remove 7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_mcclintock, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.buildah.version=1.33.12, ceph=True, description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=553) Nov 23 02:41:56 localhost systemd[1]: libpod-conmon-7dd0ccaeb9580b1529feb28d1a7e99dd4c453e71bc01b7ed2a9ef55c48651c46.scope: Deactivated successfully. 
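The short-lived containers above (`lucid_fermi`, `upbeat_mcclintock`) each step through the same podman event sequence: create, init, start, attach, died, remove — while the `image pull` event is flushed out of timestamp order, as the entries above show. A small sketch (an illustration of the ordering seen in this log, not a podman API) that checks a stream of event names follows that lifecycle:

```python
# Lifecycle event names as they appear in the podman journal entries above.
LIFECYCLE = ["create", "init", "start", "attach", "died", "remove"]

def follows_lifecycle(events):
    """True if the lifecycle events appear in order.

    Events outside LIFECYCLE (e.g. "image pull", which the journal above
    shows logged after "died" despite an earlier timestamp) are ignored.
    """
    ranks = [LIFECYCLE.index(e) for e in events if e in LIFECYCLE]
    return ranks == sorted(ranks)

# upbeat_mcclintock's events in the order they were journaled:
observed = ["create", "init", "start", "attach", "died", "image pull", "remove"]
print(follows_lifecycle(observed))  # prints True
```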
Nov 23 02:41:56 localhost podman[31498]: Nov 23 02:41:56 localhost podman[31498]: 2025-11-23 07:41:56.495155595 +0000 UTC m=+0.064406296 container create 61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, distribution-scope=public, version=7, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, com.redhat.component=rhceph-container, release=553, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 23 02:41:56 localhost systemd[1]: Started libpod-conmon-61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad.scope. Nov 23 02:41:56 localhost systemd[1]: Started libcrun container. 
Nov 23 02:41:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34590c2cf2867a858ee34e44200efd4e66f1cde22115a949d1f402a35a43d309/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34590c2cf2867a858ee34e44200efd4e66f1cde22115a949d1f402a35a43d309/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34590c2cf2867a858ee34e44200efd4e66f1cde22115a949d1f402a35a43d309/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:56 localhost podman[31498]: 2025-11-23 07:41:56.472349175 +0000 UTC m=+0.041599886 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 02:41:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34590c2cf2867a858ee34e44200efd4e66f1cde22115a949d1f402a35a43d309/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34590c2cf2867a858ee34e44200efd4e66f1cde22115a949d1f402a35a43d309/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:56 localhost podman[31498]: 2025-11-23 07:41:56.595497523 +0000 UTC m=+0.164748204 container init 61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test, io.buildah.version=1.33.12, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=553, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, name=rhceph, maintainer=Guillaume Abrioux , GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55) Nov 23 02:41:56 localhost podman[31498]: 2025-11-23 07:41:56.60284051 +0000 UTC m=+0.172091181 container start 61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , release=553, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
distribution-scope=public, GIT_BRANCH=main) Nov 23 02:41:56 localhost podman[31498]: 2025-11-23 07:41:56.603061579 +0000 UTC m=+0.172312270 container attach 61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., version=7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, release=553, distribution-scope=public, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True) Nov 23 02:41:56 localhost systemd[1]: var-lib-containers-storage-overlay-bf30e6521451ffeeba177399b7238bbe7bc12f36d09f657e456eb1b93cf13071-merged.mount: Deactivated successfully. 
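The kernel's "supports timestamps until 2038 (0x7fffffff)" messages above refer to the largest signed 32-bit Unix timestamp — the classic y2038 limit for filesystems using 32-bit epoch seconds. A quick check of what that hex value decodes to:

```python
from datetime import datetime, timezone

# 0x7fffffff = 2**31 - 1, the largest signed 32-bit epoch second;
# this is the limit the xfs remount messages above are warning about.
limit = datetime.fromtimestamp(0x7FFFFFFF, tz=timezone.utc)
print(limit.isoformat())  # 2038-01-19T03:14:07+00:00
```

These messages are informational here; xfs with bigtime-enabled superblocks extends the range well past 2038.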
Nov 23 02:41:56 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test[31514]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID] Nov 23 02:41:56 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test[31514]: [--no-systemd] [--no-tmpfs] Nov 23 02:41:56 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test[31514]: ceph-volume activate: error: unrecognized arguments: --bad-option Nov 23 02:41:56 localhost systemd[1]: libpod-61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad.scope: Deactivated successfully. Nov 23 02:41:56 localhost podman[31498]: 2025-11-23 07:41:56.852259473 +0000 UTC m=+0.421510154 container died 61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, release=553, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, ceph=True, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 02:41:56 localhost systemd[1]: 
var-lib-containers-storage-overlay-34590c2cf2867a858ee34e44200efd4e66f1cde22115a949d1f402a35a43d309-merged.mount: Deactivated successfully. Nov 23 02:41:56 localhost systemd-journald[618]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. Nov 23 02:41:56 localhost systemd-journald[618]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 23 02:41:56 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 02:41:56 localhost podman[31519]: 2025-11-23 07:41:56.947218599 +0000 UTC m=+0.082112133 container remove 61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate-test, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., version=7, RELEASE=main, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12) Nov 23 02:41:56 
localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 02:41:56 localhost systemd[1]: libpod-conmon-61f80cc13180575617f33e080e60eabb797fe46b116a4152cfadb0a6584a77ad.scope: Deactivated successfully. Nov 23 02:41:57 localhost systemd[1]: Reloading. Nov 23 02:41:57 localhost systemd-rc-local-generator[31572]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:41:57 localhost systemd-sysv-generator[31577]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:41:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:41:57 localhost systemd[1]: Reloading. Nov 23 02:41:57 localhost systemd-sysv-generator[31622]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:41:57 localhost systemd-rc-local-generator[31616]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:41:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:41:57 localhost systemd[1]: Starting Ceph osd.0 for 46550e70-79cb-5f55-bf6d-1204b97e083b... 
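Earlier in the log, the `osd-0-activate-test` container exits after `ceph-volume activate` rejects `--bad-option` — a deliberate probe, judging by the container name. The "unrecognized arguments" wording and the printed usage line match Python's argparse; a minimal sketch (an illustrative stand-in built from the logged usage line, not ceph-volume's actual parser) reproduces the same failure mode:

```python
import argparse

# Options copied from the usage line the activate-test container logged;
# hypothetical stand-in parser for illustration only.
parser = argparse.ArgumentParser(prog="ceph-volume activate")
parser.add_argument("--osd-id")
parser.add_argument("--osd-uuid")
parser.add_argument("--no-systemd", action="store_true")
parser.add_argument("--no-tmpfs", action="store_true")

try:
    parser.parse_args(["--bad-option"])
except SystemExit as e:
    # argparse prints "error: unrecognized arguments: --bad-option"
    # to stderr and exits with status 2, which ends the container.
    print("exit status:", e.code)  # exit status: 2
```

The nonzero exit is why systemd immediately logs the container scope as deactivated and podman records the `died` event.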
Nov 23 02:41:58 localhost podman[31681]: Nov 23 02:41:58 localhost podman[31681]: 2025-11-23 07:41:58.112915371 +0000 UTC m=+0.074967232 container create af32320bbf72fbf58a53b35a7fae3c220e5e82a54a450bb93692afd3bdd7e709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate, version=7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_BRANCH=main, architecture=x86_64, build-date=2025-09-24T08:57:55, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 02:41:58 localhost systemd[1]: Started libcrun container. 
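The systemd warning logged above for `insights-client-boot.service` ("Unit uses MemoryLimit=; please use MemoryMax= instead") is a deprecation of the cgroup v1 directive. A drop-in sketch of the fix — the `1G` value is illustrative only; the real limit stays whatever line 24 of the shipped unit sets:

```ini
# /etc/systemd/system/insights-client-boot.service.d/memory.conf
# Drop-in clearing the deprecated directive and setting the cgroup v2
# equivalent. Value shown is a placeholder, not taken from this host.
[Service]
MemoryLimit=
MemoryMax=1G
```

After adding a drop-in, `systemctl daemon-reload` (the "Reloading." entries above show such a reload) picks up the override.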
Nov 23 02:41:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd1cb773babf13196422b20f037a81784e6431d31bc58e885fd76f4d632cc40e/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd1cb773babf13196422b20f037a81784e6431d31bc58e885fd76f4d632cc40e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:58 localhost podman[31681]: 2025-11-23 07:41:58.082509405 +0000 UTC m=+0.044561296 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 02:41:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd1cb773babf13196422b20f037a81784e6431d31bc58e885fd76f4d632cc40e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd1cb773babf13196422b20f037a81784e6431d31bc58e885fd76f4d632cc40e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd1cb773babf13196422b20f037a81784e6431d31bc58e885fd76f4d632cc40e/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff) Nov 23 02:41:58 localhost podman[31681]: 2025-11-23 07:41:58.209611376 +0000 UTC m=+0.171663247 container init af32320bbf72fbf58a53b35a7fae3c220e5e82a54a450bb93692afd3bdd7e709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate, io.buildah.version=1.33.12, vendor=Red Hat, Inc., release=553, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, 
distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, name=rhceph, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 02:41:58 localhost systemd[1]: tmp-crun.3M9b1Q.mount: Deactivated successfully. Nov 23 02:41:58 localhost podman[31681]: 2025-11-23 07:41:58.221169706 +0000 UTC m=+0.183221567 container start af32320bbf72fbf58a53b35a7fae3c220e5e82a54a450bb93692afd3bdd7e709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate, GIT_CLEAN=True, com.redhat.component=rhceph-container, ceph=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.openshift.expose-services=) Nov 23 02:41:58 localhost podman[31681]: 2025-11-23 07:41:58.221639542 +0000 UTC m=+0.183691423 container attach af32320bbf72fbf58a53b35a7fae3c220e5e82a54a450bb93692afd3bdd7e709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate, io.openshift.expose-services=, release=553, io.openshift.tags=rhceph ceph, name=rhceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, maintainer=Guillaume Abrioux , architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, version=7) Nov 23 02:41:58 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate[31695]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Nov 23 02:41:58 localhost bash[31681]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Nov 23 02:41:58 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate[31695]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0 Nov 23 02:41:58 localhost 
bash[31681]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0 Nov 23 02:41:58 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate[31695]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0 Nov 23 02:41:58 localhost bash[31681]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0 Nov 23 02:41:58 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate[31695]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Nov 23 02:41:58 localhost bash[31681]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Nov 23 02:41:58 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate[31695]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block Nov 23 02:41:58 localhost bash[31681]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block Nov 23 02:41:58 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate[31695]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Nov 23 02:41:58 localhost bash[31681]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Nov 23 02:41:58 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate[31695]: --> ceph-volume raw activate successful for osd ID: 0 Nov 23 02:41:58 localhost bash[31681]: --> ceph-volume raw activate successful for osd ID: 0 Nov 23 02:41:58 localhost systemd[1]: libpod-af32320bbf72fbf58a53b35a7fae3c220e5e82a54a450bb93692afd3bdd7e709.scope: Deactivated successfully. 
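The `osd-0-activate` container above logs the exact command sequence `ceph-volume raw activate` runs: fix ownership of the OSD directory, prime it from the BlueStore device, fix device-node ownership, symlink the block device, then fix ownership again. A descriptive sketch assembling that sequence from the OSD id and device paths (a summary of what the log shows, not a reimplementation of ceph-volume):

```python
def activation_commands(osd_id, lv_device, dm_device):
    """The command sequence logged by ceph-volume raw activate for osd.0."""
    osd_dir = f"/var/lib/ceph/osd/ceph-{osd_id}"
    return [
        ["/usr/bin/chown", "-R", "ceph:ceph", osd_dir],
        ["/usr/bin/ceph-bluestore-tool", "prime-osd-dir",
         "--path", osd_dir, "--no-mon-config", "--dev", lv_device],
        ["/usr/bin/chown", "-h", "ceph:ceph", lv_device],
        ["/usr/bin/chown", "-R", "ceph:ceph", dm_device],
        ["/usr/bin/ln", "-s", lv_device, f"{osd_dir}/block"],
        ["/usr/bin/chown", "-R", "ceph:ceph", osd_dir],
    ]

cmds = activation_commands(0, "/dev/mapper/ceph_vg0-ceph_lv0", "/dev/dm-0")
for c in cmds:
    print(" ".join(c))
```

The final recursive `chown` repeats the first because `prime-osd-dir` and the `block` symlink create new entries under the OSD directory that must also end up owned by `ceph:ceph`.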
Nov 23 02:41:58 localhost podman[31681]: 2025-11-23 07:41:58.8612579 +0000 UTC m=+0.823309761 container died af32320bbf72fbf58a53b35a7fae3c220e5e82a54a450bb93692afd3bdd7e709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate, io.buildah.version=1.33.12, GIT_BRANCH=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, release=553, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, RELEASE=main, vcs-type=git)
Nov 23 02:41:58 localhost podman[31825]: 2025-11-23 07:41:58.949913253 +0000 UTC m=+0.079642859 container remove af32320bbf72fbf58a53b35a7fae3c220e5e82a54a450bb93692afd3bdd7e709 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0-activate, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.buildah.version=1.33.12, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_BRANCH=main, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, build-date=2025-09-24T08:57:55)
Nov 23 02:41:59 localhost systemd[1]: var-lib-containers-storage-overlay-dd1cb773babf13196422b20f037a81784e6431d31bc58e885fd76f4d632cc40e-merged.mount: Deactivated successfully.
Nov 23 02:41:59 localhost podman[31887]:
Nov 23 02:41:59 localhost podman[31887]: 2025-11-23 07:41:59.238367404 +0000 UTC m=+0.069143306 container create 50c985f9065dfe0d467655700b21e4bf1301c8ff3c74da944e20af1a8321d962 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0, GIT_BRANCH=main, name=rhceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, RELEASE=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, architecture=x86_64)
Nov 23 02:41:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92f9884fb100984ae55851757967c89c2119ec80cfd5b34bf7eb247f97e9c633/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 02:41:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92f9884fb100984ae55851757967c89c2119ec80cfd5b34bf7eb247f97e9c633/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 02:41:59 localhost podman[31887]: 2025-11-23 07:41:59.211175086 +0000 UTC m=+0.041951018 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 02:41:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92f9884fb100984ae55851757967c89c2119ec80cfd5b34bf7eb247f97e9c633/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 02:41:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92f9884fb100984ae55851757967c89c2119ec80cfd5b34bf7eb247f97e9c633/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 02:41:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92f9884fb100984ae55851757967c89c2119ec80cfd5b34bf7eb247f97e9c633/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Nov 23 02:41:59 localhost podman[31887]: 2025-11-23 07:41:59.352214158 +0000 UTC m=+0.182990070 container init 50c985f9065dfe0d467655700b21e4bf1301c8ff3c74da944e20af1a8321d962 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, CEPH_POINT_RELEASE=, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., RELEASE=main, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main)
Nov 23 02:41:59 localhost systemd[1]: tmp-crun.BqUQ9W.mount: Deactivated successfully.
Nov 23 02:41:59 localhost podman[31887]: 2025-11-23 07:41:59.364163041 +0000 UTC m=+0.194938923 container start 50c985f9065dfe0d467655700b21e4bf1301c8ff3c74da944e20af1a8321d962 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0, CEPH_POINT_RELEASE=, release=553, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, name=rhceph, io.buildah.version=1.33.12, architecture=x86_64, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux )
Nov 23 02:41:59 localhost bash[31887]: 50c985f9065dfe0d467655700b21e4bf1301c8ff3c74da944e20af1a8321d962
Nov 23 02:41:59 localhost systemd[1]: Started Ceph osd.0 for 46550e70-79cb-5f55-bf6d-1204b97e083b.
Nov 23 02:41:59 localhost ceph-osd[31905]: set uid:gid to 167:167 (ceph:ceph)
Nov 23 02:41:59 localhost ceph-osd[31905]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Nov 23 02:41:59 localhost ceph-osd[31905]: pidfile_write: ignore empty --pid-file
Nov 23 02:41:59 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 23 02:41:59 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 23 02:41:59 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 02:41:59 localhost ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 02:41:59 localhost ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 23 02:41:59 localhost ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 23 02:41:59 localhost ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 02:41:59 localhost ceph-osd[31905]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Nov 23 02:41:59 localhost ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) close
Nov 23 02:41:59 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) close
Nov 23 02:41:59 localhost ceph-osd[31905]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Nov 23 02:41:59 localhost ceph-osd[31905]: load: jerasure load: lrc
Nov 23 02:41:59 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 23 02:41:59 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 23 02:41:59 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 02:41:59 localhost ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 02:41:59 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) close
Nov 23 02:42:00 localhost podman[31998]:
Nov 23 02:42:00 localhost podman[31998]: 2025-11-23 07:42:00.147153861 +0000 UTC m=+0.069340473 container create ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_newton, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, release=553, vendor=Red Hat, Inc., io.buildah.version=1.33.12, RELEASE=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 02:42:00 localhost systemd[1]: Started libpod-conmon-ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25.scope.
Nov 23 02:42:00 localhost systemd[1]: Started libcrun container.
Nov 23 02:42:00 localhost podman[31998]: 2025-11-23 07:42:00.118973099 +0000 UTC m=+0.041159711 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 02:42:00 localhost podman[31998]: 2025-11-23 07:42:00.219359818 +0000 UTC m=+0.141546430 container init ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_newton, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.33.12, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True)
Nov 23 02:42:00 localhost podman[31998]: 2025-11-23 07:42:00.229577283 +0000 UTC m=+0.151763885 container start ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_newton, GIT_BRANCH=main, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., version=7, RELEASE=main, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , GIT_CLEAN=True, name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public)
Nov 23 02:42:00 localhost podman[31998]: 2025-11-23 07:42:00.22976758 +0000 UTC m=+0.151954222 container attach ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_newton, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph)
Nov 23 02:42:00 localhost confident_newton[32013]: 167 167
Nov 23 02:42:00 localhost systemd[1]: libpod-ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25.scope: Deactivated successfully.
Nov 23 02:42:00 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 23 02:42:00 localhost podman[31998]: 2025-11-23 07:42:00.235676289 +0000 UTC m=+0.157862951 container died ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_newton, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, release=553, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, version=7, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 02:42:00 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 23 02:42:00 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 02:42:00 localhost ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 02:42:00 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) close
Nov 23 02:42:00 localhost podman[32018]: 2025-11-23 07:42:00.32276448 +0000 UTC m=+0.078609756 container remove ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_newton, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , vcs-type=git, vendor=Red Hat, Inc.)
Nov 23 02:42:00 localhost systemd[1]: libpod-conmon-ad8fbe5dc2c9e4f77d9263bbcc56445b7e8999f120f8f58be5bc52cfed152c25.scope: Deactivated successfully.
Nov 23 02:42:00 localhost ceph-osd[31905]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Nov 23 02:42:00 localhost ceph-osd[31905]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Nov 23 02:42:00 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 23 02:42:00 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 23 02:42:00 localhost ceph-osd[31905]: bdev(0x55d7b814ee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 02:42:00 localhost ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 02:42:00 localhost ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Nov 23 02:42:00 localhost ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Nov 23 02:42:00 localhost ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 02:42:00 localhost ceph-osd[31905]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Nov 23 02:42:00 localhost ceph-osd[31905]: bluefs mount
Nov 23 02:42:00 localhost ceph-osd[31905]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 23 02:42:00 localhost ceph-osd[31905]: bluefs mount shared_bdev_used = 0
Nov 23 02:42:00 localhost ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: RocksDB version: 7.9.2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Git sha 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: DB SUMMARY
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: DB Session ID: F6DONMZYLUSPVQLEPLPJ
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: CURRENT file: CURRENT
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: IDENTITY file: IDENTITY
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: SST files in db.slow dir, Total Num: 0, files:
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ;
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.error_if_exists: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.create_if_missing: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_checks: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.flush_verify_memtable_count: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.env: 0x55d7b83e2cb0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.fs: LegacyFileSystem
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.info_log: 0x55d7b90c85a0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_file_opening_threads: 16
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.statistics: (nil)
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.use_fsync: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_log_file_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_manifest_file_size: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.log_file_time_to_roll: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.keep_log_file_num: 1000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.recycle_log_file_num: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.allow_fallocate: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.allow_mmap_reads: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.allow_mmap_writes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.use_direct_reads: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.create_missing_column_families: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.db_log_dir:
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.wal_dir: db.wal
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_cache_numshardbits: 6
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.WAL_ttl_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.WAL_size_limit_MB: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.manifest_preallocation_size: 4194304
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.is_fd_close_on_exec: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.advise_random_on_open: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.db_write_buffer_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_manager: 0x55d7b8138140
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.access_hint_on_compaction_start: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.random_access_max_buffer_size: 1048576
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.use_adaptive_mutex: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.rate_limiter: (nil)
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.wal_recovery_mode: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_thread_tracking: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_pipelined_write: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.unordered_write: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.allow_concurrent_memtable_write: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_thread_max_yield_usec: 100
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_thread_slow_yield_usec: 3
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.row_cache: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.wal_filter: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.avoid_flush_during_recovery: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.allow_ingest_behind: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.two_write_queues: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.manual_wal_flush: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.wal_compression: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.atomic_flush: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.persist_stats_to_disk: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_dbid_to_manifest: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.log_readahead_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.file_checksum_gen_factory: Unknown
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.best_efforts_recovery: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bgerror_resume_count: 2147483647
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.allow_data_in_errors: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.db_host_id: __hostname__
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enforce_single_del_contracts: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_background_jobs: 4
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_background_compactions: -1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_subcompactions: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.avoid_flush_during_shutdown: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.writable_file_max_buffer_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.delayed_write_rate : 16777216
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_total_wal_size: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.stats_dump_period_sec: 600
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.stats_persist_period_sec: 600
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.stats_history_buffer_size: 1048576
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_open_files: -1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bytes_per_sync: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.wal_bytes_per_sync: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.strict_bytes_per_sync: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_readahead_size: 2097152
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_background_flushes: -1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Compression algorithms supported:
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kZSTD supported: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kXpressCompression supported: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kBZip2Compression supported: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kLZ4Compression supported: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kZlibCompression supported: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kLZ4HCCompression supported: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kSnappyCompression supported: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8760)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b8126850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb:
Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:00 localhost ceph-osd[31905]: 
rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8760)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 
0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b8126850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:00 
localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8760)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b8126850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 
02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: 
rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8760)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b8126850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 
read_amp_bytes_per_bit: 0
 format_version: 5
 enable_index_compression: 1
 block_align: 0
 max_auto_readahead_size: 262144
 prepopulate_block_cache: 0
 initial_auto_readahead_size: 8192
 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8760)
 cache_index_and_filter_blocks: 1
 cache_index_and_filter_blocks_with_high_priority: 0
 pin_l0_filter_and_index_blocks_in_cache: 0
 pin_top_level_index_and_filter: 1
 index_type: 0
 data_block_index_type: 0
 index_shortening: 1
 data_block_hash_table_util_ratio: 0.750000
 checksum: 4
 no_block_cache: 0
 block_cache: 0x55d7b8126850
 block_cache_name: BinnedLRUCache
 block_cache_options:
 capacity : 483183820
 num_shard_bits : 4
 strict_capacity_limit : 0
 high_pri_pool_ratio: 0.000
 block_cache_compressed: (nil)
 persistent_cache: (nil)
 block_size: 4096
 block_size_deviation: 10
 block_restart_interval: 16
 index_block_restart_interval: 1
 metadata_block_size: 4096
 partition_filters: 0
 use_delta_encoding: 1
 filter_policy: bloomfilter
 whole_key_filtering: 1
 verify_compression: 0
 read_amp_bytes_per_bit: 0
 format_version: 5
 enable_index_compression: 1
 block_align: 0
 max_auto_readahead_size: 262144
 prepopulate_block_cache: 0
 initial_auto_readahead_size: 8192
 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8760)
 cache_index_and_filter_blocks: 1
 cache_index_and_filter_blocks_with_high_priority: 0
 pin_l0_filter_and_index_blocks_in_cache: 0
 pin_top_level_index_and_filter: 1
 index_type: 0
 data_block_index_type: 0
 index_shortening: 1
 data_block_hash_table_util_ratio: 0.750000
 checksum: 4
 no_block_cache: 0
 block_cache: 0x55d7b8126850
 block_cache_name: BinnedLRUCache
 block_cache_options:
 capacity : 483183820
 num_shard_bits : 4
 strict_capacity_limit : 0
 high_pri_pool_ratio: 0.000
 block_cache_compressed: (nil)
 persistent_cache: (nil)
 block_size: 4096
 block_size_deviation: 10
 block_restart_interval: 16
 index_block_restart_interval: 1
 metadata_block_size: 4096
 partition_filters: 0
 use_delta_encoding: 1
 filter_policy: bloomfilter
 whole_key_filtering: 1
 verify_compression: 0
 read_amp_bytes_per_bit: 0
 format_version: 5
 enable_index_compression: 1
 block_align: 0
 max_auto_readahead_size: 262144
 prepopulate_block_cache: 0
 initial_auto_readahead_size: 8192
 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8760)
 cache_index_and_filter_blocks: 1
 cache_index_and_filter_blocks_with_high_priority: 0
 pin_l0_filter_and_index_blocks_in_cache: 0
 pin_top_level_index_and_filter: 1
 index_type: 0
 data_block_index_type: 0
 index_shortening: 1
 data_block_hash_table_util_ratio: 0.750000
 checksum: 4
 no_block_cache: 0
 block_cache: 0x55d7b8126850
 block_cache_name: BinnedLRUCache
 block_cache_options:
 capacity : 483183820
 num_shard_bits : 4
 strict_capacity_limit : 0
 high_pri_pool_ratio: 0.000
 block_cache_compressed: (nil)
 persistent_cache: (nil)
 block_size: 4096
 block_size_deviation: 10
 block_restart_interval: 16
 index_block_restart_interval: 1
 metadata_block_size: 4096
 partition_filters: 0
 use_delta_encoding: 1
 filter_policy: bloomfilter
 whole_key_filtering: 1
 verify_compression: 0
 read_amp_bytes_per_bit: 0
 format_version: 5
 enable_index_compression: 1
 block_align: 0
 max_auto_readahead_size: 262144
 prepopulate_block_cache: 0
 initial_auto_readahead_size: 8192
 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov
23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory 
options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b81262d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: 
Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:00 localhost ceph-osd[31905]: 
rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: 
Options.bloom_locality: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for 
column family [O-1]: Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b81262d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: 
Options.write_buffer_size: 16777216 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 
32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 
02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: 
Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b90c8980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b81262d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 
strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: 
rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: 
Options.periodic_compaction_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2ad9ca8f-c9dd-48a4-9506-3c48dbbd434b Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883720541017, "job": 1, "event": "recovery_started", "wal_files": [31]} Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883720541236, "job": 1, "event": "recovery_finished"} Nov 23 02:42:00 localhost 
ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Nov 23 02:42:00 localhost ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025 Nov 23 02:42:00 localhost ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240 Nov 23 02:42:00 localhost ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3 Nov 23 02:42:00 localhost ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000 Nov 23 02:42:00 localhost ceph-osd[31905]: freelist init Nov 23 02:42:00 localhost ceph-osd[31905]: freelist _read_cfg Nov 23 02:42:00 localhost ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete Nov 23 02:42:00 localhost ceph-osd[31905]: bluefs umount Nov 23 02:42:00 localhost ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) close Nov 23 02:42:00 localhost podman[32243]: Nov 23 02:42:00 localhost podman[32243]: 2025-11-23 07:42:00.644438342 +0000 UTC m=+0.072018753 container create 3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, distribution-scope=public, release=553, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, RELEASE=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 02:42:00 localhost systemd[1]: Started libpod-conmon-3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656.scope. Nov 23 02:42:00 localhost systemd[1]: Started libcrun container. 
Nov 23 02:42:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05cef2f2a1031f49efcc570f62ec834403253b73265336b3f65d1aa8122dcc74/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:00 localhost podman[32243]: 2025-11-23 07:42:00.615733163 +0000 UTC m=+0.043313564 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 02:42:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05cef2f2a1031f49efcc570f62ec834403253b73265336b3f65d1aa8122dcc74/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05cef2f2a1031f49efcc570f62ec834403253b73265336b3f65d1aa8122dcc74/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05cef2f2a1031f49efcc570f62ec834403253b73265336b3f65d1aa8122dcc74/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05cef2f2a1031f49efcc570f62ec834403253b73265336b3f65d1aa8122dcc74/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:00 localhost podman[32243]: 2025-11-23 07:42:00.775429965 +0000 UTC m=+0.203010366 container init 3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test, vcs-type=git, vendor=Red Hat, Inc., version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, 
distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, ceph=True, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 02:42:00 localhost ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Nov 23 02:42:00 localhost ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Nov 23 02:42:00 localhost ceph-osd[31905]: bdev(0x55d7b814f180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 23 02:42:00 localhost ceph-osd[31905]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB Nov 23 02:42:00 localhost ceph-osd[31905]: bluefs mount Nov 23 02:42:00 localhost ceph-osd[31905]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Nov 23 02:42:00 localhost ceph-osd[31905]: bluefs mount shared_bdev_used = 4718592 Nov 23 02:42:00 localhost ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: RocksDB version: 7.9.2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Git sha 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Compile date 2025-09-23 00:00:00 Nov 23 02:42:00 localhost ceph-osd[31905]: 
rocksdb: DB SUMMARY Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: DB Session ID: F6DONMZYLUSPVQLEPLPI Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: CURRENT file: CURRENT Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: IDENTITY file: IDENTITY Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.error_if_exists: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.create_if_missing: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_checks: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.flush_verify_memtable_count: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.env: 0x55d7b8274460 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.fs: LegacyFileSystem Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.info_log: 0x55d7b914e100 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_file_opening_threads: 16 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.statistics: (nil) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.use_fsync: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_log_file_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_manifest_file_size: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.log_file_time_to_roll: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: 
Options.keep_log_file_num: 1000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.recycle_log_file_num: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.allow_fallocate: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.allow_mmap_reads: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.allow_mmap_writes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.use_direct_reads: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.create_missing_column_families: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.db_log_dir: Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.wal_dir: db.wal Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_cache_numshardbits: 6 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.WAL_ttl_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.WAL_size_limit_MB: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.manifest_preallocation_size: 4194304 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.is_fd_close_on_exec: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.advise_random_on_open: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.db_write_buffer_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_manager: 0x55d7b81395e0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.access_hint_on_compaction_start: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.random_access_max_buffer_size: 1048576 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.use_adaptive_mutex: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.rate_limiter: (nil) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: 
Options.sst_file_manager.rate_bytes_per_sec: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.wal_recovery_mode: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_thread_tracking: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_pipelined_write: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.unordered_write: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.allow_concurrent_memtable_write: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_thread_max_yield_usec: 100 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_thread_slow_yield_usec: 3 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.row_cache: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.wal_filter: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.avoid_flush_during_recovery: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.allow_ingest_behind: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.two_write_queues: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.manual_wal_flush: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.wal_compression: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.atomic_flush: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.persist_stats_to_disk: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_dbid_to_manifest: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.log_readahead_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.file_checksum_gen_factory: Unknown Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.best_efforts_recovery: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: 
Options.max_bgerror_resume_count: 2147483647 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.allow_data_in_errors: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.db_host_id: __hostname__ Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enforce_single_del_contracts: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_background_jobs: 4 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_background_compactions: -1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_subcompactions: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.avoid_flush_during_shutdown: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.writable_file_max_buffer_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.delayed_write_rate : 16777216 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_total_wal_size: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.stats_dump_period_sec: 600 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.stats_persist_period_sec: 600 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.stats_history_buffer_size: 1048576 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_open_files: -1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bytes_per_sync: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.wal_bytes_per_sync: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.strict_bytes_per_sync: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_readahead_size: 2097152 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_background_flushes: -1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Compression algorithms 
supported: Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kZSTD supported: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kXpressCompression supported: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kBZip2Compression supported: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kLZ4Compression supported: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kZlibCompression supported: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kLZ4HCCompression supported: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: #011kSnappyCompression supported: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Fast CRC32 supported: Supported on x86 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: DMutex implementation: pthread_mutex_t Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: 
table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e2e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b81262d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: 
Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:00 localhost ceph-osd[31905]: 
rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0 Nov 23 
02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:00 localhost podman[32243]: 2025-11-23 07:42:00.78950511 +0000 UTC m=+0.217085561 container start 3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 02:42:00 localhost podman[32243]: 2025-11-23 07:42:00.793226756 +0000 UTC m=+0.220807227 container attach 3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux )
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e2e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b81262d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e2e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b81262d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e2e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b81262d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e2e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b81262d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e2e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b81262d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e2e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b81262d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e440)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b8127610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:00 localhost
ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:00 localhost ceph-osd[31905]: 
rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: 
Options.bloom_locality: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for 
column family [O-1]: Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e440)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b8127610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: 
Options.write_buffer_size: 16777216 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 
32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 
02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: 
Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.merge_operator: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d7b914e440)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55d7b8127610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 
strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression: LZ4 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.num_levels: 7 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: 
rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: 
Options.periodic_compaction_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Nov 23 02:42:00 localhost 
ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2ad9ca8f-c9dd-48a4-9506-3c48dbbd434b Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883720807330, "job": 1, "event": "recovery_started", "wal_files": [31]} Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883720813538, "cf_name": "default", "job": 1, "event": "table_file_creation", 
"file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763883720, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2ad9ca8f-c9dd-48a4-9506-3c48dbbd434b", "db_session_id": "F6DONMZYLUSPVQLEPLPI", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883720818477, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, 
"num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763883720, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2ad9ca8f-c9dd-48a4-9506-3c48dbbd434b", "db_session_id": "F6DONMZYLUSPVQLEPLPI", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883720822806, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763883720, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2ad9ca8f-c9dd-48a4-9506-3c48dbbd434b", "db_session_id": "F6DONMZYLUSPVQLEPLPI", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883720826981, "job": 1, "event": "recovery_finished"} Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55d7b90cc700 Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: DB pointer 0x55d7b901fa00 Nov 23 02:42:00 localhost ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Nov 23 02:42:00 localhost ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4 Nov 23 02:42:00 localhost ceph-osd[31905]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 02:42:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 0.1 total, 0.1 interval
Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0
 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0

** Compaction Stats [default] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-0] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-1] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 460.80 MB usag Nov 23 02:42:00 localhost ceph-osd[31905]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs Nov 23 02:42:00 localhost ceph-osd[31905]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello Nov 23 02:42:00 localhost ceph-osd[31905]: _get_class not permitted to load lua Nov 23 02:42:00 localhost ceph-osd[31905]: _get_class not permitted to load sdk Nov 23 02:42:00 localhost ceph-osd[31905]: _get_class not permitted to load test_remote_reads Nov 23 02:42:00 localhost ceph-osd[31905]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients Nov 23
02:42:00 localhost ceph-osd[31905]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons Nov 23 02:42:00 localhost ceph-osd[31905]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds Nov 23 02:42:00 localhost ceph-osd[31905]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature Nov 23 02:42:00 localhost ceph-osd[31905]: osd.0 0 load_pgs Nov 23 02:42:00 localhost ceph-osd[31905]: osd.0 0 load_pgs opened 0 pgs Nov 23 02:42:00 localhost ceph-osd[31905]: osd.0 0 log_to_monitors true Nov 23 02:42:00 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0[31901]: 2025-11-23T07:42:00.866+0000 7ff7bc464a80 -1 osd.0 0 log_to_monitors true Nov 23 02:42:00 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test[32260]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID] Nov 23 02:42:00 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test[32260]: [--no-systemd] [--no-tmpfs] Nov 23 02:42:00 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test[32260]: ceph-volume activate: error: unrecognized arguments: --bad-option Nov 23 02:42:01 localhost systemd[1]: libpod-3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656.scope: Deactivated successfully. 
Nov 23 02:42:01 localhost podman[32243]: 2025-11-23 07:42:01.013801714 +0000 UTC m=+0.441382135 container died 3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test, release=553, CEPH_POINT_RELEASE=, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vcs-type=git, name=rhceph, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux , io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph) Nov 23 02:42:01 localhost podman[32480]: 2025-11-23 07:42:01.088782096 +0000 UTC m=+0.066499197 container remove 3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate-test, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, version=7, 
distribution-scope=public, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=) Nov 23 02:42:01 localhost systemd[1]: libpod-conmon-3df30d7380796e84550032e97009fb9014cad7c205af477fe086a0589e793656.scope: Deactivated successfully. Nov 23 02:42:01 localhost systemd[1]: var-lib-containers-storage-overlay-3c0e124341823678b30f6dad76696bef77ca5ee037913337adc23f1cc972227e-merged.mount: Deactivated successfully. Nov 23 02:42:01 localhost systemd[1]: Reloading. Nov 23 02:42:01 localhost systemd-rc-local-generator[32533]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:42:01 localhost systemd-sysv-generator[32539]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:42:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:42:01 localhost systemd[1]: Reloading. Nov 23 02:42:01 localhost systemd-rc-local-generator[32578]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:42:01 localhost systemd-sysv-generator[32583]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:42:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:42:01 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : purged_snaps scrub starts Nov 23 02:42:01 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : purged_snaps scrub ok Nov 23 02:42:01 localhost systemd[1]: Starting Ceph osd.3 for 46550e70-79cb-5f55-bf6d-1204b97e083b... Nov 23 02:42:02 localhost podman[32642]: Nov 23 02:42:02 localhost podman[32642]: 2025-11-23 07:42:02.16007229 +0000 UTC m=+0.078013646 container create 2a8b92b9ebe33dde4002a6e5d0ad2ecdc193784d76636d1895e7fc6c306bb04e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.openshift.expose-services=, distribution-scope=public, release=553, architecture=x86_64) Nov 23 02:42:02 localhost systemd[1]: 
tmp-crun.h9V2kN.mount: Deactivated successfully. Nov 23 02:42:02 localhost systemd[1]: Started libcrun container. Nov 23 02:42:02 localhost podman[32642]: 2025-11-23 07:42:02.125499443 +0000 UTC m=+0.043440799 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 02:42:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf261c130a6f49190524f3c3004c5beda5c57c727160f5a5cdb0ef15bf3ee249/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf261c130a6f49190524f3c3004c5beda5c57c727160f5a5cdb0ef15bf3ee249/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf261c130a6f49190524f3c3004c5beda5c57c727160f5a5cdb0ef15bf3ee249/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf261c130a6f49190524f3c3004c5beda5c57c727160f5a5cdb0ef15bf3ee249/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf261c130a6f49190524f3c3004c5beda5c57c727160f5a5cdb0ef15bf3ee249/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:02 localhost podman[32642]: 2025-11-23 07:42:02.265979276 +0000 UTC m=+0.183920602 container init 2a8b92b9ebe33dde4002a6e5d0ad2ecdc193784d76636d1895e7fc6c306bb04e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat 
Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, vcs-type=git, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, build-date=2025-09-24T08:57:55) Nov 23 02:42:02 localhost podman[32642]: 2025-11-23 07:42:02.273104567 +0000 UTC m=+0.191045923 container start 2a8b92b9ebe33dde4002a6e5d0ad2ecdc193784d76636d1895e7fc6c306bb04e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.component=rhceph-container, 
RELEASE=main, version=7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux ) Nov 23 02:42:02 localhost podman[32642]: 2025-11-23 07:42:02.273448749 +0000 UTC m=+0.191390095 container attach 2a8b92b9ebe33dde4002a6e5d0ad2ecdc193784d76636d1895e7fc6c306bb04e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate, GIT_BRANCH=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, release=553, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, architecture=x86_64, maintainer=Guillaume Abrioux , distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 02:42:02 localhost ceph-osd[31905]: osd.0 0 done with init, starting boot process Nov 23 02:42:02 localhost ceph-osd[31905]: osd.0 0 start_boot Nov 23 02:42:02 localhost ceph-osd[31905]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1 Nov 23 02:42:02 localhost ceph-osd[31905]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0 Nov 23 02:42:02 localhost ceph-osd[31905]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3 Nov 23 02:42:02 localhost ceph-osd[31905]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set 
to 10 Nov 23 02:42:02 localhost ceph-osd[31905]: osd.0 0 bench count 12288000 bsize 4 KiB Nov 23 02:42:02 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate[32656]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Nov 23 02:42:02 localhost bash[32642]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Nov 23 02:42:02 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate[32656]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-3 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1 Nov 23 02:42:02 localhost bash[32642]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-3 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1 Nov 23 02:42:02 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate[32656]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1 Nov 23 02:42:02 localhost bash[32642]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1 Nov 23 02:42:02 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate[32656]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Nov 23 02:42:02 localhost bash[32642]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Nov 23 02:42:02 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate[32656]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-3/block Nov 23 02:42:02 localhost bash[32642]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-3/block Nov 23 02:42:02 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate[32656]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Nov 23 02:42:02 localhost bash[32642]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Nov 23 02:42:02 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate[32656]: --> ceph-volume raw 
activate successful for osd ID: 3 Nov 23 02:42:02 localhost bash[32642]: --> ceph-volume raw activate successful for osd ID: 3 Nov 23 02:42:02 localhost systemd[1]: libpod-2a8b92b9ebe33dde4002a6e5d0ad2ecdc193784d76636d1895e7fc6c306bb04e.scope: Deactivated successfully. Nov 23 02:42:02 localhost podman[32642]: 2025-11-23 07:42:02.953298044 +0000 UTC m=+0.871239430 container died 2a8b92b9ebe33dde4002a6e5d0ad2ecdc193784d76636d1895e7fc6c306bb04e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , release=553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 02:42:03 localhost podman[32777]: 2025-11-23 07:42:03.052269617 +0000 UTC m=+0.088500200 container remove 2a8b92b9ebe33dde4002a6e5d0ad2ecdc193784d76636d1895e7fc6c306bb04e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3-activate, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, maintainer=Guillaume Abrioux , 
com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.33.12, GIT_CLEAN=True, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=) Nov 23 02:42:03 localhost systemd[1]: var-lib-containers-storage-overlay-cf261c130a6f49190524f3c3004c5beda5c57c727160f5a5cdb0ef15bf3ee249-merged.mount: Deactivated successfully. 
Nov 23 02:42:03 localhost podman[32840]:
Nov 23 02:42:03 localhost podman[32840]: 2025-11-23 07:42:03.350125404 +0000 UTC m=+0.083171680 container create 7fdd620b28c7bda7f0d1915915aecb2acd8771c11fc19712159db34d061a4918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3, ceph=True, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux , release=553, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, version=7, io.openshift.expose-services=, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, architecture=x86_64)
Nov 23 02:42:03 localhost podman[32840]: 2025-11-23 07:42:03.317651887 +0000 UTC m=+0.050698183 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 02:42:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1375b762a0ad4c4f0bb551725c69c2ecb56d6c01a460ffc59f02bed82ff8e3d0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 02:42:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1375b762a0ad4c4f0bb551725c69c2ecb56d6c01a460ffc59f02bed82ff8e3d0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 02:42:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1375b762a0ad4c4f0bb551725c69c2ecb56d6c01a460ffc59f02bed82ff8e3d0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 02:42:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1375b762a0ad4c4f0bb551725c69c2ecb56d6c01a460ffc59f02bed82ff8e3d0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 02:42:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1375b762a0ad4c4f0bb551725c69c2ecb56d6c01a460ffc59f02bed82ff8e3d0/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff)
Nov 23 02:42:03 localhost podman[32840]: 2025-11-23 07:42:03.492837053 +0000 UTC m=+0.225883329 container init 7fdd620b28c7bda7f0d1915915aecb2acd8771c11fc19712159db34d061a4918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, version=7, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 02:42:03 localhost podman[32840]: 2025-11-23 07:42:03.501376641 +0000 UTC m=+0.234422907 container start 7fdd620b28c7bda7f0d1915915aecb2acd8771c11fc19712159db34d061a4918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_BRANCH=main, RELEASE=main, release=553, vcs-type=git, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, com.redhat.component=rhceph-container, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 02:42:03 localhost bash[32840]: 7fdd620b28c7bda7f0d1915915aecb2acd8771c11fc19712159db34d061a4918
Nov 23 02:42:03 localhost systemd[1]: Started Ceph osd.3 for 46550e70-79cb-5f55-bf6d-1204b97e083b.
Nov 23 02:42:03 localhost ceph-osd[32858]: set uid:gid to 167:167 (ceph:ceph)
Nov 23 02:42:03 localhost ceph-osd[32858]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Nov 23 02:42:03 localhost ceph-osd[32858]: pidfile_write: ignore empty --pid-file
Nov 23 02:42:03 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Nov 23 02:42:03 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Nov 23 02:42:03 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 02:42:03 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 02:42:03 localhost ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Nov 23 02:42:03 localhost ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Nov 23 02:42:03 localhost ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 02:42:03 localhost ceph-osd[32858]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Nov 23 02:42:03 localhost ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) close
Nov 23 02:42:03 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) close
Nov 23 02:42:04 localhost ceph-osd[32858]: starting osd.3 osd_data /var/lib/ceph/osd/ceph-3 /var/lib/ceph/osd/ceph-3/journal
Nov 23 02:42:04 localhost ceph-osd[32858]: load: jerasure load: lrc
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 02:42:04 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) close
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 02:42:04 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) close
Nov 23 02:42:04 localhost podman[32955]:
Nov 23 02:42:04 localhost podman[32955]: 2025-11-23 07:42:04.249994629 +0000 UTC m=+0.066934581 container create 3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_darwin, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vcs-type=git, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, release=553, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, version=7, GIT_BRANCH=main, architecture=x86_64, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc.)
Nov 23 02:42:04 localhost systemd[1]: Started libpod-conmon-3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0.scope.
Nov 23 02:42:04 localhost podman[32955]: 2025-11-23 07:42:04.213549519 +0000 UTC m=+0.030489521 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 02:42:04 localhost systemd[1]: Started libcrun container.
Nov 23 02:42:04 localhost podman[32955]: 2025-11-23 07:42:04.330658803 +0000 UTC m=+0.147598765 container init 3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_darwin, version=7, GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, name=rhceph, release=553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , RELEASE=main, distribution-scope=public, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 02:42:04 localhost heuristic_darwin[32971]: 167 167
Nov 23 02:42:04 localhost podman[32955]: 2025-11-23 07:42:04.342171642 +0000 UTC m=+0.159111604 container start 3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_darwin, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, maintainer=Guillaume Abrioux , RELEASE=main, distribution-scope=public, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 02:42:04 localhost systemd[1]: libpod-3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0.scope: Deactivated successfully.
Nov 23 02:42:04 localhost podman[32955]: 2025-11-23 07:42:04.342375189 +0000 UTC m=+0.159315151 container attach 3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_darwin, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, vcs-type=git, name=rhceph, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, architecture=x86_64, RELEASE=main, version=7, release=553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 02:42:04 localhost podman[32955]: 2025-11-23 07:42:04.343501177 +0000 UTC m=+0.160441189 container died 3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_darwin, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main)
Nov 23 02:42:04 localhost ceph-osd[32858]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Nov 23 02:42:04 localhost ceph-osd[32858]: osd.3:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e26e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 02:42:04 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 02:42:04 localhost ceph-osd[32858]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Nov 23 02:42:04 localhost ceph-osd[32858]: bluefs mount
Nov 23 02:42:04 localhost ceph-osd[32858]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 23 02:42:04 localhost ceph-osd[32858]: bluefs mount shared_bdev_used = 0
Nov 23 02:42:04 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: RocksDB version: 7.9.2
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Git sha 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: DB SUMMARY
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: DB Session ID: F8LN5U3KUYCOQ4JP56ZW
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: CURRENT file: CURRENT
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: IDENTITY file: IDENTITY
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: SST files in db.slow dir, Total Num: 0, files:
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ;
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.error_if_exists: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.create_if_missing: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_checks: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.flush_verify_memtable_count: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.env: 0x56377a0bacb0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.fs: LegacyFileSystem
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.info_log: 0x56377ada0300
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_file_opening_threads: 16
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.statistics: (nil)
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.use_fsync: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_log_file_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_manifest_file_size: 1073741824
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.log_file_time_to_roll: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.keep_log_file_num: 1000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.recycle_log_file_num: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.allow_fallocate: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.allow_mmap_reads: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.allow_mmap_writes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.use_direct_reads: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.create_missing_column_families: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.db_log_dir:
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.wal_dir: db.wal
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_cache_numshardbits: 6
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.WAL_ttl_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.WAL_size_limit_MB: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.manifest_preallocation_size: 4194304
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.is_fd_close_on_exec: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.advise_random_on_open: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.db_write_buffer_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_manager: 0x563779e10140
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.access_hint_on_compaction_start: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.random_access_max_buffer_size: 1048576
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.use_adaptive_mutex: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.rate_limiter: (nil)
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.wal_recovery_mode: 2
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_thread_tracking: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_pipelined_write: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.unordered_write: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.allow_concurrent_memtable_write: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_thread_max_yield_usec: 100
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_thread_slow_yield_usec: 3
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.row_cache: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.wal_filter: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.avoid_flush_during_recovery: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.allow_ingest_behind: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.two_write_queues: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.manual_wal_flush: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.wal_compression: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.atomic_flush: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.persist_stats_to_disk: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_dbid_to_manifest: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.log_readahead_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.file_checksum_gen_factory: Unknown
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.best_efforts_recovery: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bgerror_resume_count: 2147483647
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.allow_data_in_errors: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.db_host_id: __hostname__
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enforce_single_del_contracts: true
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_background_jobs: 4
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_background_compactions: -1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_subcompactions: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.avoid_flush_during_shutdown: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.writable_file_max_buffer_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.delayed_write_rate : 16777216
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_total_wal_size: 1073741824
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.stats_dump_period_sec: 600
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.stats_persist_period_sec: 600
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.stats_history_buffer_size: 1048576
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_open_files: -1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bytes_per_sync: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.wal_bytes_per_sync: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.strict_bytes_per_sync: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_readahead_size: 2097152
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_background_flushes: -1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Compression algorithms supported:
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kZSTD supported: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kXpressCompression supported: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kBZip2Compression supported: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kLZ4Compression supported: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kZlibCompression supported: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kLZ4HCCompression supported: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kSnappyCompression supported: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada04c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dfe850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors:
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:04 localhost
ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada04c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dfe850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: 
rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada04c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dfe850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 
filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada04c0)#012 cache_index_and_filter_blocks: 1#012 
cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dfe850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada04c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dfe850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:04 localhost systemd[1]: var-lib-containers-storage-overlay-d253d8ba80c0c9cde042619afdb1e293080e44cf6cffbd41363a62af66af008d-merged.mount: Deactivated successfully.
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada04c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dfe850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada04c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dfe850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada06e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 
0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dfe2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: 
rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 
localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None Nov 23 
02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada06e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dfe2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 
02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 
02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada06e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dfe2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 
use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false 
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
[db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: f0255131-8bda-4abb-b8a5-2cf651f3fb8a Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883724420048, "job": 1, "event": "recovery_started", "wal_files": [31]} Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883724420338, "job": 1, "event": "recovery_finished"} Nov 23 02:42:04 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _open_db opened rocksdb path db options 
compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Nov 23 02:42:04 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta old nid_max 1025 Nov 23 02:42:04 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta old blobid_max 10240 Nov 23 02:42:04 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta ondisk_format 4 compat_ondisk_format 3 Nov 23 02:42:04 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta min_alloc_size 0x1000 Nov 23 02:42:04 localhost ceph-osd[32858]: freelist init Nov 23 02:42:04 localhost ceph-osd[32858]: freelist _read_cfg Nov 23 02:42:04 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete Nov 23 02:42:04 localhost ceph-osd[32858]: bluefs umount Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) close Nov 23 02:42:04 localhost podman[32977]: 2025-11-23 07:42:04.439926053 +0000 UTC m=+0.082817108 container remove 3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_darwin, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, release=553, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux ) Nov 23 02:42:04 localhost systemd[1]: libpod-conmon-3f6bcbf59f2c2d547e0407d2e5adda819f25bcf6af4d2b4898cfb5aa4a8b6ee0.scope: Deactivated successfully. 
Nov 23 02:42:04 localhost podman[33191]: Nov 23 02:42:04 localhost podman[33191]: 2025-11-23 07:42:04.620891354 +0000 UTC m=+0.055322089 container create 7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_haibt, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-09-24T08:57:55, version=7, CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True) Nov 23 02:42:04 localhost systemd[1]: Started libpod-conmon-7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2.scope. Nov 23 02:42:04 localhost systemd[1]: Started libcrun container. 
Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument Nov 23 02:42:04 localhost ceph-osd[32858]: bdev(0x563779e27180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 23 02:42:04 localhost ceph-osd[32858]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB Nov 23 02:42:04 localhost ceph-osd[32858]: bluefs mount Nov 23 02:42:04 localhost ceph-osd[32858]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Nov 23 02:42:04 localhost ceph-osd[32858]: bluefs mount shared_bdev_used = 4718592 Nov 23 02:42:04 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: RocksDB version: 7.9.2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Git sha 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Compile date 2025-09-23 00:00:00 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: DB SUMMARY Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: DB Session ID: F8LN5U3KUYCOQ4JP56ZX Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: CURRENT file: CURRENT Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: IDENTITY file: IDENTITY Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Nov 23 
02:42:04 localhost ceph-osd[32858]: rocksdb: Options.error_if_exists: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.create_if_missing: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.flush_verify_memtable_count: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.env: 0x56377a0bbdc0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.fs: LegacyFileSystem Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.info_log: 0x56377ada1be0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_file_opening_threads: 16 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.statistics: (nil) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.use_fsync: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_log_file_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_manifest_file_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.log_file_time_to_roll: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.keep_log_file_num: 1000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.recycle_log_file_num: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.allow_fallocate: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.allow_mmap_reads: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.allow_mmap_writes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.use_direct_reads: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.create_missing_column_families: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.db_log_dir: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.wal_dir: db.wal Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_cache_numshardbits: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.WAL_ttl_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.WAL_size_limit_MB: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.manifest_preallocation_size: 4194304 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.is_fd_close_on_exec: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.advise_random_on_open: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.db_write_buffer_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_manager: 0x563779e10140 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.access_hint_on_compaction_start: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.random_access_max_buffer_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.use_adaptive_mutex: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.rate_limiter: (nil) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.wal_recovery_mode: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_thread_tracking: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_pipelined_write: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.unordered_write: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.allow_concurrent_memtable_write: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_thread_max_yield_usec: 100 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.write_thread_slow_yield_usec: 3 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.row_cache: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.wal_filter: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.avoid_flush_during_recovery: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.allow_ingest_behind: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.two_write_queues: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.manual_wal_flush: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.wal_compression: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.atomic_flush: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.persist_stats_to_disk: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_dbid_to_manifest: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.log_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.file_checksum_gen_factory: Unknown Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.best_efforts_recovery: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.allow_data_in_errors: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.db_host_id: __hostname__ Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enforce_single_del_contracts: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_background_jobs: 4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_background_compactions: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_subcompactions: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: 
rocksdb: Options.avoid_flush_during_shutdown: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.writable_file_max_buffer_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.delayed_write_rate : 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_total_wal_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.stats_dump_period_sec: 600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.stats_persist_period_sec: 600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.stats_history_buffer_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_open_files: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bytes_per_sync: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.wal_bytes_per_sync: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.strict_bytes_per_sync: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_readahead_size: 2097152 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_background_flushes: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Compression algorithms supported: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kZSTD supported: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kXpressCompression supported: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kBZip2Compression supported: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kLZ4Compression supported: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kZlibCompression supported: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kLZ4HCCompression supported: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: #011kSnappyCompression supported: 1 Nov 23 
02:42:04 localhost ceph-osd[32858]: rocksdb: Fast CRC32 supported: Supported on x86 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: DMutex implementation: pthread_mutex_t Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1da0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dfe2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 
block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 
localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 
02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1da0)#012 
cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dfe2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 
23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: 
leveldb.BytewiseComparator Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1da0)
 cache_index_and_filter_blocks: 1
 cache_index_and_filter_blocks_with_high_priority: 0
 pin_l0_filter_and_index_blocks_in_cache: 0
 pin_top_level_index_and_filter: 1
 index_type: 0
 data_block_index_type: 0
 index_shortening: 1
 data_block_hash_table_util_ratio: 0.750000
 checksum: 4
 no_block_cache: 0
 block_cache: 0x563779dfe2d0
 block_cache_name: BinnedLRUCache
 block_cache_options:
 capacity : 483183820
 num_shard_bits : 4
 strict_capacity_limit : 0
 high_pri_pool_ratio: 0.000
 block_cache_compressed: (nil)
 persistent_cache: (nil)
 block_size: 4096
 block_size_deviation: 10
 block_restart_interval: 16
 index_block_restart_interval: 1
 metadata_block_size: 4096
 partition_filters: 0
 use_delta_encoding: 1
 filter_policy: bloomfilter
 whole_key_filtering: 1
 verify_compression: 0
 read_amp_bytes_per_bit: 0
 format_version: 5
 enable_index_compression: 1
 block_align: 0
 max_auto_readahead_size: 262144
 prepopulate_block_cache: 0
 initial_auto_readahead_size: 8192
 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5ca664a883a097413de562d27899d1b2edc6953efcd039dc6718303be60fb8f/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1da0)
 cache_index_and_filter_blocks: 1
 cache_index_and_filter_blocks_with_high_priority: 0
 pin_l0_filter_and_index_blocks_in_cache: 0
 pin_top_level_index_and_filter: 1
 index_type: 0
 data_block_index_type: 0
 index_shortening: 1
 data_block_hash_table_util_ratio: 0.750000
 checksum: 4
 no_block_cache: 0
 block_cache: 0x563779dfe2d0
 block_cache_name: BinnedLRUCache
 block_cache_options:
 capacity : 483183820
 num_shard_bits : 4
 strict_capacity_limit : 0
 high_pri_pool_ratio: 0.000
 block_cache_compressed: (nil)
 persistent_cache: (nil)
 block_size: 4096
 block_size_deviation: 10
 block_restart_interval: 16
 index_block_restart_interval: 1
 metadata_block_size: 4096
 partition_filters: 0
 use_delta_encoding: 1
 filter_policy: bloomfilter
 whole_key_filtering: 1
 verify_compression: 0
 read_amp_bytes_per_bit: 0
 format_version: 5
 enable_index_compression: 1
 block_align: 0
 max_auto_readahead_size: 262144
 prepopulate_block_cache: 0
 initial_auto_readahead_size: 8192
 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5ca664a883a097413de562d27899d1b2edc6953efcd039dc6718303be60fb8f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.merge_operator: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1da0)
 cache_index_and_filter_blocks: 1
 cache_index_and_filter_blocks_with_high_priority: 0
 pin_l0_filter_and_index_blocks_in_cache: 0
 pin_top_level_index_and_filter: 1
 index_type: 0
 data_block_index_type: 0
 index_shortening: 1
 data_block_hash_table_util_ratio: 0.750000
 checksum: 4
 no_block_cache: 0
 block_cache: 0x563779dfe2d0
 block_cache_name: BinnedLRUCache
 block_cache_options:
 capacity : 483183820
 num_shard_bits : 4
 strict_capacity_limit : 0
 high_pri_pool_ratio: 0.000
 block_cache_compressed: (nil)
 persistent_cache: (nil)
 block_size: 4096
 block_size_deviation: 10
 block_restart_interval: 16
 index_block_restart_interval: 1
 metadata_block_size: 4096
 partition_filters: 0
 use_delta_encoding: 1
 filter_policy: bloomfilter
 whole_key_filtering: 1
 verify_compression: 0
 read_amp_bytes_per_bit: 0
 format_version: 5
 enable_index_compression: 1
 block_align: 0
 max_auto_readahead_size: 262144
 prepopulate_block_cache: 0
 initial_auto_readahead_size: 8192
 num_file_reads_for_auto_readahead: 2
Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 
32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1da0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dfe2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 
block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 
localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 
23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/c5ca664a883a097413de562d27899d1b2edc6953efcd039dc6718303be60fb8f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1da0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dfe2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 
02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost 
podman[33191]: 2025-11-23 07:42:04.594301855 +0000 UTC m=+0.028732580 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1fe0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x563779dff610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 
1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: 
flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1fe0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x563779dff610
  block_cache_name: BinnedLRUCache
  block_cache_options:
  capacity : 536870912
  num_shard_bits : 4
  strict_capacity_limit : 0
  high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: 
rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:630] --------------- Options for 
column family [O-2]: Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.merge_operator: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_filter_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.sst_partitioner_factory: None Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56377ada1fe0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x563779dff610
  block_cache_name: BinnedLRUCache
  block_cache_options:
  capacity : 536870912
  num_shard_bits : 4
  strict_capacity_limit : 0
  high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.write_buffer_size: 16777216 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number: 64 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression: LZ4 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression: Disabled Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.num_levels: 7 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.level: 
32767 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.strategy: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.enabled: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.arena_block_size: 1048576 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 
02:42:04 localhost ceph-osd[32858]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_support: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.bloom_locality: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.max_successive_merges: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.force_consistency_checks: 1 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.ttl: 2592000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_files: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.min_blob_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_size: 268435456 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: 
Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Nov 23 02:42:04 
localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: f0255131-8bda-4abb-b8a5-2cf651f3fb8a Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883724699645, "job": 1, "event": "recovery_started", "wal_files": [31]} Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883724709952, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": 
"leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763883724, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f0255131-8bda-4abb-b8a5-2cf651f3fb8a", "db_session_id": "F8LN5U3KUYCOQ4JP56ZX", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Nov 23 02:42:04 localhost podman[33191]: 2025-11-23 07:42:04.711509953 +0000 UTC m=+0.145940688 container init 7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_haibt, io.openshift.expose-services=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=553, vcs-type=git, maintainer=Guillaume Abrioux , version=7, io.buildah.version=1.33.12, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: EVENT_LOG_v1 {"time_micros": 
1763883724718501, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1607, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 466, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763883724, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f0255131-8bda-4abb-b8a5-2cf651f3fb8a", "db_session_id": "F8LN5U3KUYCOQ4JP56ZX", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Nov 23 02:42:04 localhost podman[33191]: 2025-11-23 07:42:04.719366699 +0000 UTC m=+0.153797464 container start 7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_haibt, vcs-type=git, description=Red Hat Ceph Storage 7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, 
io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., name=rhceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, architecture=x86_64) Nov 23 02:42:04 localhost podman[33191]: 2025-11-23 07:42:04.719623297 +0000 UTC m=+0.154054062 container attach 7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_haibt, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, release=553, vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, CEPH_POINT_RELEASE=, name=rhceph, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 02:42:04 
localhost ceph-osd[32858]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883724722359, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763883724, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f0255131-8bda-4abb-b8a5-2cf651f3fb8a", "db_session_id": "F8LN5U3KUYCOQ4JP56ZX", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763883724730714, "job": 1, "event": "recovery_finished"} Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Nov 23 02:42:04 localhost 
ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x563779e6a700 Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: DB pointer 0x56377acf7a00 Nov 23 02:42:04 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Nov 23 02:42:04 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _upgrade_super from 4, latest 4 Nov 23 02:42:04 localhost ceph-osd[32858]: bluestore(/var/lib/ceph/osd/ceph-3) _upgrade_super done Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 02:42:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.010 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.010 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.010 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.010 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 
3.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 
0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 
0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012 Nov 23 02:42:04 localhost ceph-osd[32858]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs Nov 23 02:42:04 localhost ceph-osd[32858]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello Nov 23 02:42:04 localhost ceph-osd[32858]: _get_class not permitted to load lua Nov 23 02:42:04 localhost ceph-osd[32858]: _get_class not permitted to load sdk Nov 23 02:42:04 localhost ceph-osd[32858]: _get_class not permitted to load test_remote_reads Nov 23 02:42:04 localhost ceph-osd[32858]: osd.3 0 crush map has features 288232575208783872, adjusting msgr requires for clients Nov 23 02:42:04 localhost ceph-osd[32858]: osd.3 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons Nov 23 02:42:04 localhost ceph-osd[32858]: osd.3 0 crush map has features 288232575208783872, adjusting msgr requires for osds Nov 23 02:42:04 localhost ceph-osd[32858]: osd.3 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature Nov 23 02:42:04 localhost ceph-osd[32858]: osd.3 0 load_pgs Nov 23 02:42:04 localhost ceph-osd[32858]: osd.3 0 load_pgs opened 0 pgs Nov 23 02:42:04 localhost ceph-osd[32858]: osd.3 0 log_to_monitors true Nov 23 02:42:04 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3[32854]: 2025-11-23T07:42:04.784+0000 7f3b1596aa80 -1 osd.3 0 log_to_monitors true Nov 23 02:42:04 localhost ceph-osd[31905]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 35.258 iops: 9025.931 elapsed_sec: 0.332 Nov 23 02:42:04 localhost ceph-osd[31905]: 
log_channel(cluster) log [WRN] : OSD bench result of 9025.930813 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd]. Nov 23 02:42:04 localhost ceph-osd[31905]: osd.0 0 waiting for initial osdmap Nov 23 02:42:04 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0[31901]: 2025-11-23T07:42:04.786+0000 7ff7b8bf8640 -1 osd.0 0 waiting for initial osdmap Nov 23 02:42:04 localhost ceph-osd[31905]: osd.0 11 crush map has features 288514050185494528, adjusting msgr requires for clients Nov 23 02:42:04 localhost ceph-osd[31905]: osd.0 11 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons Nov 23 02:42:04 localhost ceph-osd[31905]: osd.0 11 crush map has features 3314932999778484224, adjusting msgr requires for osds Nov 23 02:42:04 localhost ceph-osd[31905]: osd.0 11 check_osdmap_features require_osd_release unknown -> reef Nov 23 02:42:04 localhost ceph-osd[31905]: osd.0 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Nov 23 02:42:04 localhost ceph-osd[31905]: osd.0 11 set_numa_affinity not setting numa affinity Nov 23 02:42:04 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-0[31901]: 2025-11-23T07:42:04.802+0000 7ff7b3a0d640 -1 osd.0 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Nov 23 02:42:04 localhost ceph-osd[31905]: osd.0 11 _collect_metadata loop3: no unique device id for loop3: fallback method has no model nor serial Nov 23 02:42:05 localhost nifty_haibt[33206]: { Nov 23 02:42:05 localhost nifty_haibt[33206]: "445e9929-9fbe-437e-be2a-5f2d52ad535b": { Nov 23 02:42:05 localhost nifty_haibt[33206]: "ceph_fsid": "46550e70-79cb-5f55-bf6d-1204b97e083b", Nov 23 
02:42:05 localhost nifty_haibt[33206]: "device": "/dev/mapper/ceph_vg1-ceph_lv1", Nov 23 02:42:05 localhost nifty_haibt[33206]: "osd_id": 3, Nov 23 02:42:05 localhost nifty_haibt[33206]: "osd_uuid": "445e9929-9fbe-437e-be2a-5f2d52ad535b", Nov 23 02:42:05 localhost nifty_haibt[33206]: "type": "bluestore" Nov 23 02:42:05 localhost nifty_haibt[33206]: }, Nov 23 02:42:05 localhost nifty_haibt[33206]: "f9b87c6d-5b03-42fe-8ffc-4e3a3ac47b90": { Nov 23 02:42:05 localhost nifty_haibt[33206]: "ceph_fsid": "46550e70-79cb-5f55-bf6d-1204b97e083b", Nov 23 02:42:05 localhost nifty_haibt[33206]: "device": "/dev/mapper/ceph_vg0-ceph_lv0", Nov 23 02:42:05 localhost nifty_haibt[33206]: "osd_id": 0, Nov 23 02:42:05 localhost nifty_haibt[33206]: "osd_uuid": "f9b87c6d-5b03-42fe-8ffc-4e3a3ac47b90", Nov 23 02:42:05 localhost nifty_haibt[33206]: "type": "bluestore" Nov 23 02:42:05 localhost nifty_haibt[33206]: } Nov 23 02:42:05 localhost nifty_haibt[33206]: } Nov 23 02:42:05 localhost systemd[1]: libpod-7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2.scope: Deactivated successfully. 
Nov 23 02:42:05 localhost podman[33191]: 2025-11-23 07:42:05.274647479 +0000 UTC m=+0.709078234 container died 7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_haibt, com.redhat.component=rhceph-container, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12) Nov 23 02:42:05 localhost systemd[1]: tmp-crun.OpBkAI.mount: Deactivated successfully. Nov 23 02:42:05 localhost systemd[1]: var-lib-containers-storage-overlay-c5ca664a883a097413de562d27899d1b2edc6953efcd039dc6718303be60fb8f-merged.mount: Deactivated successfully. 
Nov 23 02:42:05 localhost podman[33458]: 2025-11-23 07:42:05.432665754 +0000 UTC m=+0.144915224 container remove 7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_haibt, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, release=553, description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 02:42:05 localhost systemd[1]: libpod-conmon-7a7f155cebe42fe3d3cf8cf0b36f6031bc59231f4950fa9f6c894910206e38e2.scope: Deactivated successfully. 
Nov 23 02:42:05 localhost ceph-osd[31905]: osd.0 12 state: booting -> active Nov 23 02:42:05 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : purged_snaps scrub starts Nov 23 02:42:05 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : purged_snaps scrub ok Nov 23 02:42:06 localhost ceph-osd[32858]: osd.3 0 done with init, starting boot process Nov 23 02:42:06 localhost ceph-osd[32858]: osd.3 0 start_boot Nov 23 02:42:06 localhost ceph-osd[32858]: osd.3 0 maybe_override_options_for_qos osd_max_backfills set to 1 Nov 23 02:42:06 localhost ceph-osd[32858]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active set to 0 Nov 23 02:42:06 localhost ceph-osd[32858]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3 Nov 23 02:42:06 localhost ceph-osd[32858]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10 Nov 23 02:42:06 localhost ceph-osd[32858]: osd.3 0 bench count 12288000 bsize 4 KiB Nov 23 02:42:08 localhost ceph-osd[31905]: osd.0 14 crush map has features 288514051259236352, adjusting msgr requires for clients Nov 23 02:42:08 localhost ceph-osd[31905]: osd.0 14 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons Nov 23 02:42:08 localhost ceph-osd[31905]: osd.0 14 crush map has features 3314933000852226048, adjusting msgr requires for osds Nov 23 02:42:08 localhost podman[33584]: 2025-11-23 07:42:08.367925668 +0000 UTC m=+0.090592780 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., version=7, architecture=x86_64, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, distribution-scope=public, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, release=553) Nov 23 02:42:08 localhost podman[33584]: 2025-11-23 07:42:08.474055972 +0000 UTC m=+0.196723064 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, ceph=True, vcs-type=git, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the 
latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 23 02:42:09 localhost ceph-osd[32858]: osd.3 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 26.447 iops: 6770.493 elapsed_sec: 0.443 Nov 23 02:42:09 localhost ceph-osd[32858]: log_channel(cluster) log [WRN] : OSD bench result of 6770.493371 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.3. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd]. Nov 23 02:42:09 localhost ceph-osd[32858]: osd.3 0 waiting for initial osdmap Nov 23 02:42:09 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3[32854]: 2025-11-23T07:42:09.570+0000 7f3b118e9640 -1 osd.3 0 waiting for initial osdmap Nov 23 02:42:09 localhost ceph-osd[32858]: osd.3 15 crush map has features 288514051259236352, adjusting msgr requires for clients Nov 23 02:42:09 localhost ceph-osd[32858]: osd.3 15 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons Nov 23 02:42:09 localhost ceph-osd[32858]: osd.3 15 crush map has features 3314933000852226048, adjusting msgr requires for osds Nov 23 02:42:09 localhost ceph-osd[32858]: osd.3 15 check_osdmap_features require_osd_release unknown -> reef Nov 23 02:42:09 localhost ceph-osd[32858]: osd.3 15 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Nov 23 02:42:09 localhost ceph-osd[32858]: osd.3 15 set_numa_affinity not setting numa affinity Nov 23 02:42:09 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-osd-3[32854]: 2025-11-23T07:42:09.588+0000 7f3b0cf13640 -1 osd.3 15 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Nov 23 02:42:09 localhost ceph-osd[32858]: osd.3 15 _collect_metadata loop4: no unique device 
id for loop4: fallback method has no model nor serial Nov 23 02:42:09 localhost ceph-osd[32858]: osd.3 16 state: booting -> active Nov 23 02:42:10 localhost podman[33784]: Nov 23 02:42:10 localhost podman[33784]: 2025-11-23 07:42:10.215155823 +0000 UTC m=+0.072350124 container create 7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_easley, architecture=x86_64, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, version=7, vcs-type=git, io.openshift.expose-services=, name=rhceph, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 02:42:10 localhost systemd[1]: Started libpod-conmon-7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970.scope. Nov 23 02:42:10 localhost systemd[1]: Started libcrun container. 
Nov 23 02:42:10 localhost podman[33784]: 2025-11-23 07:42:10.283334145 +0000 UTC m=+0.140528446 container init 7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_easley, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_BRANCH=main, version=7, release=553, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., vcs-type=git, name=rhceph, GIT_CLEAN=True, distribution-scope=public, io.buildah.version=1.33.12) Nov 23 02:42:10 localhost podman[33784]: 2025-11-23 07:42:10.185292455 +0000 UTC m=+0.042486766 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 02:42:10 localhost podman[33784]: 2025-11-23 07:42:10.29533896 +0000 UTC m=+0.152533271 container start 7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_easley, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, 
GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, vcs-type=git, version=7, GIT_CLEAN=True, ceph=True, distribution-scope=public, description=Red Hat Ceph Storage 7) Nov 23 02:42:10 localhost podman[33784]: 2025-11-23 07:42:10.29564728 +0000 UTC m=+0.152841591 container attach 7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_easley, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph 
ceph) Nov 23 02:42:10 localhost dreamy_easley[33800]: 167 167 Nov 23 02:42:10 localhost systemd[1]: libpod-7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970.scope: Deactivated successfully. Nov 23 02:42:10 localhost podman[33784]: 2025-11-23 07:42:10.299188711 +0000 UTC m=+0.156383072 container died 7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_easley, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, ceph=True, version=7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, name=rhceph, distribution-scope=public, build-date=2025-09-24T08:57:55, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 02:42:10 localhost systemd[1]: var-lib-containers-storage-overlay-8cc95b34d97b5e8919b6489676deb70e99b6f013ec8fed680172a938a7859e1f-merged.mount: Deactivated successfully. 
Nov 23 02:42:10 localhost podman[33805]: 2025-11-23 07:42:10.469152299 +0000 UTC m=+0.158150261 container remove 7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_easley, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.33.12, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True) Nov 23 02:42:10 localhost systemd[1]: libpod-conmon-7e1181067317350c95af75796f0e5ac5b53ede8be2bfdff478a5e1196a8c3970.scope: Deactivated successfully. 
Nov 23 02:42:10 localhost podman[33826]: Nov 23 02:42:10 localhost podman[33826]: 2025-11-23 07:42:10.684243802 +0000 UTC m=+0.085273980 container create 0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_chandrasekhar, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, name=rhceph, RELEASE=main, GIT_BRANCH=main, io.openshift.expose-services=, vendor=Red Hat, Inc., release=553, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 02:42:10 localhost systemd[1]: Started libpod-conmon-0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536.scope. Nov 23 02:42:10 localhost systemd[1]: Started libcrun container. 
Nov 23 02:42:10 localhost podman[33826]: 2025-11-23 07:42:10.642817783 +0000 UTC m=+0.043847961 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 02:42:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/892bd733081355cf3cdab8fcc30f1a667777263e53d9867e0e9a2d5b513c7016/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/892bd733081355cf3cdab8fcc30f1a667777263e53d9867e0e9a2d5b513c7016/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/892bd733081355cf3cdab8fcc30f1a667777263e53d9867e0e9a2d5b513c7016/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 23 02:42:10 localhost podman[33826]: 2025-11-23 07:42:10.799940579 +0000 UTC m=+0.200970757 container init 0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_chandrasekhar, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, 
build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main) Nov 23 02:42:10 localhost podman[33826]: 2025-11-23 07:42:10.813581949 +0000 UTC m=+0.214612117 container start 0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_chandrasekhar, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.33.12, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55) Nov 23 02:42:10 localhost podman[33826]: 2025-11-23 07:42:10.814022134 +0000 UTC m=+0.215052302 container attach 0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_chandrasekhar, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, distribution-scope=public, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, version=7, RELEASE=main, maintainer=Guillaume Abrioux , release=553) Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: [ Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: { Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "available": false, Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "ceph_device": false, Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "device_id": "QEMU_DVD-ROM_QM00001", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "lsm_data": {}, Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "lvs": [], Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "path": "/dev/sr0", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "rejected_reasons": [ Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "Has a FileSystem", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "Insufficient space (<5GB)" Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: ], Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "sys_api": { Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "actuators": null, Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "device_nodes": "sr0", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "human_readable_size": "482.00 KB", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "id_bus": "ata",
Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "model": "QEMU DVD-ROM", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "nr_requests": "2", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "partitions": {}, Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "path": "/dev/sr0", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "removable": "1", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "rev": "2.5+", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "ro": "0", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "rotational": "1", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "sas_address": "", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "sas_device_handle": "", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "scheduler_mode": "mq-deadline", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "sectors": 0, Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "sectorsize": "2048", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "size": 493568.0, Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "support_discard": "0", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "type": "disk", Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: "vendor": "QEMU" Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: } Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: } Nov 23 02:42:11 localhost crazy_chandrasekhar[33841]: ] Nov 23 02:42:11 localhost systemd[1]: libpod-0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536.scope: Deactivated successfully.
Nov 23 02:42:11 localhost podman[33826]: 2025-11-23 07:42:11.58359845 +0000 UTC m=+0.984628678 container died 0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_chandrasekhar, ceph=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, version=7) Nov 23 02:42:11 localhost systemd[1]: var-lib-containers-storage-overlay-892bd733081355cf3cdab8fcc30f1a667777263e53d9867e0e9a2d5b513c7016-merged.mount: Deactivated successfully. 
Nov 23 02:42:11 localhost podman[35138]: 2025-11-23 07:42:11.661368977 +0000 UTC m=+0.071606770 container remove 0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_chandrasekhar, io.openshift.expose-services=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, release=553, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, version=7, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, distribution-scope=public) Nov 23 02:42:11 localhost systemd[1]: libpod-conmon-0d44f547522e34e731e763620110453c8e450a0b8f09fb5c5631085bb489c536.scope: Deactivated successfully. Nov 23 02:42:11 localhost ceph-osd[32858]: osd.3 pg_epoch: 17 pg[1.0( empty local-lis/les=0/0 n=0 ec=14/14 lis/c=0/0 les/c/f=0/0/0 sis=17) [2,4,3] r=2 lpr=17 pi=[14,17)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 02:42:14 localhost sshd[35168]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:42:20 localhost systemd[26178]: Starting Mark boot as successful... Nov 23 02:42:20 localhost systemd[26178]: Finished Mark boot as successful. 
Nov 23 02:42:20 localhost podman[35269]: 2025-11-23 07:42:20.878276651 +0000 UTC m=+0.080956885 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux , RELEASE=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=553) Nov 23 02:42:21 localhost podman[35269]: 2025-11-23 07:42:21.014336896 +0000 UTC m=+0.217017120 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, 
GIT_CLEAN=True, RELEASE=main, release=553, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , version=7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True) Nov 23 02:43:07 localhost sshd[35349]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:43:22 localhost sshd[35381]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:43:22 localhost podman[35453]: 2025-11-23 07:43:22.862127591 +0000 UTC m=+0.092373554 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, name=rhceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, 
release=553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 02:43:22 localhost podman[35453]: 2025-11-23 07:43:22.990397815 +0000 UTC m=+0.220643618 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, version=7, RELEASE=main, io.openshift.expose-services=, distribution-scope=public, maintainer=Guillaume Abrioux , vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, name=rhceph, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 02:43:31 localhost systemd[1]: session-13.scope: Deactivated successfully. Nov 23 02:43:31 localhost systemd[1]: session-13.scope: Consumed 21.505s CPU time. Nov 23 02:43:31 localhost systemd-logind[761]: Session 13 logged out. Waiting for processes to exit. Nov 23 02:43:31 localhost systemd-logind[761]: Removed session 13. 
Nov 23 02:43:48 localhost sshd[35595]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:44:46 localhost sshd[35672]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:44:53 localhost sshd[35674]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:44:58 localhost sshd[35676]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:45:36 localhost systemd[26178]: Created slice User Background Tasks Slice. Nov 23 02:45:36 localhost systemd[26178]: Starting Cleanup of User's Temporary Files and Directories... Nov 23 02:45:36 localhost systemd[26178]: Finished Cleanup of User's Temporary Files and Directories. Nov 23 02:46:21 localhost sshd[35755]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:46:32 localhost sshd[35835]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:46:56 localhost sshd[35837]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:46:56 localhost systemd-logind[761]: New session 27 of user zuul. Nov 23 02:46:56 localhost systemd[1]: Started Session 27 of User zuul. 
Nov 23 02:46:56 localhost python3[35885]: ansible-ansible.legacy.ping Invoked with data=pong Nov 23 02:46:57 localhost python3[35930]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 02:46:58 localhost python3[35950]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532585.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None Nov 23 02:46:58 localhost python3[36006]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:46:58 localhost python3[36049]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763884018.3521554-66724-13791909958476/source _original_basename=tmpq65hmubd follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:46:59 localhost python3[36079]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:46:59 localhost python3[36095]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:47:00 localhost python3[36111]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:47:00 localhost python3[36127]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUOGMsJ/AVVgY0xZSxHG1Oo8GuV12NJwOZUmjQnvkVRDmS001A30AXXR0N8dst9ZQnVZ7t0kBrbCnuEI8SNNpeziiyScPCPK73L2zyV8Js+qKXswkPpolOCRy92kOph1OZYuXdhUodUSdBoJ4mwf2s5nJhAJmH6XvfiqHUaCRd9Gp9NgU9FvjG41eO7BwkjRpKTg2jZAy21PGLDeWxRI5qxEpDgdXeW0riuuVHj1FGKKfC1wAe7xB5wykXcRkuog4VlSx2/V+mPpSMDZ1POsAxKOAMYkxfj+qoDIBfDc0R1cbxFehgmCHc8a4z+IjP5eiUvX3HjeV7ZBTR5hkYKHAJfeU6Cj5HQsTwwJrc+oHuosokgJ/ct0+WpvqhalUoL8dpoLUY6PQq+5CeOJrpZeLzXZTIPLWTA4jbbkHa/SwmAk07+hpxpFz3NhSfpT4GfOgKnowPfo+3mJMDAetTMZpizTdfPfc13gl7Zyqb9cB8lgx1IVzN6ZrxPyvPqj05uPk= zuul-build-sshkey#012 regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None 
serole=None selevel=None setype=None attributes=None Nov 23 02:47:01 localhost python3[36141]: ansible-ping Invoked with data=pong Nov 23 02:47:12 localhost sshd[36144]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:47:12 localhost systemd-logind[761]: New session 28 of user tripleo-admin. Nov 23 02:47:12 localhost systemd[1]: Created slice User Slice of UID 1003. Nov 23 02:47:12 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Nov 23 02:47:12 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Nov 23 02:47:12 localhost systemd[1]: Starting User Manager for UID 1003... Nov 23 02:47:12 localhost systemd[36148]: Queued start job for default target Main User Target. Nov 23 02:47:12 localhost systemd[36148]: Created slice User Application Slice. Nov 23 02:47:12 localhost systemd[36148]: Started Mark boot as successful after the user session has run 2 minutes. Nov 23 02:47:12 localhost systemd[36148]: Started Daily Cleanup of User's Temporary Directories. Nov 23 02:47:12 localhost systemd[36148]: Reached target Paths. Nov 23 02:47:12 localhost systemd[36148]: Reached target Timers. Nov 23 02:47:12 localhost systemd[36148]: Starting D-Bus User Message Bus Socket... Nov 23 02:47:12 localhost systemd[36148]: Starting Create User's Volatile Files and Directories... Nov 23 02:47:12 localhost systemd[36148]: Listening on D-Bus User Message Bus Socket. Nov 23 02:47:12 localhost systemd[36148]: Finished Create User's Volatile Files and Directories. Nov 23 02:47:12 localhost systemd[36148]: Reached target Sockets. Nov 23 02:47:12 localhost systemd[36148]: Reached target Basic System. Nov 23 02:47:12 localhost systemd[36148]: Reached target Main User Target. Nov 23 02:47:12 localhost systemd[36148]: Startup finished in 117ms. Nov 23 02:47:12 localhost systemd[1]: Started User Manager for UID 1003. Nov 23 02:47:12 localhost systemd[1]: Started Session 28 of User tripleo-admin. 
Nov 23 02:47:13 localhost python3[36210]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d Nov 23 02:47:18 localhost python3[36230]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config Nov 23 02:47:19 localhost python3[36246]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None Nov 23 02:47:19 localhost python3[36294]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.g4zi971qtmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:47:20 localhost python3[36324]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.g4zi971qtmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! 
marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:47:21 localhost python3[36340]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.g4zi971qtmphosts insertbefore=BOF block=172.17.0.106 np0005532584.localdomain np0005532584#012172.18.0.106 np0005532584.storage.localdomain np0005532584.storage#012172.20.0.106 np0005532584.storagemgmt.localdomain np0005532584.storagemgmt#012172.17.0.106 np0005532584.internalapi.localdomain np0005532584.internalapi#012172.19.0.106 np0005532584.tenant.localdomain np0005532584.tenant#012192.168.122.106 np0005532584.ctlplane.localdomain np0005532584.ctlplane#012172.17.0.107 np0005532585.localdomain np0005532585#012172.18.0.107 np0005532585.storage.localdomain np0005532585.storage#012172.20.0.107 np0005532585.storagemgmt.localdomain np0005532585.storagemgmt#012172.17.0.107 np0005532585.internalapi.localdomain np0005532585.internalapi#012172.19.0.107 np0005532585.tenant.localdomain np0005532585.tenant#012192.168.122.107 np0005532585.ctlplane.localdomain np0005532585.ctlplane#012172.17.0.108 np0005532586.localdomain np0005532586#012172.18.0.108 np0005532586.storage.localdomain np0005532586.storage#012172.20.0.108 np0005532586.storagemgmt.localdomain np0005532586.storagemgmt#012172.17.0.108 np0005532586.internalapi.localdomain np0005532586.internalapi#012172.19.0.108 np0005532586.tenant.localdomain np0005532586.tenant#012192.168.122.108 np0005532586.ctlplane.localdomain np0005532586.ctlplane#012172.17.0.103 np0005532581.localdomain np0005532581#012172.18.0.103 np0005532581.storage.localdomain np0005532581.storage#012172.20.0.103 np0005532581.storagemgmt.localdomain np0005532581.storagemgmt#012172.17.0.103 np0005532581.internalapi.localdomain np0005532581.internalapi#012172.19.0.103 np0005532581.tenant.localdomain np0005532581.tenant#012192.168.122.103 
np0005532581.ctlplane.localdomain np0005532581.ctlplane#012172.17.0.104 np0005532582.localdomain np0005532582#012172.18.0.104 np0005532582.storage.localdomain np0005532582.storage#012172.20.0.104 np0005532582.storagemgmt.localdomain np0005532582.storagemgmt#012172.17.0.104 np0005532582.internalapi.localdomain np0005532582.internalapi#012172.19.0.104 np0005532582.tenant.localdomain np0005532582.tenant#012192.168.122.104 np0005532582.ctlplane.localdomain np0005532582.ctlplane#012172.17.0.105 np0005532583.localdomain np0005532583#012172.18.0.105 np0005532583.storage.localdomain np0005532583.storage#012172.20.0.105 np0005532583.storagemgmt.localdomain np0005532583.storagemgmt#012172.17.0.105 np0005532583.internalapi.localdomain np0005532583.internalapi#012172.19.0.105 np0005532583.tenant.localdomain np0005532583.tenant#012192.168.122.105 np0005532583.ctlplane.localdomain np0005532583.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012192.168.122.99 overcloud.ctlplane.localdomain#012172.18.0.204 overcloud.storage.localdomain#012172.20.0.141 overcloud.storagemgmt.localdomain#012172.17.0.224 overcloud.internalapi.localdomain#012172.21.0.154 overcloud.localdomain#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:47:22 localhost python3[36356]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.g4zi971qtmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:47:22 localhost python3[36373]: ansible-file Invoked with path=/tmp/ansible.g4zi971qtmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:47:23 localhost python3[36389]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:47:24 localhost python3[36406]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 23 02:47:28 localhost python3[36426]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:47:29 localhost python3[36472]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] 
download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 23 02:47:57 localhost sshd[37265]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:48:04 localhost sshd[37299]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:48:08 localhost sshd[37322]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:48:40 localhost kernel: SELinux: Converting 2699 SID table entries... Nov 23 02:48:40 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 23 02:48:40 localhost kernel: SELinux: policy capability open_perms=1 Nov 23 02:48:40 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 23 02:48:40 localhost kernel: SELinux: policy capability always_check_network=0 Nov 23 02:48:40 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 23 02:48:40 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 23 02:48:40 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 23 02:48:40 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=6 res=1 Nov 23 02:48:40 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 23 02:48:40 localhost systemd[1]: Starting man-db-cache-update.service... Nov 23 02:48:40 localhost systemd[1]: Reloading. Nov 23 02:48:40 localhost systemd-rc-local-generator[37668]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:48:40 localhost systemd-sysv-generator[37671]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 02:48:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:48:41 localhost systemd[1]: Queuing reload/restart jobs for marked units… Nov 23 02:48:41 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 23 02:48:41 localhost systemd[1]: Finished man-db-cache-update.service. Nov 23 02:48:41 localhost systemd[1]: run-r6b924f3f86914144a6454e657bd0b87d.service: Deactivated successfully. Nov 23 02:48:42 localhost python3[38110]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:48:44 localhost python3[38249]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 02:48:44 localhost systemd[1]: Reloading. Nov 23 02:48:44 localhost systemd-rc-local-generator[38275]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:48:44 localhost systemd-sysv-generator[38280]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:48:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 02:48:45 localhost python3[38303]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:48:46 localhost python3[38319]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:48:47 localhost python3[38336]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Nov 23 02:48:47 localhost python3[38354]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:48:48 localhost python3[38372]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:48:48 localhost python3[38390]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 02:48:48 localhost systemd[1]: Reloading Network Manager... 
Nov 23 02:48:48 localhost NetworkManager[5975]: [1763884128.7652] audit: op="reload" arg="0" pid=38393 uid=0 result="success" Nov 23 02:48:48 localhost NetworkManager[5975]: [1763884128.7661] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf)) Nov 23 02:48:48 localhost NetworkManager[5975]: [1763884128.7662] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged Nov 23 02:48:48 localhost systemd[1]: Reloaded Network Manager. Nov 23 02:48:50 localhost python3[38409]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:48:50 localhost python3[38426]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 02:48:50 localhost python3[38444]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 02:48:51 localhost python3[38460]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:48:52 localhost python3[38476]: ansible-tempfile Invoked with state=file prefix=ansible. 
suffix= path=None Nov 23 02:48:52 localhost python3[38492]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 02:48:53 localhost python3[38508]: ansible-blockinfile Invoked with path=/tmp/ansible.wmck7r26 block=[192.168.122.106]*,[np0005532584.ctlplane.localdomain]*,[172.17.0.106]*,[np0005532584.internalapi.localdomain]*,[172.18.0.106]*,[np0005532584.storage.localdomain]*,[172.20.0.106]*,[np0005532584.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005532584.tenant.localdomain]*,[np0005532584.localdomain]*,[np0005532584]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3OrbPXlomvlluk5pGQwXwJu+cR1IMLHg5EnGcI5epB1SB6q/EzlEo5+bOYmmvILsoesUzBIBq21mRhn1Wi2yjlys0pArFDqiLkUBvTW9ro6MKci9Smc12m7AkLus6UO6h3pzqcOdRZQ3KOQDL/83yYJVBCJyqlISXWzzHJpGRVnZHeT4CgKZ1nG5UEvOrtPXRAVWkz3v5TghJrYXvWaPQPmWcEy1rfhCjkCfQY++JB/Dlgammmd1+ZldadeXQi1b2X02a6GFyW0pUMFLjAP7Wr+KcRa5FIPmGwsPuc1NhveAH6zyLrabrh7jPR5O0tBjz9KcNYXbQmJetGt9ZWzFsl0qzXrvI38q5RlGptbqg0iSez61VBAUtnfs33hnYc3dvzJKXReR76PoU3yu/tLrhdK6szqIVsMdw2LGEro7l3KKMKXHSpi8n77fH8ICiU3F5Oif+nvS/e7xr4LccSEnFEHA9PdNxOWxJYLcxTQCt3BkNFrWw4oB1LiDsn98HlS8=#012[192.168.122.107]*,[np0005532585.ctlplane.localdomain]*,[172.17.0.107]*,[np0005532585.internalapi.localdomain]*,[172.18.0.107]*,[np0005532585.storage.localdomain]*,[172.20.0.107]*,[np0005532585.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005532585.tenant.localdomain]*,[np0005532585.localdomain]*,[np0005532585]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCU6ocW8HWtJJyWPSFUqcN5z70XYnNrE5KeWh/VJ4bDkpVePpxxcdD8r8cKL121q0MKPRgia3jLqnKz+o4MH3AqTAWCZamBc1+ePq9OvZDenK69byea8TM176uYzfePjNlud4LSZ6lfkgneO5jeNE6/RcHgBc8Me+2mlzpavioA814r6Ci6hFaEIOS1Zd2b/yKzI4QRl6xg/aJKvlIe9w3G3BvKOG5pixPx2ng4wYc0OMtJb9ItJgZLY92GGuvVRwn9e0D4lab84+x/Nn3XatQdqU69ev7da/bQCUeBivyEZo03olh56YxCKvNfG3ZYwwhMTn9Hg/EdnwrGHYHj0ZgfSR1+Dzvnk0WW/MRs0276Ojj5O0hhnlaAh5n97W6fgHldGKvdEafYeD602C1Zkd+ISqF13W56MWhtUhiUsdUHShnpM/EBOITg6mTDFP1i/qMS0PjRaCzBpdqpJIoKzQpsi4Z3QTHTZ7uK/lqOEaE/wqXHuYlMKcTuOuX33gIp28k=#012[192.168.122.108]*,[np0005532586.ctlplane.localdomain]*,[172.17.0.108]*,[np0005532586.internalapi.localdomain]*,[172.18.0.108]*,[np0005532586.storage.localdomain]*,[172.20.0.108]*,[np0005532586.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005532586.tenant.localdomain]*,[np0005532586.localdomain]*,[np0005532586]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD6U4JggC29IKqxQ7GjhK23AehQb1S2zLryOxLwLEs9rP0qOZpJ9wR1VsBNLXDCmoRVTsH2+3V00hmkvlanKUuzgmLO61hdur+5NQD0xHnY7lOLpOoyR7hJiMuHj/nRgBLWY2OB8Gim121dgfuc2zRF92igDYe65Uf0et83vWlgRmc7KlziaJ91iVcBUmhGYf3Ij7QxfhQH5TTnGoQizdiBpuP+yVuU2AepbvQ8ZFvzioCwzWAVu/xfdRFp9QyLT4JP1jM6dadTjD5RUAjRL6qR1tLXVq/rvqtXSL8ruBSYm3NCOys9RtdrNolZ7frd+zmvF+VzMNLtlRxiuy1ReR+ZO3felB+4TwfEfLZ+DqE1s3+ksCQH/sVCrxzFsRz5lamWG3p78ZBWTiQ/7WdJS1dQOHz+pKNSSW/NYMIqitxsCsEWPJLq/EWoHVxvjREucCb5YvWHPKOv5RLlbm5lSHFLuFVV8O3AAzD/3JsjTbKGOjJhmtxPCgEy7RPqtIUX90s=#012[192.168.122.103]*,[np0005532581.ctlplane.localdomain]*,[172.17.0.103]*,[np0005532581.internalapi.localdomain]*,[172.18.0.103]*,[np0005532581.storage.localdomain]*,[172.20.0.103]*,[np0005532581.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005532581.tenant.localdomain]*,[np0005532581.localdomain]*,[np0005532581]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDRibSMIP5+E9lJWuaKDEuCaJoGhGPTqff+o8SP2Twk+NhPOa5FC7WQhHPLXVhKAtlCX60ckYE53Q/H/RVRZ55JdWQLSdY/1tQCD6c0Ry6N+UD+mxo9iN9cHk6vd6J5kJu+v/gBEmFY1A9pjzsD1CTR8gZJHZFqbUTzXrKkoUjK3Kqa8UtvzyhgYQtYIaUwaf1z7CMNQ3A4EaGVKyRsVwb11jlaT9fjB43E3tp9p5EG6PPJEGux/Xea6iHnhSwZHpkD/ylneDOkBbGvYKhL33bpXMcbuHy32jAFr+2Q07sKvgy/b5/f/nTgNCyxEIpoXUbEhX+Vlh+gycU7KJw6FRyR3dQFjooV97NQ/oov2VP9DnTObziZA8lhaJ20ChTfDVUyvFCFi3dKgBUPCeNWCGI69eNHu3dQcwCNJ3kANqhHdkYpBd00PVBritJfxfzH1DCLo0I9CSi1buWYhein9VHZWtzePv/+ucWERRIo+J04QPkV+6P6vgOTRl5U75RctJU=#012[192.168.122.104]*,[np0005532582.ctlplane.localdomain]*,[172.17.0.104]*,[np0005532582.internalapi.localdomain]*,[172.18.0.104]*,[np0005532582.storage.localdomain]*,[172.20.0.104]*,[np0005532582.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005532582.tenant.localdomain]*,[np0005532582.localdomain]*,[np0005532582]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0v47OVdr7YS/5xSUmMc7u26O7OwPomkdDR6s8rrcencbx7seRSeU00QGeRQcJJ023bD3xk26W8iiJTRUDkYSy//cSfHODdDy+CNEfDUTkGzIjiApoLi2b+S4J6wcAldMsj02MZmx67vUHyM5Qwok+22XqopryL8BiGPJbnoUcZy773f5OKPPMNuj3Fyb7jd5mrC7awK4NniZHyHPYBQeBa234HL42fRjcOqCcxuauy5cbz9PeBv5/kg+nYc8cY5qCyLqNhzMVRUa/PcepMBcfThk17LtPGzCYS7IR2cGdUDP6Pe0QD34Hu6+mpwKwYx73v5uHcmy9CeZ8fK83/F84Lr6jxsiwoU2e+hUfzVRq8gnkjk6kuL86eSM2POSGgBYYgCb+Ma6lOkF1MA+rLAh0gAsUhBgVlz6HtaMoDvLOi/NrQeoQyNE1Pv4vPAndmGGc8A7JCtmCMk9VvMy0Ht4IOvtDJFfx1lg7NuMIKqePYTEk56p8wTUNM+BmdJEhFPU=#012[192.168.122.105]*,[np0005532583.ctlplane.localdomain]*,[172.17.0.105]*,[np0005532583.internalapi.localdomain]*,[172.18.0.105]*,[np0005532583.storage.localdomain]*,[172.20.0.105]*,[np0005532583.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005532583.tenant.localdomain]*,[np0005532583.localdomain]*,[np0005532583]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCkB1Cq8AQaEBYTlv5Hzs024jg//D6wieNnvsI5WcYj7wckm9vKTJQfUD6yZBMmyPw6+vVzsM16bj2hagkDR5wkO7uSIaMqWrcoQ1h9HkJQLK8QB0iuzUvQzdr22kUgkLII8thNHK4VxF4VhAKNmzqCofZ4ZSaLUMwauFCFUjx1VJISEZdgYRZ4+++wAN5bdK+WrwSOAHJYJWQX2pRRsPiunSdY1BOUKB3sp7IBcQ3MDJgnKlkR7tiGSYB2W8JsLvIsIb0I2EaqmPUTIzKUuxSJnWEls/WyDT9MNkjhobVeAyFZ5TEik4OvobUhVGJ8CsU7O101KQNQ3IywPM+V0UpjA1yK49z5Qs0LjApmqORsTcjOojYaKGr9n64dVjXdFOMwajB9UmMEFtlIngm6kx7mJQGXqYxVAscW34JY832iKOEzQWrUSdo6mVJ7TXhYYcbdFp+G/128SfhNrbHwKinHeE9Nqu48BR7bmRZXO7ef+UMY1dG3AIvFt4JwFvLihZc=#012 create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:48:54 localhost python3[38524]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.wmck7r26' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:48:54 localhost python3[38542]: ansible-file Invoked with path=/tmp/ansible.wmck7r26 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:48:55 localhost python3[38558]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 02:48:55 localhost 
python3[38574]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:48:55 localhost python3[38592]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:48:56 localhost python3[38611]: ansible-community.general.cloud_init_data_facts Invoked with filter=status Nov 23 02:48:59 localhost python3[38748]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:48:59 localhost python3[38765]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 23 02:49:02 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload. Nov 23 02:49:02 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload. Nov 23 02:49:02 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 23 02:49:02 localhost systemd[1]: Starting man-db-cache-update.service... 
Nov 23 02:49:02 localhost systemd[1]: Reloading. Nov 23 02:49:02 localhost systemd-rc-local-generator[38836]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:49:02 localhost systemd-sysv-generator[38839]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:49:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:49:02 localhost systemd[1]: Queuing reload/restart jobs for marked units… Nov 23 02:49:02 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Nov 23 02:49:03 localhost systemd[1]: tuned.service: Deactivated successfully. Nov 23 02:49:03 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Nov 23 02:49:03 localhost systemd[1]: tuned.service: Consumed 1.621s CPU time. Nov 23 02:49:03 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Nov 23 02:49:03 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 23 02:49:03 localhost systemd[1]: Finished man-db-cache-update.service. Nov 23 02:49:03 localhost systemd[1]: run-r66937b8c3a3b4851987bc97de40f15d9.service: Deactivated successfully. Nov 23 02:49:04 localhost systemd[1]: Started Dynamic System Tuning Daemon. Nov 23 02:49:04 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 23 02:49:04 localhost systemd[1]: Starting man-db-cache-update.service... Nov 23 02:49:04 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 23 02:49:04 localhost systemd[1]: Finished man-db-cache-update.service. Nov 23 02:49:04 localhost systemd[1]: run-r242fc32500e148adbd6fc9f0ea9df66d.service: Deactivated successfully. 
Nov 23 02:49:05 localhost python3[39202]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 02:49:05 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Nov 23 02:49:05 localhost systemd[1]: tuned.service: Deactivated successfully. Nov 23 02:49:05 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Nov 23 02:49:05 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Nov 23 02:49:07 localhost systemd[1]: Started Dynamic System Tuning Daemon. Nov 23 02:49:07 localhost python3[39397]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:49:08 localhost python3[39414]: ansible-slurp Invoked with src=/etc/tuned/active_profile Nov 23 02:49:08 localhost python3[39430]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 02:49:09 localhost python3[39446]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:49:11 localhost python3[39466]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:49:11 localhost python3[39483]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 02:49:14 localhost python3[39499]: ansible-replace 
Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:18 localhost python3[39515]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:19 localhost python3[39563]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:49:19 localhost python3[39608]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884159.114268-71272-49181084642236/source _original_basename=tmp7opzimul follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:20 localhost python3[39638]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:21 localhost python3[39686]: ansible-ansible.legacy.stat 
Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:49:21 localhost python3[39729]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884160.7282596-71412-171025363337915/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=72c5ef7909b5cdbbb2310fa1b5c8d166a17f7155 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:21 localhost python3[39791]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:49:22 localhost python3[39834]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884161.588198-71471-110575407979223/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=6552073e0e4bb04b7faeda3f8c2098edf889171a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:22 localhost python3[39896]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:49:23 localhost python3[39939]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884162.4989095-71471-97080044522266/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False 
_original_basename=vip_data.j2 checksum=1bc51567bc68ec6d87ea2fcfee756b886ebb9f92 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:23 localhost python3[40001]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:49:24 localhost python3[40044]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884163.3904185-71471-175239504976019/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=175c760950d63a47f443f25b58088dba962f090b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:24 localhost python3[40106]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:49:25 localhost python3[40149]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884164.3886087-71471-256168401867334/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:25 localhost python3[40211]: ansible-ansible.legacy.stat Invoked with 
path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:49:25 localhost python3[40254]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884165.2767217-71471-183434796375795/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=56e83bd8f0316b152a1db4641581399e13c698c7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:26 localhost python3[40316]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:49:26 localhost python3[40359]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884166.159478-71471-228978687949099/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:27 localhost python3[40421]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:49:27 localhost python3[40464]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884167.0312788-71471-165341677504853/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False 
_original_basename=service_configs.j2 checksum=66f0a2c6a0832caadadc4d66bd975147c152464b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:28 localhost python3[40526]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:49:28 localhost python3[40569]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884167.902713-71471-161681992095249/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:29 localhost python3[40631]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:49:29 localhost python3[40674]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884168.7701542-71471-54531224539471/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:29 localhost python3[40736]: ansible-ansible.legacy.stat Invoked with 
path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:49:30 localhost python3[40779]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884169.5790863-71471-203678503565073/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=3376f53556298731e6a35da8f5186b37e7a2bd16 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:31 localhost python3[40809]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 02:49:31 localhost python3[40857]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:49:32 localhost python3[40930]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884171.5423822-72256-205073522694703/source _original_basename=tmpyycxxdqj follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:49:36 localhost systemd[36148]: Starting Mark boot as successful... Nov 23 02:49:36 localhost systemd[36148]: Finished Mark boot as successful. 
Nov 23 02:49:37 localhost python3[41008]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 02:49:37 localhost python3[41070]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:49:41 localhost python3[41087]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:49:47 localhost python3[41104]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:49:47 localhost python3[41127]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:49:52 localhost python3[41144]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:49:52 localhost python3[41167]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None 
executable=None creates=None removes=None stdin=None Nov 23 02:49:58 localhost python3[41184]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:50:02 localhost sshd[41186]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:50:02 localhost python3[41203]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:50:03 localhost python3[41226]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:50:07 localhost python3[41243]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:50:12 localhost python3[41260]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:50:12 localhost python3[41283]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None 
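The `_raw_params` one-liners above probe each network before the large-payload (`-s 1472`) pings: look up the route to the target address, `sed` out the `dev` field, then read `/sys/class/net/<if>/mtu`. A sketch of the same parse over a canned `ip ro get` line (the sample route line and device name are hypothetical, standing in for live `ip` output):

```shell
# Extract the egress device from an "ip ro get <addr>" line, then read its
# MTU, falling back to 0 when the sysfs entry does not exist -- exactly the
# fallback the log's probe uses. route_line is a canned sample, not live output.
route_line="192.168.122.106 dev eth0 src 192.168.122.53 uid 1000"
INT=$(echo "$route_line" | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
echo "$INT $MTU"
```

The follow-up `ping -s 1472` then checks that a 1500-byte frame (1472 bytes of payload plus 28 bytes of ICMP/IP headers) survives the path without fragmentation.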
Nov 23 02:50:17 localhost python3[41300]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:50:21 localhost python3[41317]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:50:22 localhost python3[41340]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:50:26 localhost python3[41357]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:50:31 localhost python3[41374]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:32 localhost python3[41422]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:32 localhost python3[41440]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmp_zu7kv7y 
recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:32 localhost python3[41470]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:33 localhost python3[41532]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:33 localhost python3[41566]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:34 localhost python3[41660]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:34 localhost python3[41678]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:35 localhost python3[41740]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:35 localhost python3[41758]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:35 localhost python3[41820]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:36 localhost python3[41838]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:36 localhost python3[41915]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:36 localhost 
python3[41933]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:37 localhost python3[41995]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:37 localhost python3[42013]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:38 localhost python3[42075]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:38 localhost python3[42093]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:38 localhost python3[42155]: 
ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:39 localhost python3[42173]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:39 localhost python3[42235]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:39 localhost python3[42253]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:40 localhost python3[42315]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:40 localhost python3[42333]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:40 localhost python3[42395]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:41 localhost python3[42413]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:41 localhost python3[42443]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 02:50:42 localhost python3[42491]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:42 localhost python3[42509]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmp2iguvxzb recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 
02:50:45 localhost python3[42539]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 23 02:50:50 localhost python3[42556]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 02:50:52 localhost python3[42574]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 02:50:52 localhost python3[42592]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 02:50:53 localhost systemd[1]: Reloading. Nov 23 02:50:53 localhost systemd-rc-local-generator[42620]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:50:53 localhost systemd-sysv-generator[42625]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:50:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:50:54 localhost systemd[1]: Starting Netfilter Tables... 
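Above, the playbook removes `firewalld`, stops and disables the legacy `iptables`/`ip6tables` units, and enables `nftables` in their place. A read-only way to confirm the resulting unit states afterwards (a hypothetical helper, not part of the playbook; the actual transition needs root plus the dnf/systemctl invocations recorded in the log):

```shell
# Print the enablement state of each firewall-related unit; units that are
# missing (or a host without systemctl at all) report "absent".
report() {
    state=$(systemctl is-enabled "$1" 2>/dev/null)
    printf '%s: %s\n' "$1" "${state:-absent}"
}
out=$(for unit in iptables.service ip6tables.service nftables.service; do
    report "$unit"
done)
echo "$out"
```

After the cutover shown in the log, the expected picture is the two iptables units disabled (or gone) and `nftables.service` enabled.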
Nov 23 02:50:54 localhost systemd[1]: Finished Netfilter Tables. Nov 23 02:50:54 localhost python3[42682]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:55 localhost python3[42725]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884254.5923202-75169-249923330290417/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:55 localhost python3[42755]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:50:56 localhost python3[42773]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:50:56 localhost python3[42822]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:57 localhost python3[42865]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884256.2745795-75274-236542045680377/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None 
directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:57 localhost python3[42927]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:58 localhost python3[42970]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884257.2028265-75398-127407320322808/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:58 localhost python3[43032]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:58 localhost python3[43075]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884258.217897-75452-28217145799119/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:50:59 localhost python3[43137]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:50:59 localhost python3[43180]: 
ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884259.1261356-75514-78615101295627/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:00 localhost python3[43242]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:51:01 localhost python3[43285]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884260.0139656-75551-95252839979549/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:01 localhost python3[43315]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:51:02 localhost python3[43380]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/tripleo-chains.nft"#012include 
"/etc/nftables/tripleo-rules.nft"#012include "/etc/nftables/tripleo-jumps.nft"#012 state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:02 localhost python3[43397]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:51:03 localhost python3[43414]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:51:03 localhost python3[43433]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:03 localhost python3[43449]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:04 localhost python3[43465]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t 
state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:04 localhost python3[43481]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False Nov 23 02:51:05 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=7 res=1 Nov 23 02:51:05 localhost python3[43505]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Nov 23 02:51:06 localhost kernel: SELinux: Converting 2703 SID table entries... Nov 23 02:51:06 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 23 02:51:06 localhost kernel: SELinux: policy capability open_perms=1 Nov 23 02:51:06 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 23 02:51:06 localhost kernel: SELinux: policy capability always_check_network=0 Nov 23 02:51:06 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 23 02:51:06 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 23 02:51:06 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 23 02:51:06 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=8 res=1 Nov 23 02:51:07 localhost python3[43526]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Nov 23 02:51:07 localhost kernel: SELinux: Converting 2703 SID table entries... 
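The nftables steps recorded above assemble the ruleset from separate fragment files and dry-run the concatenation with `nft -c -f -` before loading it, so a bad template cannot leave the host with a half-applied firewall. A sketch of the same assemble-then-check ordering using stand-in fragment files (the real ones live under `/etc/nftables/`; the `nft` commands are shown as comments since they require root):

```shell
# Build stand-in fragments and concatenate them in the order the log uses:
# chains first (so later jumps can resolve), then flushes, rules, and jumps.
dir=$(mktemp -d)
order="tripleo-chains tripleo-flushes tripleo-rules tripleo-update-jumps tripleo-jumps"
for f in $order; do
    printf '# fragment: %s\n' "$f" > "$dir/$f.nft"
done
combined=$(for f in $order; do cat "$dir/$f.nft"; done)
echo "$combined"
# On the real host the combined stream is validated, then applied:
#   cat ...fragments... | nft -c -f -   # -c: check only, change nothing
#   cat ...fragments... | nft -f -      # load for real
```

Note the log applies `tripleo-chains.nft` on its own first, then pipes only the flush/rule/jump fragments through `nft -f -`: the chains must exist before rules can be flushed into or jumped from them.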
Nov 23 02:51:07 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 23 02:51:07 localhost kernel: SELinux: policy capability open_perms=1 Nov 23 02:51:07 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 23 02:51:07 localhost kernel: SELinux: policy capability always_check_network=0 Nov 23 02:51:07 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 23 02:51:07 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 23 02:51:07 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 23 02:51:08 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=9 res=1 Nov 23 02:51:08 localhost python3[43547]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Nov 23 02:51:09 localhost kernel: SELinux: Converting 2703 SID table entries... Nov 23 02:51:09 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 23 02:51:09 localhost kernel: SELinux: policy capability open_perms=1 Nov 23 02:51:09 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 23 02:51:09 localhost kernel: SELinux: policy capability always_check_network=0 Nov 23 02:51:09 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 23 02:51:09 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 23 02:51:09 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 23 02:51:09 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=10 res=1 Nov 23 02:51:09 localhost python3[43568]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:09 localhost python3[43584]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:10 localhost python3[43600]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:10 localhost python3[43616]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 02:51:11 localhost python3[43632]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:51:12 localhost python3[43649]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None 
disable_excludes=None download_dir=None list=None releasever=None Nov 23 02:51:15 localhost python3[43666]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:16 localhost python3[43714]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:51:17 localhost python3[43757]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884275.7861643-76415-129523325808288/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:17 localhost python3[43787]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 02:51:17 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Nov 23 02:51:17 localhost systemd[1]: Stopped Load Kernel Modules. Nov 23 02:51:17 localhost systemd[1]: Stopping Load Kernel Modules... Nov 23 02:51:17 localhost systemd[1]: Starting Load Kernel Modules... Nov 23 02:51:17 localhost kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
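The kernel warning above is the reason the playbook ships `/etc/modules-load.d/99-tripleo.conf` and restarts `systemd-modules-load.service`: bridged traffic no longer passes through arp/ip/ip6tables unless `br_netfilter` is loaded, and the `net.bridge.bridge-nf-call-*` sysctls set shortly afterwards depend on it. The file format is simply one module name per line, sketched here against a temp file rather than the real `/etc/modules-load.d/` path:

```shell
# A modules-load.d style file: one kernel module name per line; blank lines
# and lines starting with '#' are ignored by systemd-modules-load.
conf=$(mktemp)
{
    echo '# modules needed for the bridge firewalling sysctls'
    echo 'br_netfilter'
} > "$conf"
modules=$(grep -v '^#' "$conf" | grep -v '^$')
echo "$modules"
# systemd-modules-load reads these files at boot; restarting the service,
# as the playbook does, loads any newly listed modules immediately.
```

The "Inserted module 'br_netfilter'" line that follows confirms the restart picked the new entry up without waiting for a reboot.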
Nov 23 02:51:17 localhost kernel: Bridge firewalling registered Nov 23 02:51:17 localhost systemd-modules-load[43790]: Inserted module 'br_netfilter' Nov 23 02:51:17 localhost systemd-modules-load[43790]: Module 'msr' is built in Nov 23 02:51:17 localhost systemd[1]: Finished Load Kernel Modules. Nov 23 02:51:18 localhost python3[43841]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:51:18 localhost python3[43884]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884277.8322845-76454-243313197577992/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:18 localhost python3[43914]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:19 localhost python3[43931]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:19 localhost python3[43949]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:19 localhost python3[43967]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:20 localhost python3[43984]: ansible-sysctl 
Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:21 localhost python3[44001]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:21 localhost python3[44018]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:22 localhost python3[44036]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:22 localhost python3[44054]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:22 localhost python3[44072]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:22 localhost python3[44090]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:23 localhost python3[44108]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:23 localhost python3[44126]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:23 localhost python3[44144]: ansible-sysctl Invoked 
with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:24 localhost python3[44161]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:24 localhost python3[44178]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:24 localhost python3[44195]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:25 localhost python3[44212]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False Nov 23 02:51:25 localhost python3[44230]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 02:51:25 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully. Nov 23 02:51:25 localhost systemd[1]: Stopped Apply Kernel Variables. Nov 23 02:51:25 localhost systemd[1]: Stopping Apply Kernel Variables... Nov 23 02:51:25 localhost systemd[1]: Starting Apply Kernel Variables... Nov 23 02:51:25 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Nov 23 02:51:25 localhost systemd[1]: Finished Apply Kernel Variables. 
Nov 23 02:51:26 localhost python3[44250]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:26 localhost python3[44266]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:26 localhost python3[44282]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:27 localhost python3[44298]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 02:51:27 localhost python3[44314]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:27 localhost 
python3[44330]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:28 localhost python3[44346]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:28 localhost python3[44362]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:28 localhost python3[44378]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:29 localhost python3[44426]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:51:29 localhost python3[44469]: 
ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884288.820977-76946-155105315225353/source _original_basename=tmpvxoavkrd follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:29 localhost python3[44499]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:51:31 localhost python3[44516]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:31 localhost python3[44564]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:51:31 localhost python3[44607]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884291.2656968-77142-104627080339130/source _original_basename=tmpqckmq4c3 follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:32 localhost 
python3[44637]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:32 localhost python3[44653]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:33 localhost python3[44669]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:33 localhost python3[44685]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:33 localhost python3[44701]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:33 localhost python3[44717]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:34 localhost python3[44733]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:34 localhost python3[44749]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:34 localhost python3[44765]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:35 localhost python3[44781]: ansible-group Invoked with gid=107 name=qemu state=present 
system=False local=False non_unique=False Nov 23 02:51:35 localhost python3[44803]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532585.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None Nov 23 02:51:36 localhost python3[44827]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None Nov 23 02:51:36 localhost python3[44843]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:51:36 localhost python3[44929]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:51:37 localhost python3[45015]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884296.56821-77440-253364882993894/source _original_basename=tmpf9uzpquc follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False 
force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:37 localhost python3[45062]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False Nov 23 02:51:38 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=11 res=1 Nov 23 02:51:38 localhost python3[45112]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:38 localhost python3[45128]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:39 localhost python3[45144]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False Nov 23 02:51:40 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=12 res=1 Nov 23 02:51:40 localhost python3[45165]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False 
skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 23 02:51:41 localhost sshd[45167]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:51:44 localhost python3[45184]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 02:51:44 localhost python3[45245]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:45 localhost python3[45261]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:51:45 localhost python3[45322]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:51:46 localhost python3[45365]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884305.2939782-77877-149074396637874/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=abd00f17d9c4e4815bd9c520c8599e87c7741b66 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:46 
localhost python3[45427]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:51:46 localhost python3[45472]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884306.2207127-77927-223311503495547/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:47 localhost python3[45502]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:47 localhost python3[45518]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:47 localhost python3[45534]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:48 localhost python3[45550]: ansible-ini_file Invoked with 
path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:48 localhost python3[45598]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:51:49 localhost python3[45641]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884308.6139896-78040-161620703582448/source _original_basename=tmpvditmsu0 follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:49 localhost python3[45671]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:50 localhost python3[45687]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 02:51:50 
localhost python3[45703]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 23 02:51:54 localhost python3[45752]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:51:54 localhost python3[45797]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884313.8971488-78327-127124705590039/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:51:55 localhost python3[45828]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 02:51:55 localhost systemd[1]: Stopping OpenSSH server daemon... Nov 23 02:51:55 localhost systemd[1]: sshd.service: Deactivated successfully. Nov 23 02:51:55 localhost systemd[1]: Stopped OpenSSH server daemon. Nov 23 02:51:55 localhost systemd[1]: sshd.service: Consumed 4.842s CPU time, read 2.5M from disk, written 216.0K to disk. Nov 23 02:51:55 localhost systemd[1]: Stopped target sshd-keygen.target. 
Nov 23 02:51:55 localhost systemd[1]: Stopping sshd-keygen.target... Nov 23 02:51:55 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 23 02:51:55 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 23 02:51:55 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 23 02:51:55 localhost systemd[1]: Reached target sshd-keygen.target. Nov 23 02:51:55 localhost systemd[1]: Starting OpenSSH server daemon... Nov 23 02:51:55 localhost sshd[45832]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:51:55 localhost systemd[1]: Started OpenSSH server daemon. 
Nov 23 02:51:55 localhost python3[45848]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:51:56 localhost python3[45866]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 02:51:57 localhost python3[45884]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 23 02:52:00 localhost python3[45933]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:52:00 localhost python3[45951]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:52:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 02:52:00 
localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 3253 writes, 16K keys, 3253 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s#012Cumulative WAL: 3253 writes, 142 syncs, 22.91 writes per sync, written: 0.01 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3253 writes, 16K keys, 3253 commit groups, 1.0 writes per commit group, ingest: 14.65 MB, 0.02 MB/s#012Interval WAL: 3253 writes, 142 syncs, 22.91 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 
interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt
Nov 23 02:52:01 localhost python3[45981]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 02:52:02 localhost python3[46031]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:52:02 localhost python3[46049]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:52:02 localhost python3[46079]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 02:52:02 localhost systemd[1]: Reloading.
Nov 23 02:52:03 localhost systemd-rc-local-generator[46105]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:52:03 localhost systemd-sysv-generator[46109]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:52:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:52:03 localhost systemd[1]: Starting chronyd online sources service...
Nov 23 02:52:03 localhost chronyc[46119]: 200 OK
Nov 23 02:52:03 localhost systemd[1]: chrony-online.service: Deactivated successfully.
Nov 23 02:52:03 localhost systemd[1]: Finished chronyd online sources service.
Nov 23 02:52:03 localhost python3[46135]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:52:03 localhost chronyd[25967]: System clock was stepped by 0.000009 seconds
Nov 23 02:52:04 localhost python3[46152]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:52:04 localhost python3[46169]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:52:04 localhost chronyd[25967]: System clock was stepped by -0.000000 seconds
Nov 23 02:52:04 localhost python3[46186]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:52:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 02:52:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 3387 writes, 16K keys, 3387 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s#012Cumulative WAL: 3387 writes, 198 syncs, 17.11 writes per sync, written: 0.01 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3387 writes, 16K keys, 3387 commit groups, 1.0 writes per commit group, ingest: 15.26 MB, 0.03 MB/s#012Interval WAL: 3387 writes, 198 syncs, 17.11 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction
Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.010 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.010 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.010 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 
memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s 
write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 
0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable
Nov 23 02:52:05 localhost python3[46203]: ansible-timezone Invoked with name=UTC hwclock=None
Nov 23 02:52:05 localhost systemd[1]: Starting Time & Date Service...
Nov 23 02:52:05 localhost systemd[1]: Started Time & Date Service.
Nov 23 02:52:06 localhost python3[46223]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:52:07 localhost python3[46240]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:52:07 localhost python3[46257]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Nov 23 02:52:08 localhost python3[46273]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 02:52:08 localhost python3[46289]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None
_diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 02:52:09 localhost python3[46305]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 02:52:09 localhost python3[46353]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:52:09 localhost python3[46396]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884329.3164322-79237-87657879580995/source _original_basename=tmpkz1ja64k follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:52:10 localhost python3[46458]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:52:10 localhost python3[46501]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884330.2286217-79444-156898966445303/source _original_basename=tmp0_s1gr6x follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:52:11 localhost python3[46531]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 23 02:52:11 localhost systemd[1]: Reloading.
Nov 23 02:52:11 localhost systemd-rc-local-generator[46557]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:52:11 localhost systemd-sysv-generator[46561]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:52:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:52:12 localhost python3[46585]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 02:52:12 localhost python3[46601]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:52:12 localhost python3[46618]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:52:12 localhost systemd[1]: run-netns-ns_temp.mount: Deactivated successfully.
Nov 23 02:52:12 localhost systemd[36148]: Created slice User Background Tasks Slice.
Nov 23 02:52:12 localhost systemd[36148]: Starting Cleanup of User's Temporary Files and Directories...
Nov 23 02:52:12 localhost systemd[36148]: Finished Cleanup of User's Temporary Files and Directories.
Nov 23 02:52:13 localhost python3[46636]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 02:52:13 localhost python3[46652]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:52:14 localhost python3[46700]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:52:14 localhost python3[46743]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884333.8525093-79628-143651833038167/source _original_basename=tmpz0knaay8 follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:52:35 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 23 02:52:37 localhost python3[46775]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 02:52:38 localhost python3[46791]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None
Nov 23 02:52:38 localhost python3[46807]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 02:52:38 localhost python3[46853]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:52:39 localhost python3[46900]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:52:39 localhost python3[46916]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 23 02:52:40 localhost kernel: SELinux: Converting 2706 SID table entries...
Nov 23 02:52:40 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 23 02:52:40 localhost kernel: SELinux: policy capability open_perms=1
Nov 23 02:52:40 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 23 02:52:40 localhost kernel: SELinux: policy capability always_check_network=0
Nov 23 02:52:40 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 23 02:52:40 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 23 02:52:40 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 23 02:52:40 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=13 res=1
Nov 23 02:52:40 localhost python3[46937]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 02:52:42 localhost python3[47089]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image':
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 
'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': [
Nov 23 02:52:42 localhost rsyslogd[760]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config
Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ]
Nov 23 02:52:43 localhost python3[47105]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 02:52:43 localhost python3[47121]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 02:52:44 localhost python3[47137]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source':
'/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': 
True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}}
Nov 23 02:52:49 localhost python3[47185]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 02:52:49 localhost python3[47228]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884369.037533-81103-108734803260782/source _original_basename=tmp9zzq8u8g follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 02:52:50 localhost python3[47258]:
ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 02:52:52 localhost python3[47381]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 02:52:54 localhost python3[47502]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 23 02:52:56 localhost python3[47518]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:52:57 localhost python3[47535]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 02:53:01 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 23 02:53:01 localhost dbus-broker-launch[18442]: Noticed file-system modification, trigger reload.
Nov 23 02:53:01 localhost dbus-broker-launch[18442]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 23 02:53:01 localhost dbus-broker-launch[18442]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 23 02:53:01 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 23 02:53:01 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 23 02:53:01 localhost systemd[1]: Reexecuting.
Nov 23 02:53:01 localhost systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 23 02:53:01 localhost systemd[1]: Detected virtualization kvm.
Nov 23 02:53:01 localhost systemd[1]: Detected architecture x86-64.
Nov 23 02:53:01 localhost systemd-sysv-generator[47593]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:53:01 localhost systemd-rc-local-generator[47589]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:53:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:53:09 localhost kernel: SELinux: Converting 2706 SID table entries...
Nov 23 02:53:09 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 23 02:53:09 localhost kernel: SELinux: policy capability open_perms=1
Nov 23 02:53:09 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 23 02:53:09 localhost kernel: SELinux: policy capability always_check_network=0
Nov 23 02:53:09 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 23 02:53:09 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 23 02:53:09 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 23 02:53:09 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 23 02:53:09 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=14 res=1
Nov 23 02:53:09 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Nov 23 02:53:11 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 02:53:11 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 23 02:53:11 localhost systemd[1]: Reloading.
Nov 23 02:53:11 localhost systemd-sysv-generator[47692]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:53:11 localhost systemd-rc-local-generator[47689]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:53:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:53:11 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 02:53:11 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 02:53:11 localhost systemd-journald[618]: Journal stopped
Nov 23 02:53:11 localhost systemd[1]: Stopping Journal Service...
Nov 23 02:53:11 localhost systemd-journald[618]: Received SIGTERM from PID 1 (systemd).
Nov 23 02:53:11 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 23 02:53:11 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 23 02:53:11 localhost systemd[1]: Stopped Journal Service.
Nov 23 02:53:11 localhost systemd[1]: systemd-journald.service: Consumed 1.962s CPU time.
Nov 23 02:53:11 localhost systemd[1]: Starting Journal Service...
Nov 23 02:53:11 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 23 02:53:11 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 23 02:53:11 localhost systemd[1]: systemd-udevd.service: Consumed 2.852s CPU time.
Nov 23 02:53:11 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 23 02:53:11 localhost systemd-journald[48157]: Journal started
Nov 23 02:53:11 localhost systemd-journald[48157]: Runtime Journal (/run/log/journal/6e0090cd4cf296f54418e234b90f721c) is 12.3M, max 314.7M, 302.4M free.
Nov 23 02:53:11 localhost systemd[1]: Started Journal Service.
Nov 23 02:53:11 localhost systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Nov 23 02:53:11 localhost systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 02:53:11 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 02:53:11 localhost systemd-udevd[48163]: Using default interface naming scheme 'rhel-9.0'.
Nov 23 02:53:11 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 23 02:53:11 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 02:53:11 localhost systemd[1]: Reloading.
Nov 23 02:53:11 localhost systemd-rc-local-generator[48749]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 02:53:11 localhost systemd-sysv-generator[48753]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 02:53:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 02:53:12 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 02:53:12 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 02:53:12 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 23 02:53:12 localhost systemd[1]: man-db-cache-update.service: Consumed 1.109s CPU time.
Nov 23 02:53:12 localhost systemd[1]: run-r3fdf4ffa0e604537bfc819f4c898c12f.service: Deactivated successfully.
Nov 23 02:53:12 localhost systemd[1]: run-r655657c2d4a944c198df5810dd39d490.service: Deactivated successfully.
Nov 23 02:53:13 localhost python3[49024]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Nov 23 02:53:14 localhost python3[49043]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 02:53:15 localhost python3[49061]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 02:53:15 localhost python3[49061]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Nov 23 02:53:15 localhost python3[49061]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Nov 23 02:53:22 localhost podman[49075]: 2025-11-23 07:53:15.775573476 +0000 UTC m=+0.029358083 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 23 02:53:22 localhost python3[49061]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect bac901955dcf7a32a493c6ef724c092009bbc18467858aa8c55e916b8c2b2b8f --format json
Nov 23 02:53:23 localhost python3[49174]: ansible-containers.podman.podman_image Invoked with force=True
name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 02:53:23 localhost python3[49174]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Nov 23 02:53:23 localhost python3[49174]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Nov 23 02:53:29 localhost podman[49186]: 2025-11-23 07:53:23.128911969 +0000 UTC m=+0.046555098 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 23 02:53:29 localhost python3[49174]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 44feaf8d87c1d40487578230316b622680576d805efdb45dfeea6aad464b41f1 --format json
Nov 23 02:53:30 localhost python3[49288]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 02:53:30 localhost python3[49288]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG:
/bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Nov 23 02:53:30 localhost python3[49288]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Nov 23 02:53:45 localhost podman[49300]: 2025-11-23 07:53:30.293209837 +0000 UTC m=+0.047090522 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 23 02:53:45 localhost python3[49288]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 3a088c12511c977065fdc5f1594cba7b1a79f163578a6ffd0ac4a475b8e67938 --format json
Nov 23 02:53:46 localhost systemd[1]: tmp-crun.Prr2vM.mount: Deactivated successfully.
Nov 23 02:53:46 localhost python3[49667]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 02:53:46 localhost python3[49667]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Nov 23 02:53:46 localhost podman[49660]: 2025-11-23 07:53:46.301955832 +0000 UTC m=+0.193746378 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements,
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, release=553, name=rhceph, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.buildah.version=1.33.12, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 02:53:46 localhost python3[49667]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Nov 23 02:53:46 localhost podman[49660]: 2025-11-23 07:53:46.404162044 +0000 UTC m=+0.295952600 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, Inc., description=Red
Hat Ceph Storage 7, release=553, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container)
Nov 23 02:53:59 localhost podman[49692]: 2025-11-23 07:53:46.368989487 +0000 UTC m=+0.023558761 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 23 02:53:59 localhost python3[49667]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 514d439186251360cf734cbc6d4a44c834664891872edf3798a653dfaacf10c0 --format json
Nov 23 02:53:59 localhost python3[49933]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 02:53:59 localhost python3[49933]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Nov 23 02:53:59 localhost python3[49933]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Nov 23 02:54:07 localhost podman[49945]: 2025-11-23 07:53:59.476287292 +0000 UTC m=+0.033104150 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 23 02:54:07 localhost python3[49933]: ansible-containers.podman.podman_image
PODMAN-IMAGE-DEBUG: /bin/podman inspect a9dd7a2ac6f35cb086249f87f74e2f8e74e7e2ad5141ce2228263be6faedce26 --format json
Nov 23 02:54:07 localhost python3[50105]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 23 02:54:07 localhost python3[50105]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Nov 23 02:54:07 localhost python3[50105]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Nov 23 02:54:12 localhost podman[50118]: 2025-11-23 07:54:07.879180255 +0000 UTC m=+0.050725529 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 23 02:54:12 localhost python3[50105]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 24976907b2c2553304119aba5731a800204d664feed24ca9eb7f2b4c7d81016b --format json
Nov 23 02:54:12 localhost python3[50195]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None
password=NOT_LOGGING_PARAMETER ca_cert_dir=None Nov 23 02:54:12 localhost python3[50195]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json Nov 23 02:54:12 localhost python3[50195]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false Nov 23 02:54:14 localhost podman[50209]: 2025-11-23 07:54:12.536076503 +0000 UTC m=+0.041477251 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Nov 23 02:54:14 localhost python3[50195]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 57163a7b21fdbb804a27897cb6e6052a5e5c7a339c45d663e80b52375a760dcf --format json Nov 23 02:54:15 localhost python3[50288]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Nov 23 02:54:15 localhost python3[50288]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json Nov 23 02:54:15 localhost python3[50288]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false Nov 23 02:54:17 localhost podman[50300]: 2025-11-23 07:54:15.275129277 +0000 UTC m=+0.044555270 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Nov 23 02:54:17 localhost python3[50288]: ansible-containers.podman.podman_image 
PODMAN-IMAGE-DEBUG: /bin/podman inspect 076d82a27d63c8328729ed27ceb4291585ae18d017befe6fe353df7aa11715ae --format json Nov 23 02:54:18 localhost python3[50376]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Nov 23 02:54:18 localhost python3[50376]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json Nov 23 02:54:18 localhost python3[50376]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false Nov 23 02:54:20 localhost podman[50388]: 2025-11-23 07:54:18.363131752 +0000 UTC m=+0.041716238 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Nov 23 02:54:20 localhost python3[50376]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d0dbcb95546840a8d088df044347a7877ad5ea45a2ddba0578e9bb5de4ab0da5 --format json Nov 23 02:54:21 localhost python3[50467]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} 
path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Nov 23 02:54:21 localhost python3[50467]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json Nov 23 02:54:21 localhost python3[50467]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false Nov 23 02:54:25 localhost podman[50480]: 2025-11-23 07:54:21.411231988 +0000 UTC m=+0.044006212 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 23 02:54:25 localhost python3[50467]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect e6e981540e553415b2d6eda490d7683db07164af2e7a0af8245623900338a4d6 --format json Nov 23 02:54:25 localhost python3[50570]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Nov 23 02:54:25 localhost python3[50570]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json Nov 23 02:54:25 localhost python3[50570]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false Nov 23 02:54:27 localhost podman[50583]: 2025-11-23 07:54:25.473862262 +0000 UTC m=+0.041855693 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Nov 23 02:54:27 localhost 
python3[50570]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 87ee88cbf01fb42e0b22747072843bcca6130a90eda4de6e74b3ccd847bb4040 --format json Nov 23 02:54:28 localhost python3[50661]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 02:54:30 localhost ansible-async_wrapper.py[50833]: Invoked with 702276533013 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884469.4670262-83773-211607189789818/AnsiballZ_command.py _ Nov 23 02:54:30 localhost ansible-async_wrapper.py[50836]: Starting module and watcher Nov 23 02:54:30 localhost ansible-async_wrapper.py[50836]: Start watching 50837 (3600) Nov 23 02:54:30 localhost ansible-async_wrapper.py[50837]: Start module (50837) Nov 23 02:54:30 localhost ansible-async_wrapper.py[50833]: Return async_wrapper task started. Nov 23 02:54:30 localhost python3[50857]: ansible-ansible.legacy.async_status Invoked with jid=702276533013.50833 mode=status _async_dir=/tmp/.ansible_async Nov 23 02:54:33 localhost puppet-user[50841]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 23 02:54:33 localhost puppet-user[50841]: (file: /etc/puppet/hiera.yaml) Nov 23 02:54:33 localhost puppet-user[50841]: Warning: Undefined variable '::deploy_config_name'; Nov 23 02:54:33 localhost puppet-user[50841]: (file & line not available) Nov 23 02:54:33 localhost puppet-user[50841]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 23 02:54:33 localhost puppet-user[50841]: (file & line not available) Nov 23 02:54:33 localhost puppet-user[50841]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Nov 23 02:54:34 localhost puppet-user[50841]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Nov 23 02:54:34 localhost puppet-user[50841]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.17 seconds Nov 23 02:54:34 localhost puppet-user[50841]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully Nov 23 02:54:34 localhost puppet-user[50841]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created Nov 23 02:54:34 localhost puppet-user[50841]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully Nov 23 02:54:34 localhost puppet-user[50841]: Notice: Applied catalog in 0.05 seconds Nov 23 02:54:34 localhost puppet-user[50841]: Application: Nov 23 02:54:34 localhost puppet-user[50841]: Initial environment: production Nov 23 02:54:34 localhost puppet-user[50841]: Converged environment: production Nov 23 02:54:34 localhost puppet-user[50841]: Run mode: user Nov 23 02:54:34 localhost puppet-user[50841]: Changes: Nov 23 02:54:34 localhost puppet-user[50841]: Total: 3 Nov 23 02:54:34 localhost puppet-user[50841]: Events: Nov 23 02:54:34 localhost puppet-user[50841]: Success: 3 Nov 23 02:54:34 localhost puppet-user[50841]: Total: 3 Nov 23 02:54:34 localhost puppet-user[50841]: Resources: Nov 23 02:54:34 localhost puppet-user[50841]: Changed: 3 Nov 23 02:54:34 localhost puppet-user[50841]: Out of sync: 3 Nov 23 02:54:34 localhost puppet-user[50841]: Total: 10 Nov 23 02:54:34 localhost puppet-user[50841]: Time: Nov 23 02:54:34 localhost puppet-user[50841]: Schedule: 0.00 Nov 23 02:54:34 localhost puppet-user[50841]: File: 0.00 Nov 23 
02:54:34 localhost puppet-user[50841]: Augeas: 0.02 Nov 23 02:54:34 localhost puppet-user[50841]: Exec: 0.02 Nov 23 02:54:34 localhost puppet-user[50841]: Transaction evaluation: 0.05 Nov 23 02:54:34 localhost puppet-user[50841]: Catalog application: 0.05 Nov 23 02:54:34 localhost puppet-user[50841]: Config retrieval: 0.20 Nov 23 02:54:34 localhost puppet-user[50841]: Last run: 1763884474 Nov 23 02:54:34 localhost puppet-user[50841]: Filebucket: 0.00 Nov 23 02:54:34 localhost puppet-user[50841]: Total: 0.05 Nov 23 02:54:34 localhost puppet-user[50841]: Version: Nov 23 02:54:34 localhost puppet-user[50841]: Config: 1763884473 Nov 23 02:54:34 localhost puppet-user[50841]: Puppet: 7.10.0 Nov 23 02:54:34 localhost ansible-async_wrapper.py[50837]: Module complete (50837) Nov 23 02:54:35 localhost ansible-async_wrapper.py[50836]: Done in kid B. Nov 23 02:54:40 localhost python3[50985]: ansible-ansible.legacy.async_status Invoked with jid=702276533013.50833 mode=status _async_dir=/tmp/.ansible_async Nov 23 02:54:41 localhost python3[51001]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 23 02:54:41 localhost python3[51017]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 02:54:42 localhost python3[51065]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:54:42 localhost python3[51108]: ansible-ansible.legacy.copy 
Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884481.9806762-84065-154904079588825/source _original_basename=tmpbxs3tk95 follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 23 02:54:43 localhost python3[51138]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:54:44 localhost python3[51242]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Nov 23 02:54:44 localhost python3[51261]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None 
selevel=None attributes=None Nov 23 02:54:45 localhost python3[51277]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005532585 step=1 update_config_hash_only=False Nov 23 02:54:45 localhost python3[51293]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:54:46 localhost python3[51309]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Nov 23 02:54:47 localhost python3[51325]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 23 02:54:48 localhost python3[51367]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False Nov 23 02:54:48 localhost podman[51559]: 2025-11-23 07:54:48.649483492 +0000 UTC m=+0.058567890 container create f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, build-date=2025-11-18T22:51:28Z, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-collectd-container, container_name=container-puppet-collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 23 02:54:48 localhost podman[51567]: 2025-11-23 07:54:48.671050564 +0000 UTC m=+0.071105092 container create 3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_puppet_step1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, distribution-scope=public, container_name=container-puppet-metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 
'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Nov 23 02:54:48 localhost systemd[1]: Started libpod-conmon-f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41.scope. Nov 23 02:54:48 localhost systemd[1]: Started libcrun container. 
Nov 23 02:54:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25162626903c4a9c9a99b1d254b734327e6597d283273669c57c478e44b2d310/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 23 02:54:48 localhost podman[51559]: 2025-11-23 07:54:48.702498863 +0000 UTC m=+0.111583261 container init f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, config_id=tripleo_puppet_step1, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-collectd, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd) Nov 23 02:54:48 localhost podman[51581]: 2025-11-23 07:54:48.708045331 +0000 UTC m=+0.095029970 container create 4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, config_id=tripleo_puppet_step1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=container-puppet-crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible) Nov 23 02:54:48 localhost systemd[1]: Started 
libpod-conmon-3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656.scope. Nov 23 02:54:48 localhost podman[51559]: 2025-11-23 07:54:48.618533889 +0000 UTC m=+0.027618317 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Nov 23 02:54:48 localhost podman[51561]: 2025-11-23 07:54:48.72363081 +0000 UTC m=+0.125971142 container create 53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, name=rhosp17/openstack-nova-libvirt, release=1761123044, tcib_managed=true, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include 
::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=container-puppet-nova_libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git) Nov 23 02:54:48 localhost systemd[1]: Started libcrun container. 
Nov 23 02:54:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/112d325a59c9fbd4bb36b26505654d46d5cee939e874b7beb1773b25ed0d0f87/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 23 02:54:48 localhost podman[51561]: 2025-11-23 07:54:48.632968972 +0000 UTC m=+0.035309304 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 02:54:48 localhost systemd[1]: Started libpod-conmon-4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014.scope. Nov 23 02:54:48 localhost systemd[1]: Started libcrun container. Nov 23 02:54:48 localhost podman[51567]: 2025-11-23 07:54:48.641826206 +0000 UTC m=+0.041880754 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Nov 23 02:54:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e50ae09132747217ca21393b750de400f878b516acd5755409810361a82c81b/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 23 02:54:48 localhost podman[51582]: 2025-11-23 07:54:48.645555776 +0000 UTC m=+0.034992664 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Nov 23 02:54:48 localhost podman[51581]: 2025-11-23 07:54:48.650074241 +0000 UTC m=+0.037058910 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Nov 23 02:54:49 localhost systemd[1]: Started libpod-conmon-53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8.scope. Nov 23 02:54:49 localhost systemd[1]: Started libcrun container. 
Nov 23 02:54:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/350340aca58ea5290619b74ac7e03f9daa05fcdf9e02a8c26ea7c6d33f13c398/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 23 02:54:49 localhost podman[51582]: 2025-11-23 07:54:49.86885633 +0000 UTC m=+1.258293248 container create 586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_puppet_step1, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, container_name=container-puppet-iscsid) Nov 23 02:54:49 localhost podman[51567]: 2025-11-23 07:54:49.901112285 +0000 UTC m=+1.301166823 container init 3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, container_name=container-puppet-metrics_qdr, 
name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, config_id=tripleo_puppet_step1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 
23 02:54:49 localhost systemd[1]: Started libpod-conmon-586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf.scope. Nov 23 02:54:49 localhost podman[51581]: 2025-11-23 07:54:49.922077657 +0000 UTC m=+1.309062366 container init 4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, batch=17.1_20251118.1, name=rhosp17/openstack-cron, config_id=tripleo_puppet_step1, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, version=17.1.12, container_name=container-puppet-crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=) Nov 23 02:54:49 localhost podman[51581]: 2025-11-23 07:54:49.936237602 +0000 UTC m=+1.323222241 container start 4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_puppet_step1, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack 
Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-crond, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z) Nov 23 02:54:49 localhost podman[51581]: 2025-11-23 07:54:49.936457828 +0000 UTC m=+1.323442557 container attach 4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, container_name=container-puppet-crond, 
konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_puppet_step1, tcib_managed=true, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.12) Nov 23 02:54:49 localhost systemd[1]: Started libcrun container. Nov 23 02:54:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28863d7faf7374b2b39d9876f4ac893892c1ecb19637ae35f5c500f77c14b246/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff) Nov 23 02:54:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28863d7faf7374b2b39d9876f4ac893892c1ecb19637ae35f5c500f77c14b246/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 23 02:54:49 localhost podman[51582]: 2025-11-23 07:54:49.956956377 +0000 UTC m=+1.346393265 container init 586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 
'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, container_name=container-puppet-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_puppet_step1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=) Nov 23 02:54:49 localhost podman[51567]: 2025-11-23 07:54:49.963459585 +0000 UTC m=+1.363514153 container start 3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656 
(image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, config_id=tripleo_puppet_step1, container_name=container-puppet-metrics_qdr, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 02:54:49 localhost podman[51567]: 2025-11-23 07:54:49.963797726 +0000 UTC m=+1.363852294 container attach 3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, 
container_name=container-puppet-metrics_qdr, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1) Nov 23 02:54:49 localhost podman[51582]: 2025-11-23 07:54:49.968088474 +0000 UTC m=+1.357525352 container start 586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, container_name=container-puppet-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:44:13Z, config_id=tripleo_puppet_step1, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 02:54:49 localhost podman[51582]: 2025-11-23 07:54:49.96826935 +0000 UTC m=+1.357706268 container attach 586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, container_name=container-puppet-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 02:54:50 localhost podman[51561]: 2025-11-23 07:54:50.003975295 +0000 UTC m=+1.406315637 container init 53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, version=17.1.12, container_name=container-puppet-nova_libvirt, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt) Nov 23 02:54:50 localhost podman[51561]: 2025-11-23 07:54:50.010830235 +0000 UTC m=+1.413170567 container start 53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, maintainer=OpenStack TripleO Team, container_name=container-puppet-nova_libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, 
build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 02:54:50 localhost podman[51561]: 2025-11-23 07:54:50.013017615 +0000 UTC m=+1.415357957 container attach 53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=container-puppet-nova_libvirt, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 02:54:50 localhost podman[51559]: 2025-11-23 07:54:50.022486529 +0000 UTC m=+1.431570937 container start 
f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, io.openshift.expose-services=, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., 
com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, container_name=container-puppet-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:51:28Z) Nov 23 02:54:50 localhost podman[51559]: 2025-11-23 07:54:50.022910452 +0000 UTC m=+1.431994870 container attach f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20251118.1, container_name=container-puppet-collectd, config_id=tripleo_puppet_step1, vcs-type=git, name=rhosp17/openstack-collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible) Nov 23 02:54:50 localhost podman[51471]: 2025-11-23 07:54:48.535512185 +0000 UTC m=+0.034105895 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Nov 23 02:54:50 localhost podman[51833]: 2025-11-23 07:54:50.733796308 +0000 UTC m=+0.025984784 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Nov 23 
02:54:50 localhost podman[51833]: 2025-11-23 07:54:50.760472955 +0000 UTC m=+0.052661401 container create 59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, vcs-type=git, build-date=2025-11-19T00:11:59Z, description=Red Hat OpenStack Platform 17.1 ceilometer-central, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ceilometer-central-container, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=container-puppet-ceilometer, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include 
::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Nov 23 02:54:50 localhost systemd[1]: Started libpod-conmon-59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db.scope. Nov 23 02:54:50 localhost systemd[1]: Started libcrun container. 
Nov 23 02:54:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4af87402ffde9ee67f7ec4afd129c011169c73f7a6b2401b07ae2b65c68e82e0/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 23 02:54:50 localhost podman[51833]: 2025-11-23 07:54:50.802418019 +0000 UTC m=+0.094606505 container init 59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-central-container, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-central, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1761123044, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=container-puppet-ceilometer, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12) Nov 23 02:54:50 localhost podman[51833]: 2025-11-23 07:54:50.812953597 +0000 UTC m=+0.105142093 container start 59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-central, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-central, distribution-scope=public, container_name=container-puppet-ceilometer, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-central-container, vendor=Red Hat, Inc., 
config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4) Nov 23 02:54:50 localhost podman[51833]: 2025-11-23 07:54:50.813172744 +0000 UTC m=+0.105361230 container attach 59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, container_name=container-puppet-ceilometer, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-19T00:11:59Z, name=rhosp17/openstack-ceilometer-central, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-central-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 
'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1) Nov 23 02:54:51 localhost puppet-user[51721]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Nov 23 02:54:51 localhost puppet-user[51721]: (file: /etc/puppet/hiera.yaml) Nov 23 02:54:51 localhost puppet-user[51721]: Warning: Undefined variable '::deploy_config_name'; Nov 23 02:54:51 localhost puppet-user[51721]: (file & line not available) Nov 23 02:54:51 localhost ovs-vsctl[52057]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Nov 23 02:54:51 localhost puppet-user[51721]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 23 02:54:51 localhost puppet-user[51721]: (file & line not available) Nov 23 02:54:51 localhost puppet-user[51761]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 23 02:54:51 localhost puppet-user[51761]: (file: /etc/puppet/hiera.yaml) Nov 23 02:54:51 localhost puppet-user[51761]: Warning: Undefined variable '::deploy_config_name'; Nov 23 02:54:51 localhost puppet-user[51761]: (file & line not available) Nov 23 02:54:51 localhost puppet-user[51752]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 23 02:54:51 localhost puppet-user[51752]: (file: /etc/puppet/hiera.yaml) Nov 23 02:54:51 localhost puppet-user[51752]: Warning: Undefined variable '::deploy_config_name'; Nov 23 02:54:51 localhost puppet-user[51752]: (file & line not available) Nov 23 02:54:51 localhost puppet-user[51761]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 23 02:54:51 localhost puppet-user[51761]: (file & line not available) Nov 23 02:54:51 localhost puppet-user[51752]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 23 02:54:51 localhost puppet-user[51752]: (file & line not available) Nov 23 02:54:51 localhost puppet-user[51761]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.08 seconds Nov 23 02:54:51 localhost puppet-user[51752]: Notice: Accepting previously invalid value for target type 'Integer' Nov 23 02:54:51 localhost puppet-user[51761]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0' Nov 23 02:54:51 localhost puppet-user[51752]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.13 seconds Nov 23 02:54:51 localhost puppet-user[51761]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created Nov 23 02:54:51 localhost puppet-user[51787]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 23 02:54:51 localhost puppet-user[51787]: (file: /etc/puppet/hiera.yaml) Nov 23 02:54:51 localhost puppet-user[51787]: Warning: Undefined variable '::deploy_config_name'; Nov 23 02:54:51 localhost puppet-user[51787]: (file & line not available) Nov 23 02:54:51 localhost puppet-user[51770]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Nov 23 02:54:51 localhost puppet-user[51770]: (file: /etc/puppet/hiera.yaml) Nov 23 02:54:51 localhost puppet-user[51761]: Notice: Applied catalog in 0.05 seconds Nov 23 02:54:51 localhost puppet-user[51770]: Warning: Undefined variable '::deploy_config_name'; Nov 23 02:54:51 localhost puppet-user[51770]: (file & line not available) Nov 23 02:54:51 localhost puppet-user[51761]: Application: Nov 23 02:54:51 localhost puppet-user[51761]: Initial environment: production Nov 23 02:54:51 localhost puppet-user[51761]: Converged environment: production Nov 23 02:54:51 localhost puppet-user[51761]: Run mode: user Nov 23 02:54:51 localhost puppet-user[51761]: Changes: Nov 23 02:54:51 localhost puppet-user[51761]: Total: 2 Nov 23 02:54:51 localhost puppet-user[51761]: Events: Nov 23 02:54:51 localhost puppet-user[51761]: Success: 2 Nov 23 02:54:51 localhost puppet-user[51761]: Total: 2 Nov 23 02:54:51 localhost puppet-user[51761]: Resources: Nov 23 02:54:51 localhost puppet-user[51761]: Changed: 2 Nov 23 02:54:51 localhost puppet-user[51761]: Out of sync: 2 Nov 23 02:54:51 localhost puppet-user[51761]: Skipped: 7 Nov 23 02:54:51 localhost puppet-user[51761]: Total: 9 Nov 23 02:54:51 localhost puppet-user[51761]: Time: Nov 23 02:54:51 localhost puppet-user[51761]: Cron: 0.01 Nov 23 02:54:51 localhost puppet-user[51761]: File: 0.02 Nov 23 02:54:51 localhost puppet-user[51761]: Transaction evaluation: 0.05 Nov 23 02:54:51 localhost puppet-user[51761]: Catalog application: 0.05 Nov 23 02:54:51 localhost puppet-user[51761]: Config retrieval: 0.10 Nov 23 02:54:51 localhost puppet-user[51761]: Last run: 1763884491 Nov 23 02:54:51 localhost puppet-user[51761]: Total: 0.05 Nov 23 02:54:51 localhost puppet-user[51761]: Version: Nov 23 02:54:51 localhost puppet-user[51761]: Config: 1763884491 Nov 23 02:54:51 localhost puppet-user[51761]: Puppet: 7.10.0 Nov 23 02:54:51 localhost puppet-user[51752]: Notice: 
/Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root' Nov 23 02:54:51 localhost puppet-user[51770]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 23 02:54:51 localhost puppet-user[51770]: (file & line not available) Nov 23 02:54:51 localhost puppet-user[51787]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 23 02:54:51 localhost puppet-user[51787]: (file & line not available) Nov 23 02:54:51 localhost puppet-user[51752]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root' Nov 23 02:54:51 localhost puppet-user[51752]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755' Nov 23 02:54:51 localhost puppet-user[51752]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created Nov 23 02:54:51 localhost puppet-user[51752]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}6cc35a9378c83bbb3e141511ca4580116e7dbe45274752dd8576577f368bbe29' Nov 23 02:54:51 localhost puppet-user[51752]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created Nov 23 02:54:51 localhost puppet-user[51752]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created Nov 23 02:54:51 localhost puppet-user[51752]: Notice: Applied catalog in 0.03 seconds Nov 23 02:54:51 localhost puppet-user[51752]: Application: Nov 23 02:54:51 localhost puppet-user[51752]: Initial environment: production Nov 23 02:54:51 localhost puppet-user[51752]: Converged environment: production Nov 23 02:54:51 localhost puppet-user[51752]: Run mode: user Nov 23 02:54:51 localhost puppet-user[51752]: Changes: Nov 23 02:54:51 localhost 
puppet-user[51752]: Total: 7
Nov 23 02:54:51 localhost puppet-user[51752]: Events:
Nov 23 02:54:51 localhost puppet-user[51752]: Success: 7
Nov 23 02:54:51 localhost puppet-user[51752]: Total: 7
Nov 23 02:54:51 localhost puppet-user[51752]: Resources:
Nov 23 02:54:51 localhost puppet-user[51752]: Skipped: 13
Nov 23 02:54:51 localhost puppet-user[51752]: Changed: 5
Nov 23 02:54:51 localhost puppet-user[51752]: Out of sync: 5
Nov 23 02:54:51 localhost puppet-user[51752]: Total: 20
Nov 23 02:54:51 localhost puppet-user[51752]: Time:
Nov 23 02:54:51 localhost puppet-user[51752]: File: 0.02
Nov 23 02:54:51 localhost puppet-user[51752]: Transaction evaluation: 0.03
Nov 23 02:54:51 localhost puppet-user[51752]: Catalog application: 0.03
Nov 23 02:54:51 localhost puppet-user[51752]: Config retrieval: 0.16
Nov 23 02:54:51 localhost puppet-user[51752]: Last run: 1763884491
Nov 23 02:54:51 localhost puppet-user[51752]: Total: 0.03
Nov 23 02:54:51 localhost puppet-user[51752]: Version:
Nov 23 02:54:51 localhost puppet-user[51752]: Config: 1763884491
Nov 23 02:54:51 localhost puppet-user[51752]: Puppet: 7.10.0
Nov 23 02:54:51 localhost puppet-user[51770]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.10 seconds
Nov 23 02:54:52 localhost puppet-user[51770]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully
Nov 23 02:54:52 localhost puppet-user[51770]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created
Nov 23 02:54:52 localhost puppet-user[51770]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.37 seconds
Nov 23 02:54:52 localhost puppet-user[51787]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed
Nov 23 02:54:52 localhost puppet-user[51787]: in a future release. Use nova::cinder::os_region_name instead
Nov 23 02:54:52 localhost puppet-user[51787]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed
Nov 23 02:54:52 localhost puppet-user[51787]: in a future release. Use nova::cinder::catalog_info instead
Nov 23 02:54:52 localhost systemd[1]: libpod-4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014.scope: Deactivated successfully.
Nov 23 02:54:52 localhost systemd[1]: libpod-4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014.scope: Consumed 2.053s CPU time.
Nov 23 02:54:52 localhost podman[51581]: 2025-11-23 07:54:52.140854487 +0000 UTC m=+3.527839136 container died 4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, build-date=2025-11-18T22:49:32Z, container_name=container-puppet-crond, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, vcs-type=git, com.redhat.component=openstack-cron-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044)
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed
Nov 23 02:54:52 localhost systemd[1]: tmp-crun.pLZQoj.mount: Deactivated successfully.
Nov 23 02:54:52 localhost systemd[1]: libpod-3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656.scope: Deactivated successfully.
Nov 23 02:54:52 localhost systemd[1]: libpod-3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656.scope: Consumed 2.128s CPU time.
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed
Nov 23 02:54:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014-userdata-shm.mount: Deactivated successfully.
Nov 23 02:54:52 localhost systemd[1]: var-lib-containers-storage-overlay-7e50ae09132747217ca21393b750de400f878b516acd5755409810361a82c81b-merged.mount: Deactivated successfully.
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750'
Nov 23 02:54:52 localhost podman[52267]: 2025-11-23 07:54:52.260274909 +0000 UTC m=+0.111551610 container cleanup 4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_puppet_step1, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 02:54:52 localhost systemd[1]: libpod-conmon-4ee9b8fcaa60b9dbdc5d8e76a5962df69e97a532c58d7ce51467d68816612014.scope: Deactivated successfully.
Nov 23 02:54:52 localhost python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee'
Nov 23 02:54:52 localhost podman[51567]: 2025-11-23 07:54:52.270814357 +0000 UTC m=+3.670868895 container died 3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=container-puppet-metrics_qdr, distribution-scope=public)
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af'
Nov 23 02:54:52 localhost puppet-user[51787]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41)
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}0d4e701b7b2398bbf396579a0713d46d3c496c79edc52f2e260456f359c9a46c'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7'
Nov 23 02:54:52 localhost podman[52293]: 2025-11-23 07:54:52.315292504 +0000 UTC m=+0.088281053 container cleanup 3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-18T22:49:46Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=container-puppet-metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_puppet_step1, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed
Nov 23 02:54:52 localhost systemd[1]: libpod-conmon-3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656.scope: Deactivated successfully.
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046'
Nov 23 02:54:52 localhost python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::qdr#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62'
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed
Nov 23 02:54:52 localhost puppet-user[51721]: Notice: Applied catalog in 0.24 seconds
Nov 23 02:54:52 localhost puppet-user[51721]: Application:
Nov 23 02:54:52 localhost puppet-user[51721]: Initial environment: production
Nov 23 02:54:52 localhost puppet-user[51721]: Converged environment: production
Nov 23 02:54:52 localhost puppet-user[51721]: Run mode: user
Nov 23 02:54:52 localhost puppet-user[51721]: Changes:
Nov 23 02:54:52 localhost puppet-user[51721]: Total: 43
Nov 23 02:54:52 localhost puppet-user[51721]: Events:
Nov 23 02:54:52 localhost puppet-user[51721]: Success: 43
Nov 23 02:54:52 localhost puppet-user[51721]: Total: 43
Nov 23 02:54:52 localhost puppet-user[51721]: Resources:
Nov 23 02:54:52 localhost puppet-user[51721]: Skipped: 14
Nov 23 02:54:52 localhost puppet-user[51721]: Changed: 38
Nov 23 02:54:52 localhost puppet-user[51721]: Out of sync: 38
Nov 23 02:54:52 localhost puppet-user[51721]: Total: 82
Nov 23 02:54:52 localhost puppet-user[51721]: Time:
Nov 23 02:54:52 localhost puppet-user[51721]: Concat file: 0.00
Nov 23 02:54:52 localhost puppet-user[51721]: File: 0.10
Nov 23 02:54:52 localhost puppet-user[51721]: Transaction evaluation: 0.24
Nov 23 02:54:52 localhost puppet-user[51721]: Catalog application: 0.24
Nov 23 02:54:52 localhost puppet-user[51721]: Config retrieval: 0.45
Nov 23 02:54:52 localhost puppet-user[51721]: Last run: 1763884492
Nov 23 02:54:52 localhost puppet-user[51721]: Concat fragment: 0.00
Nov 23 02:54:52 localhost puppet-user[51721]: Total: 0.24
Nov 23 02:54:52 localhost puppet-user[51721]: Version:
Nov 23 02:54:52 localhost puppet-user[51721]: Config: 1763884491
Nov 23 02:54:52 localhost puppet-user[51721]: Puppet: 7.10.0
Nov 23 02:54:52 localhost puppet-user[51787]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5)
Nov 23 02:54:52 localhost puppet-user[51787]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5)
Nov 23 02:54:52 localhost puppet-user[51787]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5)
Nov 23 02:54:52 localhost puppet-user[51787]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated
Nov 23 02:54:52 localhost puppet-user[51787]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set.
Nov 23 02:54:52 localhost puppet-user[51787]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance
Nov 23 02:54:52 localhost puppet-user[51770]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully
Nov 23 02:54:52 localhost puppet-user[51770]: Notice: Applied catalog in 0.48 seconds
Nov 23 02:54:52 localhost puppet-user[51770]: Application:
Nov 23 02:54:52 localhost puppet-user[51770]: Initial environment: production
Nov 23 02:54:52 localhost puppet-user[51770]: Converged environment: production
Nov 23 02:54:52 localhost puppet-user[51770]: Run mode: user
Nov 23 02:54:52 localhost puppet-user[51770]: Changes:
Nov 23 02:54:52 localhost puppet-user[51770]: Total: 4
Nov 23 02:54:52 localhost puppet-user[51770]: Events:
Nov 23 02:54:52 localhost puppet-user[51770]: Success: 4
Nov 23 02:54:52 localhost puppet-user[51770]: Total: 4
Nov 23 02:54:52 localhost puppet-user[51770]: Resources:
Nov 23 02:54:52 localhost puppet-user[51770]: Changed: 4
Nov 23 02:54:52 localhost puppet-user[51770]: Out of sync: 4
Nov 23 02:54:52 localhost puppet-user[51770]: Skipped: 8
Nov 23 02:54:52 localhost puppet-user[51770]: Total: 13
Nov 23 02:54:52 localhost puppet-user[51770]: Time:
Nov 23 02:54:52 localhost puppet-user[51770]: File: 0.00
Nov 23 02:54:52 localhost puppet-user[51770]: Exec: 0.05
Nov 23 02:54:52 localhost puppet-user[51770]: Config retrieval: 0.13
Nov 23 02:54:52 localhost puppet-user[51770]: Augeas: 0.42
Nov 23 02:54:52 localhost puppet-user[51770]: Transaction evaluation: 0.47
Nov 23 02:54:52 localhost puppet-user[51770]: Catalog application: 0.48
Nov 23 02:54:52 localhost puppet-user[51770]: Last run: 1763884492
Nov 23 02:54:52 localhost puppet-user[51770]: Total: 0.48
Nov 23 02:54:52 localhost puppet-user[51770]: Version:
Nov 23 02:54:52 localhost puppet-user[51770]: Config: 1763884491
Nov 23 02:54:52 localhost puppet-user[51770]: Puppet: 7.10.0
Nov 23 02:54:52 localhost puppet-user[51787]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used.
Nov 23 02:54:52 localhost systemd[1]: var-lib-containers-storage-overlay-112d325a59c9fbd4bb36b26505654d46d5cee939e874b7beb1773b25ed0d0f87-merged.mount: Deactivated successfully.
Nov 23 02:54:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3685fcb6ea0da8407c0d6d37d2249f249f81b1dc54f053e5b6248d3d8d6fa656-userdata-shm.mount: Deactivated successfully.
Nov 23 02:54:52 localhost podman[52438]: 2025-11-23 07:54:52.696582826 +0000 UTC m=+0.075484903 container create 40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, container_name=container-puppet-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, release=1761123044, config_id=tripleo_puppet_step1, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Nov 23 02:54:52 localhost systemd[1]: libpod-f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41.scope: Deactivated successfully.
Nov 23 02:54:52 localhost systemd[1]: libpod-f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41.scope: Consumed 2.695s CPU time.
Nov 23 02:54:52 localhost podman[51559]: 2025-11-23 07:54:52.720806153 +0000 UTC m=+4.129890581 container died f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, name=rhosp17/openstack-collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd) Nov 23 02:54:52 localhost systemd[1]: Started libpod-conmon-40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72.scope. Nov 23 02:54:52 localhost systemd[1]: libpod-586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf.scope: Deactivated successfully. Nov 23 02:54:52 localhost systemd[1]: libpod-586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf.scope: Consumed 2.638s CPU time. 
Nov 23 02:54:52 localhost podman[52452]: 2025-11-23 07:54:52.73691419 +0000 UTC m=+0.102080906 container create a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, 
build-date=2025-11-18T23:34:05Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=container-puppet-ovn_controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true) Nov 23 02:54:52 localhost podman[51582]: 2025-11-23 07:54:52.739873414 +0000 UTC m=+4.129310302 container died 586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, com.redhat.component=openstack-iscsid-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, container_name=container-puppet-iscsid, release=1761123044, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 02:54:52 localhost systemd[1]: Started libcrun container. 
Nov 23 02:54:52 localhost podman[52438]: 2025-11-23 07:54:52.648429551 +0000 UTC m=+0.027331658 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Nov 23 02:54:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce7eb41ed6d76b373fcdd8e0be741f028bada9eb205b4f95d2d01e9a1a9acc01/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 23 02:54:52 localhost podman[52438]: 2025-11-23 07:54:52.7575047 +0000 UTC m=+0.136406767 container init 40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, container_name=container-puppet-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-rsyslog, config_id=tripleo_puppet_step1, tcib_managed=true, release=1761123044, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team) Nov 23 02:54:52 localhost podman[52438]: 2025-11-23 07:54:52.767090328 +0000 UTC m=+0.145992395 container start 40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, container_name=container-puppet-rsyslog, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 rsyslog, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-rsyslog, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 02:54:52 localhost podman[52438]: 2025-11-23 07:54:52.767290884 +0000 UTC m=+0.146192951 container attach 40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, release=1761123044, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=container-puppet-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, config_id=tripleo_puppet_step1, build-date=2025-11-18T22:49:49Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog) Nov 23 02:54:52 localhost podman[52452]: 2025-11-23 07:54:52.689604992 +0000 UTC m=+0.054771758 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 23 02:54:52 localhost puppet-user[51866]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 23 02:54:52 localhost puppet-user[51866]: (file: /etc/puppet/hiera.yaml) Nov 23 02:54:52 localhost puppet-user[51866]: Warning: Undefined variable '::deploy_config_name'; Nov 23 02:54:52 localhost puppet-user[51866]: (file & line not available) Nov 23 02:54:52 localhost systemd[1]: Started libpod-conmon-a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8.scope. Nov 23 02:54:52 localhost systemd[1]: Started libcrun container. 
Nov 23 02:54:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c84121996d84fbba61539b49066afbbb0cb30cfa3a0df4358555af6b473227/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff) Nov 23 02:54:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3c84121996d84fbba61539b49066afbbb0cb30cfa3a0df4358555af6b473227/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 23 02:54:52 localhost puppet-user[51866]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 23 02:54:52 localhost puppet-user[51866]: (file & line not available) Nov 23 02:54:52 localhost podman[52452]: 2025-11-23 07:54:52.873843252 +0000 UTC m=+0.239009968 container init a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=container-puppet-ovn_controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Nov 23 02:54:52 localhost podman[52530]: 2025-11-23 07:54:52.877987185 +0000 UTC m=+0.128451721 container cleanup 
586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, container_name=container-puppet-iscsid, release=1761123044) Nov 23 02:54:52 localhost podman[52452]: 2025-11-23 07:54:52.882515181 +0000 UTC m=+0.247681887 container start a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, distribution-scope=public, container_name=container-puppet-ovn_controller, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 02:54:52 localhost podman[52452]: 2025-11-23 07:54:52.882732627 +0000 UTC m=+0.247899373 container attach a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, 
com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=container-puppet-ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1761123044) Nov 23 02:54:52 localhost systemd[1]: libpod-conmon-586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf.scope: Deactivated successfully. 
Nov 23 02:54:52 localhost python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::iscsid#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} 
--log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Nov 23 02:54:52 localhost podman[52511]: 2025-11-23 07:54:52.915794398 +0000 UTC m=+0.183221269 container cleanup f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 
'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-collectd, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64) Nov 23 02:54:52 localhost systemd[1]: 
libpod-conmon-f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41.scope: Deactivated successfully. Nov 23 02:54:52 localhost python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. 
(file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. 
(file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36) Nov 23 02:54:53 localhost puppet-user[51866]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. 
(file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26) Nov 23 02:54:53 localhost puppet-user[51787]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 1.33 seconds Nov 23 02:54:53 localhost puppet-user[51866]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.42 seconds Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: 
/Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: 
Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}3fd4b82820ca431560a9101649124ba519ce5d6bf5755c5a232928b76e10eb6c' Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe' Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Warning: Empty environment setting 'TLS_PASSWORD' Nov 23 02:54:53 localhost puppet-user[51787]: (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182) Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}bf4205704c2ce3336692c7289c134cb4f34ad9637d3b2e0917c09fb097bf6f77' Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: 
/Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created Nov 23 02:54:53 localhost systemd[1]: var-lib-containers-storage-overlay-28863d7faf7374b2b39d9876f4ac893892c1ecb19637ae35f5c500f77c14b246-merged.mount: Deactivated successfully. Nov 23 02:54:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-586be1c618d90568a3b891b5c772cc28480e64bf7e708a3d09687baa7acee4cf-userdata-shm.mount: Deactivated successfully. Nov 23 02:54:53 localhost systemd[1]: var-lib-containers-storage-overlay-25162626903c4a9c9a99b1d254b734327e6597d283273669c57c478e44b2d310-merged.mount: Deactivated successfully. Nov 23 02:54:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f52c9c33aaa72ff2f9b3500f3773532b8629fc7a9598421f83f9724f59efac41-userdata-shm.mount: Deactivated successfully. 
Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created Nov 23 02:54:53 localhost puppet-user[51866]: Notice: Applied catalog in 0.43 seconds Nov 23 02:54:53 localhost puppet-user[51866]: Application: Nov 23 02:54:53 localhost puppet-user[51866]: Initial environment: production Nov 23 02:54:53 localhost 
puppet-user[51866]: Converged environment: production Nov 23 02:54:53 localhost puppet-user[51866]: Run mode: user Nov 23 02:54:53 localhost puppet-user[51866]: Changes: Nov 23 02:54:53 localhost puppet-user[51866]: Total: 31 Nov 23 02:54:53 localhost puppet-user[51866]: Events: Nov 23 02:54:53 localhost puppet-user[51866]: Success: 31 Nov 23 02:54:53 localhost puppet-user[51866]: Total: 31 Nov 23 02:54:53 localhost puppet-user[51866]: Resources: Nov 23 02:54:53 localhost puppet-user[51866]: Skipped: 22 Nov 23 02:54:53 localhost puppet-user[51866]: Changed: 31 Nov 23 02:54:53 localhost puppet-user[51866]: Out of sync: 31 Nov 23 02:54:53 localhost puppet-user[51866]: Total: 151 Nov 23 02:54:53 localhost puppet-user[51866]: Time: Nov 23 02:54:53 localhost puppet-user[51866]: Package: 0.02 Nov 23 02:54:53 localhost puppet-user[51866]: Ceilometer config: 0.34 Nov 23 02:54:53 localhost puppet-user[51866]: Transaction evaluation: 0.42 Nov 23 02:54:53 localhost puppet-user[51866]: Catalog application: 0.43 Nov 23 02:54:53 localhost puppet-user[51866]: Config retrieval: 0.49 Nov 23 02:54:53 localhost puppet-user[51866]: Last run: 1763884493 Nov 23 02:54:53 localhost puppet-user[51866]: Resources: 0.00 Nov 23 02:54:53 localhost puppet-user[51866]: Total: 0.43 Nov 23 02:54:53 localhost puppet-user[51866]: Version: Nov 23 02:54:53 localhost puppet-user[51866]: Config: 1763884492 Nov 23 02:54:53 localhost puppet-user[51866]: Puppet: 7.10.0 Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created Nov 23 02:54:53 localhost 
puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created Nov 23 02:54:53 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created Nov 23 02:54:54 localhost 
puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created Nov 23 02:54:54 localhost systemd[1]: libpod-59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db.scope: Deactivated successfully. Nov 23 02:54:54 localhost systemd[1]: libpod-59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db.scope: Consumed 3.129s CPU time. Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created Nov 23 02:54:54 localhost podman[52750]: 2025-11-23 07:54:54.247098517 +0000 UTC m=+0.041662978 container died 59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-central-container, build-date=2025-11-19T00:11:59Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-ceilometer, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp17/openstack-ceilometer-central, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, architecture=x86_64) Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created Nov 23 02:54:54 localhost systemd[1]: tmp-crun.ALByks.mount: Deactivated successfully. Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created Nov 23 02:54:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db-userdata-shm.mount: Deactivated successfully. 
Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created Nov 23 02:54:54 localhost podman[52750]: 2025-11-23 07:54:54.303846787 +0000 UTC m=+0.098411228 container cleanup 59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, container_name=container-puppet-ceilometer, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-central, config_id=tripleo_puppet_step1, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-central-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 
'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:59Z, name=rhosp17/openstack-ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 23 02:54:54 localhost systemd[1]: libpod-conmon-59724bee6f4906ad69b5e2d76f594ff484935b318425c22e8dd62d70afb127db.scope: Deactivated successfully. 
Nov 23 02:54:54 localhost python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::ceilometer::agent::polling#012include tripleo::profile::base::ceilometer::agent::polling#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: 
/Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created Nov 23 02:54:54 localhost puppet-user[52612]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 23 02:54:54 localhost puppet-user[52612]: (file: /etc/puppet/hiera.yaml) Nov 23 02:54:54 localhost puppet-user[52612]: Warning: Undefined variable '::deploy_config_name'; Nov 23 02:54:54 localhost puppet-user[52612]: (file & line not available) Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created Nov 23 02:54:54 localhost puppet-user[52612]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 23 02:54:54 localhost puppet-user[52612]: (file & line not available) Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created Nov 23 02:54:54 localhost systemd[1]: var-lib-containers-storage-overlay-4af87402ffde9ee67f7ec4afd129c011169c73f7a6b2401b07ae2b65c68e82e0-merged.mount: Deactivated successfully. 
Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created Nov 23 02:54:54 localhost puppet-user[52612]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.24 seconds Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created Nov 23 02:54:54 localhost puppet-user[52641]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Nov 23 02:54:54 localhost puppet-user[52641]: (file: /etc/puppet/hiera.yaml) Nov 23 02:54:54 localhost puppet-user[52641]: Warning: Undefined variable '::deploy_config_name'; Nov 23 02:54:54 localhost puppet-user[52641]: (file & line not available) Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created Nov 23 02:54:54 localhost puppet-user[52641]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 23 02:54:54 localhost puppet-user[52641]: (file & line not available) Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created Nov 23 02:54:54 localhost puppet-user[52612]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2' Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created Nov 23 02:54:54 localhost puppet-user[52612]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as 
'{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b' Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created Nov 23 02:54:54 localhost puppet-user[52612]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}0ddcdfaeaf89f6f6daa2ee30146631a4c926f7b57df70d985d0c7a45c4b18db9' Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created Nov 23 02:54:54 localhost puppet-user[52612]: Notice: Applied catalog in 0.12 seconds Nov 23 02:54:54 localhost puppet-user[52612]: Application: Nov 23 02:54:54 localhost puppet-user[52612]: Initial environment: production Nov 23 02:54:54 localhost puppet-user[52612]: Converged environment: production Nov 23 02:54:54 localhost puppet-user[52612]: Run mode: user Nov 23 02:54:54 localhost puppet-user[52612]: Changes: Nov 23 02:54:54 localhost puppet-user[52612]: Total: 3 Nov 23 02:54:54 localhost puppet-user[52612]: Events: Nov 23 02:54:54 localhost puppet-user[52612]: Success: 3 Nov 23 02:54:54 localhost puppet-user[52612]: Total: 3 Nov 23 02:54:54 localhost puppet-user[52612]: Resources: Nov 23 02:54:54 localhost puppet-user[52612]: Skipped: 11 Nov 23 02:54:54 localhost puppet-user[52612]: Changed: 3 Nov 23 02:54:54 localhost puppet-user[52612]: Out of sync: 3 Nov 23 02:54:54 localhost puppet-user[52612]: Total: 25 Nov 23 02:54:54 localhost puppet-user[52612]: Time: Nov 23 02:54:54 localhost puppet-user[52612]: Concat file: 0.00 Nov 23 02:54:54 localhost puppet-user[52612]: Concat fragment: 0.00 Nov 23 02:54:54 localhost puppet-user[52612]: File: 0.01 Nov 23 02:54:54 localhost puppet-user[52612]: Transaction evaluation: 
0.11 Nov 23 02:54:54 localhost puppet-user[52612]: Catalog application: 0.12 Nov 23 02:54:54 localhost puppet-user[52612]: Config retrieval: 0.29 Nov 23 02:54:54 localhost puppet-user[52612]: Last run: 1763884494 Nov 23 02:54:54 localhost puppet-user[52612]: Total: 0.12 Nov 23 02:54:54 localhost puppet-user[52612]: Version: Nov 23 02:54:54 localhost puppet-user[52612]: Config: 1763884494 Nov 23 02:54:54 localhost puppet-user[52612]: Puppet: 7.10.0 Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created Nov 23 02:54:54 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created Nov 23 02:54:55 localhost puppet-user[52641]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.22 seconds Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: 
created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created Nov 23 02:54:55 localhost ovs-vsctl[52932]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642 Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created Nov 23 02:54:55 localhost puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created Nov 23 02:54:55 localhost ovs-vsctl[52934]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-encap-type=geneve Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created Nov 23 02:54:55 localhost puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}7457979272b158ac88adf13552cc58cb87586b19a7b8e2158301712e847fdf72' Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created Nov 23 02:54:55 localhost ovs-vsctl[52947]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.107 Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created Nov 23 02:54:55 localhost puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created Nov 23 02:54:55 localhost ovs-vsctl[52953]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:hostname=np0005532585.localdomain Nov 23 02:54:55 localhost puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005532585.novalocal' to 'np0005532585.localdomain' Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created Nov 23 02:54:55 localhost ovs-vsctl[52960]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int Nov 23 02:54:55 localhost puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created Nov 23 02:54:55 localhost ovs-vsctl[52962]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000 Nov 23 02:54:55 localhost systemd[1]: libpod-40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72.scope: Deactivated successfully. Nov 23 02:54:55 localhost systemd[1]: libpod-40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72.scope: Consumed 2.348s CPU time. 
Nov 23 02:54:55 localhost podman[52438]: 2025-11-23 07:54:55.281699457 +0000 UTC m=+2.660601544 container died 40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-rsyslog, build-date=2025-11-18T22:49:49Z, vcs-type=git, architecture=x86_64, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, tcib_managed=true, container_name=container-puppet-rsyslog) Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created Nov 23 02:54:55 localhost puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created Nov 23 02:54:55 localhost ovs-vsctl[52976]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60 Nov 23 02:54:55 localhost puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created Nov 23 02:54:55 localhost ovs-vsctl[52983]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-monitor-all=true Nov 23 02:54:55 localhost puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created Nov 23 02:54:55 localhost systemd[1]: tmp-crun.XMSLnW.mount: Deactivated successfully. Nov 23 02:54:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72-userdata-shm.mount: Deactivated successfully. Nov 23 02:54:55 localhost ovs-vsctl[52986]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000 Nov 23 02:54:55 localhost puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created Nov 23 02:54:55 localhost ovs-vsctl[52988]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-encap-tos=0 Nov 23 02:54:55 localhost puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created Nov 23 02:54:55 localhost podman[52969]: 2025-11-23 07:54:55.41953637 +0000 UTC m=+0.129827306 container cleanup 40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-rsyslog, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team) Nov 23 02:54:55 localhost ovs-vsctl[52990]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:25:6a:6e Nov 23 02:54:55 localhost puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created Nov 23 02:54:55 localhost systemd[1]: libpod-conmon-40f0a3621dcce51abca77aedb0306508c427be671fc08b5016295abe9991ed72.scope: Deactivated successfully. 
Nov 23 02:54:55 localhost python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Nov 23 02:54:55 localhost ovs-vsctl[52998]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created Nov 23 02:54:55 localhost puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created Nov 23 02:54:55 localhost ovs-vsctl[53005]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-match-northd-version=false
Nov 23 02:54:55 localhost puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created
Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created
Nov 23 02:54:55 localhost ovs-vsctl[53016]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0
Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created
Nov 23 02:54:55 localhost puppet-user[52641]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created
Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created
Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created
Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created
Nov 23 02:54:55 localhost puppet-user[52641]: Notice: Applied catalog in 0.48 seconds
Nov 23 02:54:55 localhost puppet-user[52641]: Application:
Nov 23 02:54:55 localhost puppet-user[52641]: Initial environment: production
Nov 23 02:54:55 localhost puppet-user[52641]: Converged environment: production
Nov 23 02:54:55 localhost puppet-user[52641]: Run mode: user
Nov 23 02:54:55 localhost puppet-user[52641]: Changes:
Nov 23 02:54:55 localhost puppet-user[52641]: Total: 14
Nov 23 02:54:55 localhost puppet-user[52641]: Events:
Nov 23 02:54:55 localhost puppet-user[52641]: Success: 14
Nov 23 02:54:55 localhost puppet-user[52641]: Total: 14
Nov 23 02:54:55 localhost puppet-user[52641]: Resources:
Nov 23 02:54:55 localhost puppet-user[52641]: Skipped: 12
Nov 23 02:54:55 localhost puppet-user[52641]: Changed: 14
Nov 23 02:54:55 localhost puppet-user[52641]: Out of sync: 14
Nov 23 02:54:55 localhost puppet-user[52641]: Total: 29
Nov 23 02:54:55 localhost puppet-user[52641]: Time:
Nov 23 02:54:55 localhost puppet-user[52641]: Exec: 0.02
Nov 23 02:54:55 localhost puppet-user[52641]: Config retrieval: 0.26
Nov 23 02:54:55 localhost puppet-user[52641]: Vs config: 0.40
Nov 23 02:54:55 localhost puppet-user[52641]: Transaction evaluation: 0.47
Nov 23 02:54:55 localhost puppet-user[52641]: Catalog application: 0.48
Nov 23 02:54:55 localhost puppet-user[52641]: Last run: 1763884495
Nov 23 02:54:55 localhost puppet-user[52641]: Total: 0.48
Nov 23 02:54:55 localhost puppet-user[52641]: Version:
Nov 23 02:54:55 localhost puppet-user[52641]: Config: 1763884494
Nov 23 02:54:55 localhost puppet-user[52641]: Puppet: 7.10.0
Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created
Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created
Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created
Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created
Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created
Nov 23 02:54:55 localhost systemd[1]: var-lib-containers-storage-overlay-ce7eb41ed6d76b373fcdd8e0be741f028bada9eb205b4f95d2d01e9a1a9acc01-merged.mount: Deactivated successfully.
Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: 
/Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: 
/Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created Nov 23 02:54:55 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created Nov 23 02:54:55 localhost systemd[1]: libpod-a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8.scope: Deactivated successfully. 
Nov 23 02:54:55 localhost systemd[1]: libpod-a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8.scope: Consumed 2.855s CPU time. Nov 23 02:54:55 localhost podman[52452]: 2025-11-23 07:54:55.922264747 +0000 UTC m=+3.287431493 container died a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=container-puppet-ovn_controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1) Nov 23 02:54:56 localhost podman[52689]: 2025-11-23 07:54:53.14124415 +0000 UTC m=+0.033800895 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Nov 23 02:54:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8-userdata-shm.mount: Deactivated successfully. Nov 23 02:54:56 localhost systemd[1]: var-lib-containers-storage-overlay-e3c84121996d84fbba61539b49066afbbb0cb30cfa3a0df4358555af6b473227-merged.mount: Deactivated successfully. 
Nov 23 02:54:56 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully Nov 23 02:54:56 localhost podman[53059]: 2025-11-23 07:54:56.51457074 +0000 UTC m=+0.580111963 container cleanup a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-ovn-controller-container, container_name=container-puppet-ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 02:54:56 localhost systemd[1]: libpod-conmon-a6bf08a810ac007cd35bbb7486490751b39551c4b8cc79a11dc11759a9947ce8.scope: Deactivated successfully. 
Nov 23 02:54:56 localhost python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::agents::ovn#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 23 02:54:56 localhost podman[53117]: 2025-11-23 07:54:56.72970117 +0000 UTC m=+0.118863603 container create 42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, 
version=17.1.12, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, io.buildah.version=1.41.4, container_name=container-puppet-neutron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-server, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-neutron-server, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2025-11-19T00:23:27Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-neutron-server-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 02:54:56 localhost podman[53117]: 2025-11-23 07:54:56.670697548 +0000 UTC m=+0.059860061 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Nov 23 02:54:56 localhost systemd[1]: Started libpod-conmon-42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7.scope. Nov 23 02:54:56 localhost systemd[1]: Started libcrun container. Nov 23 02:54:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fb9794be20062e7c83c2fae0453c388f7c913c741b51da81b575b1ef1299ce0/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 23 02:54:56 localhost podman[53117]: 2025-11-23 07:54:56.800957657 +0000 UTC m=+0.190120090 container init 42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, batch=17.1_20251118.1, 
konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, build-date=2025-11-19T00:23:27Z, name=rhosp17/openstack-neutron-server, com.redhat.component=openstack-neutron-server-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=container-puppet-neutron) Nov 23 02:54:56 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully Nov 23 02:54:56 localhost podman[53117]: 2025-11-23 07:54:56.810224274 +0000 UTC m=+0.199386727 container start 42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, url=https://www.redhat.com, config_id=tripleo_puppet_step1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-server-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:23:27Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=container-puppet-neutron, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 23 02:54:56 localhost podman[53117]: 2025-11-23 07:54:56.810521393 +0000 UTC m=+0.199683836 container attach 42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, description=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, build-date=2025-11-19T00:23:27Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_puppet_step1, batch=17.1_20251118.1, version=17.1.12, container_name=container-puppet-neutron, summary=Red Hat OpenStack Platform 17.1 neutron-server, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, name=rhosp17/openstack-neutron-server, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-neutron-server-container, io.buildah.version=1.41.4) Nov 23 02:54:56 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created Nov 23 02:54:56 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created Nov 23 02:54:56 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: 
/Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: 
/Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created Nov 23 02:54:57 localhost puppet-user[51787]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}3a12438802493a75725c4f7704f2af6db1ef72af396369e5de28f6f4d6a7ed98' Nov 23 02:54:57 localhost puppet-user[51787]: Notice: Applied catalog in 4.36 seconds Nov 23 02:54:57 localhost puppet-user[51787]: Application: Nov 23 02:54:57 localhost puppet-user[51787]: Initial environment: production Nov 23 02:54:57 localhost puppet-user[51787]: Converged environment: production Nov 23 02:54:57 localhost puppet-user[51787]: Run mode: user Nov 23 02:54:57 localhost puppet-user[51787]: Changes: Nov 23 02:54:57 localhost puppet-user[51787]: Total: 183 Nov 23 02:54:57 localhost puppet-user[51787]: Events: Nov 23 02:54:57 localhost puppet-user[51787]: Success: 183 Nov 23 02:54:57 localhost puppet-user[51787]: Total: 183 Nov 23 02:54:57 localhost puppet-user[51787]: Resources: Nov 23 02:54:57 localhost puppet-user[51787]: Changed: 183 Nov 23 02:54:57 localhost puppet-user[51787]: Out of sync: 183 Nov 23 02:54:57 localhost puppet-user[51787]: Skipped: 57 Nov 23 02:54:57 localhost puppet-user[51787]: Total: 487 Nov 23 02:54:57 localhost puppet-user[51787]: Time: Nov 23 02:54:57 localhost puppet-user[51787]: Concat file: 0.00 Nov 23 02:54:57 localhost puppet-user[51787]: Concat fragment: 0.00 Nov 23 02:54:57 localhost puppet-user[51787]: Anchor: 0.00 Nov 23 02:54:57 localhost puppet-user[51787]: File line: 0.00 Nov 23 02:54:57 localhost puppet-user[51787]: Virtlogd config: 0.00 Nov 23 02:54:57 localhost puppet-user[51787]: Virtstoraged config: 0.01 Nov 23 
02:54:57 localhost puppet-user[51787]: Exec: 0.01 Nov 23 02:54:57 localhost puppet-user[51787]: Virtqemud config: 0.02 Nov 23 02:54:57 localhost puppet-user[51787]: Virtnodedevd config: 0.02 Nov 23 02:54:57 localhost puppet-user[51787]: Virtsecretd config: 0.02 Nov 23 02:54:57 localhost puppet-user[51787]: File: 0.03 Nov 23 02:54:57 localhost puppet-user[51787]: Package: 0.03 Nov 23 02:54:57 localhost puppet-user[51787]: Virtproxyd config: 0.04 Nov 23 02:54:57 localhost puppet-user[51787]: Augeas: 1.03 Nov 23 02:54:57 localhost puppet-user[51787]: Config retrieval: 1.59 Nov 23 02:54:57 localhost puppet-user[51787]: Last run: 1763884497 Nov 23 02:54:57 localhost puppet-user[51787]: Nova config: 2.94 Nov 23 02:54:57 localhost puppet-user[51787]: Transaction evaluation: 4.34 Nov 23 02:54:57 localhost puppet-user[51787]: Catalog application: 4.36 Nov 23 02:54:57 localhost puppet-user[51787]: Resources: 0.00 Nov 23 02:54:57 localhost puppet-user[51787]: Total: 4.36 Nov 23 02:54:57 localhost puppet-user[51787]: Version: Nov 23 02:54:57 localhost puppet-user[51787]: Config: 1763884491 Nov 23 02:54:57 localhost puppet-user[51787]: Puppet: 7.10.0 Nov 23 02:54:58 localhost puppet-user[53160]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass Nov 23 02:54:58 localhost systemd[1]: libpod-53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8.scope: Deactivated successfully. Nov 23 02:54:58 localhost systemd[1]: libpod-53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8.scope: Consumed 8.520s CPU time. Nov 23 02:54:58 localhost puppet-user[53160]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Nov 23 02:54:58 localhost puppet-user[53160]: (file: /etc/puppet/hiera.yaml) Nov 23 02:54:58 localhost puppet-user[53160]: Warning: Undefined variable '::deploy_config_name'; Nov 23 02:54:58 localhost puppet-user[53160]: (file & line not available) Nov 23 02:54:58 localhost podman[53272]: 2025-11-23 07:54:58.763601109 +0000 UTC m=+0.035563981 container died 53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=container-puppet-nova_libvirt, architecture=x86_64, tcib_managed=true) Nov 23 02:54:58 localhost puppet-user[53160]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 23 02:54:58 localhost puppet-user[53160]: (file & line not available) Nov 23 02:54:58 localhost systemd[1]: tmp-crun.BV9njP.mount: Deactivated successfully. Nov 23 02:54:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8-userdata-shm.mount: Deactivated successfully. Nov 23 02:54:58 localhost puppet-user[53160]: Warning: Unknown variable: 'dhcp_agents_per_net'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37) Nov 23 02:54:58 localhost systemd[1]: var-lib-containers-storage-overlay-350340aca58ea5290619b74ac7e03f9daa05fcdf9e02a8c26ea7c6d33f13c398-merged.mount: Deactivated successfully. Nov 23 02:54:58 localhost podman[53272]: 2025-11-23 07:54:58.873838206 +0000 UTC m=+0.145801078 container cleanup 53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=container-puppet-nova_libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, vcs-type=git, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 
'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_puppet_step1, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 02:54:58 localhost python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages#012# TODO(emilien): figure how to deal with libvirt profile.#012# We'll probably treat it like we do with Neutron plugins.#012# Until then, just include it in the default nova-compute role.#012include tripleo::profile::base::nova::compute::libvirt#012#012include tripleo::profile::base::nova::libvirt#012#012include tripleo::profile::base::nova::compute::libvirt_guests#012#012include tripleo::profile::base::sshd#012include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 02:54:58 localhost systemd[1]: libpod-conmon-53d32d5bd6b43a072a246c5133e1988986a1e91b8ccf6064a1ff1d67b264e1a8.scope: Deactivated successfully. Nov 23 02:54:59 localhost puppet-user[53160]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.63 seconds Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: 
/Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: 
/Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: 
/Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created Nov 23 02:54:59 localhost puppet-user[53160]: Notice: Applied catalog in 0.47 seconds Nov 23 02:54:59 localhost puppet-user[53160]: Application: Nov 23 02:54:59 localhost puppet-user[53160]: Initial environment: production Nov 23 02:54:59 localhost puppet-user[53160]: Converged environment: production Nov 23 02:54:59 localhost puppet-user[53160]: Run mode: user Nov 23 02:54:59 localhost puppet-user[53160]: Changes: Nov 23 02:54:59 localhost puppet-user[53160]: Total: 33 Nov 23 02:54:59 localhost puppet-user[53160]: Events: Nov 23 02:54:59 localhost puppet-user[53160]: Success: 33 Nov 23 02:54:59 localhost puppet-user[53160]: Total: 33 Nov 23 02:54:59 localhost puppet-user[53160]: Resources: Nov 23 02:54:59 localhost puppet-user[53160]: Skipped: 21 Nov 23 02:54:59 localhost puppet-user[53160]: Changed: 33 Nov 23 02:54:59 localhost puppet-user[53160]: Out of sync: 33 Nov 23 02:54:59 localhost puppet-user[53160]: Total: 155 Nov 23 02:54:59 localhost puppet-user[53160]: Time: Nov 23 02:54:59 localhost puppet-user[53160]: Resources: 0.00 Nov 23 02:54:59 localhost puppet-user[53160]: Ovn metadata agent config: 0.02 Nov 23 02:54:59 localhost puppet-user[53160]: Neutron config: 0.37 Nov 23 02:54:59 localhost puppet-user[53160]: Transaction evaluation: 0.46 Nov 23 02:54:59 localhost puppet-user[53160]: Catalog application: 0.47 Nov 23 02:54:59 localhost puppet-user[53160]: Config retrieval: 0.70 Nov 23 02:54:59 localhost puppet-user[53160]: Last run: 1763884499 Nov 23 02:54:59 localhost puppet-user[53160]: Total: 0.47 Nov 23 02:54:59 localhost puppet-user[53160]: Version: Nov 23 02:54:59 localhost puppet-user[53160]: Config: 1763884498 Nov 23 02:54:59 localhost puppet-user[53160]: Puppet: 7.10.0 Nov 23 02:55:00 
localhost systemd[1]: libpod-42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7.scope: Deactivated successfully. Nov 23 02:55:00 localhost systemd[1]: libpod-42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7.scope: Consumed 3.554s CPU time. Nov 23 02:55:00 localhost podman[53117]: 2025-11-23 07:55:00.431137706 +0000 UTC m=+3.820300219 container died 42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=container-puppet-neutron, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2025-11-19T00:23:27Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-server-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044) Nov 23 02:55:00 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7-userdata-shm.mount: Deactivated successfully. Nov 23 02:55:00 localhost systemd[1]: var-lib-containers-storage-overlay-8fb9794be20062e7c83c2fae0453c388f7c913c741b51da81b575b1ef1299ce0-merged.mount: Deactivated successfully. 
Nov 23 02:55:00 localhost podman[53343]: 2025-11-23 07:55:00.538754558 +0000 UTC m=+0.101810587 container cleanup 42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-server, release=1761123044, maintainer=OpenStack TripleO Team, container_name=container-puppet-neutron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, name=rhosp17/openstack-neutron-server, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-server-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:23:27Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, io.openshift.expose-services=) Nov 23 02:55:00 localhost systemd[1]: libpod-conmon-42ca16d51b25e13a6f47e5f10cc540b34308863546459a51547b5ddf0edd5fc7.scope: Deactivated successfully. 
Nov 23 02:55:00 localhost python3[51367]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005532585 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::ovn_metadata#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005532585', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Nov 23 02:55:01 localhost python3[53396]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:55:02 localhost python3[53428]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 02:55:03 localhost python3[53478]: ansible-ansible.legacy.stat 
Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:55:03 localhost python3[53521]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884502.7450478-84577-143656042359276/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:55:03 localhost python3[53583]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:55:04 localhost python3[53626]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884503.6684618-84577-135099077175030/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:55:04 localhost python3[53688]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:55:05 localhost python3[53731]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884504.5930176-84688-263958110146196/source 
dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:55:05 localhost python3[53793]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:55:06 localhost python3[53836]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884505.4656067-84750-275205959474268/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:55:06 localhost python3[53866]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 02:55:06 localhost systemd[1]: Reloading. Nov 23 02:55:06 localhost systemd-rc-local-generator[53891]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:55:06 localhost systemd-sysv-generator[53894]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 02:55:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:55:07 localhost systemd[1]: Reloading. Nov 23 02:55:07 localhost systemd-rc-local-generator[53925]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:55:07 localhost systemd-sysv-generator[53929]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:55:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:55:07 localhost systemd[1]: Starting TripleO Container Shutdown... Nov 23 02:55:07 localhost systemd[1]: Finished TripleO Container Shutdown. Nov 23 02:55:07 localhost python3[53990]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:55:08 localhost python3[54033]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884507.3838098-84793-279804636602052/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:55:08 localhost python3[54095]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False 
get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 02:55:09 localhost python3[54138]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884508.3237157-84818-36849067434011/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:55:09 localhost python3[54168]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 02:55:09 localhost systemd[1]: Reloading. Nov 23 02:55:09 localhost systemd-rc-local-generator[54193]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:55:09 localhost systemd-sysv-generator[54198]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:55:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:55:09 localhost systemd[1]: Reloading. Nov 23 02:55:09 localhost systemd-rc-local-generator[54230]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:55:09 localhost systemd-sysv-generator[54235]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 02:55:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:55:10 localhost systemd[1]: Starting Create netns directory... Nov 23 02:55:10 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 23 02:55:10 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 23 02:55:10 localhost systemd[1]: Finished Create netns directory. Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: 64da22351939caf7431a331d2f0c888a Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: 4767aaabc3de112d8791c290aa2b669d Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 67452ffc3d9e727585009ffc9989a224 Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: 39370c45b6a27bfda1ebe1fb9d328c43 Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: 39370c45b6a27bfda1ebe1fb9d328c43 Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: 39370c45b6a27bfda1ebe1fb9d328c43 Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: 39370c45b6a27bfda1ebe1fb9d328c43 Nov 23 02:55:10 localhost 
python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: 39370c45b6a27bfda1ebe1fb9d328c43 Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: 39370c45b6a27bfda1ebe1fb9d328c43 Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: 7238f2997345c97f4c6ab424e622dc1b Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 1bd1f352f264f24512a1a2440e47a1f5 Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 1bd1f352f264f24512a1a2440e47a1f5 Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: 39370c45b6a27bfda1ebe1fb9d328c43 Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: 39370c45b6a27bfda1ebe1fb9d328c43 Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: a43bf0e2ecc9c9d02be7a27eac338b4c Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43 Nov 23 02:55:10 localhost python3[54260]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: 39370c45b6a27bfda1ebe1fb9d328c43 Nov 23 02:55:12 localhost python3[54318]: 
ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Nov 23 02:55:12 localhost podman[54355]: 2025-11-23 07:55:12.555873584 +0000 UTC m=+0.064872692 container create 28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, 
container_name=metrics_qdr_init_logs, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 02:55:12 localhost systemd[1]: Started libpod-conmon-28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d.scope. Nov 23 02:55:12 localhost systemd[1]: Started libcrun container. Nov 23 02:55:12 localhost podman[54355]: 2025-11-23 07:55:12.521148939 +0000 UTC m=+0.030148037 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Nov 23 02:55:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/102a6fceace8ab36236a8b21c9b9829f6b100c83ca8f9b2c07c3282080c29222/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Nov 23 02:55:12 localhost podman[54355]: 2025-11-23 07:55:12.630225719 +0000 UTC m=+0.139224817 container init 28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 
'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr_init_logs, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Nov 23 02:55:12 localhost podman[54355]: 2025-11-23 07:55:12.641925624 +0000 UTC m=+0.150924732 container start 28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr_init_logs, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd 
/var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4) Nov 23 02:55:12 localhost podman[54355]: 2025-11-23 07:55:12.642178473 +0000 UTC m=+0.151177581 container attach 28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, config_id=tripleo_step1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': 
['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 23 02:55:12 localhost systemd[1]: libpod-28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d.scope: Deactivated successfully. Nov 23 02:55:12 localhost podman[54355]: 2025-11-23 07:55:12.649567089 +0000 UTC m=+0.158566187 container died 28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr_init_logs, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, build-date=2025-11-18T22:49:46Z, distribution-scope=public, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 02:55:12 localhost podman[54374]: 2025-11-23 07:55:12.734731632 +0000 UTC m=+0.075311757 container cleanup 28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, tcib_managed=true, architecture=x86_64, container_name=metrics_qdr_init_logs, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Nov 23 
02:55:12 localhost systemd[1]: libpod-conmon-28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d.scope: Deactivated successfully. Nov 23 02:55:12 localhost python3[54318]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd Nov 23 02:55:13 localhost podman[54451]: 2025-11-23 07:55:13.159655873 +0000 UTC m=+0.080072159 container create 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1, container_name=metrics_qdr, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible) Nov 23 02:55:13 localhost systemd[1]: Started libpod-conmon-019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.scope. Nov 23 02:55:13 localhost systemd[1]: Started libcrun container. 
Nov 23 02:55:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e71eeb53ae033c058204425a46efa30ecb751cf5037dd51b11ece79b90149ba3/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff) Nov 23 02:55:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e71eeb53ae033c058204425a46efa30ecb751cf5037dd51b11ece79b90149ba3/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Nov 23 02:55:13 localhost podman[54451]: 2025-11-23 07:55:13.12372727 +0000 UTC m=+0.044143556 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Nov 23 02:55:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 02:55:13 localhost podman[54451]: 2025-11-23 07:55:13.243290826 +0000 UTC m=+0.163707162 container init 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, release=1761123044, name=rhosp17/openstack-qdrouterd) Nov 23 02:55:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 02:55:13 localhost podman[54451]: 2025-11-23 07:55:13.276239443 +0000 UTC m=+0.196655719 container start 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, container_name=metrics_qdr, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 02:55:13 localhost python3[54318]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=64da22351939caf7431a331d2f0c888a --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Nov 23 02:55:13 localhost podman[54472]: 2025-11-23 07:55:13.356195858 +0000 UTC m=+0.068943682 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 02:55:13 localhost systemd[1]: var-lib-containers-storage-overlay-102a6fceace8ab36236a8b21c9b9829f6b100c83ca8f9b2c07c3282080c29222-merged.mount: Deactivated successfully. 
Nov 23 02:55:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-28267775de5b6b21429984e4357727c995d4d2166d0d520b8aeff422d528523d-userdata-shm.mount: Deactivated successfully. Nov 23 02:55:13 localhost podman[54472]: 2025-11-23 07:55:13.57225775 +0000 UTC m=+0.285005504 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z) Nov 23 02:55:13 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 02:55:13 localhost python3[54548]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:55:14 localhost python3[54564]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 02:55:14 localhost python3[54625]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884514.179765-85027-156441799830369/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None 
remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:55:15 localhost python3[54641]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 02:55:15 localhost systemd[1]: Reloading. Nov 23 02:55:15 localhost systemd-sysv-generator[54668]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:55:15 localhost systemd-rc-local-generator[54664]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:55:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:55:15 localhost python3[54692]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 02:55:15 localhost systemd[1]: Reloading. Nov 23 02:55:16 localhost systemd-rc-local-generator[54725]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 02:55:16 localhost systemd-sysv-generator[54729]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 02:55:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 02:55:16 localhost systemd[1]: Starting metrics_qdr container... Nov 23 02:55:16 localhost systemd[1]: Started metrics_qdr container. 
Nov 23 02:55:16 localhost python3[54773]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:55:18 localhost python3[54894]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005532585 step=1 update_config_hash_only=False Nov 23 02:55:19 localhost python3[54910]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 02:55:19 localhost python3[54926]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Nov 23 02:55:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 02:55:44 localhost systemd[1]: tmp-crun.OUOSvj.mount: Deactivated successfully. 
Nov 23 02:55:44 localhost podman[54927]: 2025-11-23 07:55:44.036336094 +0000 UTC m=+0.091849698 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 
17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-18T22:49:46Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, version=17.1.12, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 02:55:44 localhost podman[54927]: 2025-11-23 07:55:44.195644645 +0000 UTC m=+0.251158329 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 02:55:44 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 02:56:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 02:56:15 localhost podman[55032]: 2025-11-23 07:56:15.028368736 +0000 UTC m=+0.083855551 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 02:56:15 localhost podman[55032]: 2025-11-23 07:56:15.221230644 +0000 UTC m=+0.276717419 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 23 02:56:15 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 02:56:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 02:56:46 localhost podman[55060]: 2025-11-23 07:56:46.021331657 +0000 UTC m=+0.080689729 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible) Nov 23 02:56:46 localhost podman[55060]: 2025-11-23 07:56:46.224205124 +0000 UTC m=+0.283563206 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 02:56:46 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 02:57:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 02:57:17 localhost podman[55166]: 2025-11-23 07:57:17.036821229 +0000 UTC m=+0.079452579 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible) Nov 23 02:57:17 localhost podman[55166]: 2025-11-23 07:57:17.208407241 +0000 UTC m=+0.251038601 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 23 02:57:17 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 02:57:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 02:57:48 localhost podman[55196]: 2025-11-23 07:57:48.018527678 +0000 UTC m=+0.072114660 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, tcib_managed=true, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 23 02:57:48 localhost podman[55196]: 2025-11-23 07:57:48.210309721 +0000 UTC m=+0.263896733 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., container_name=metrics_qdr, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible) Nov 23 02:57:48 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 02:58:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 02:58:19 localhost podman[55302]: 2025-11-23 07:58:19.022254565 +0000 UTC m=+0.079031677 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, release=1761123044, name=rhosp17/openstack-qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team) Nov 23 02:58:19 localhost podman[55302]: 2025-11-23 07:58:19.214091078 +0000 UTC m=+0.270868130 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vcs-type=git, config_id=tripleo_step1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr) Nov 23 02:58:19 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 02:58:25 localhost sshd[55331]: main: sshd: ssh-rsa algorithm is disabled Nov 23 02:58:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 02:58:50 localhost systemd[1]: tmp-crun.IXHW85.mount: Deactivated successfully. 
Nov 23 02:58:50 localhost podman[55333]: 2025-11-23 07:58:50.016920475 +0000 UTC m=+0.077552253 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, 
io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git) Nov 23 02:58:50 localhost podman[55333]: 2025-11-23 07:58:50.231359482 +0000 UTC m=+0.291991220 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, vcs-type=git, distribution-scope=public, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 02:58:50 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 02:59:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 02:59:21 localhost systemd[1]: tmp-crun.IqKNu7.mount: Deactivated successfully. 
Nov 23 02:59:21 localhost podman[55439]: 2025-11-23 07:59:21.021402781 +0000 UTC m=+0.081961362 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12) Nov 23 02:59:21 localhost podman[55439]: 2025-11-23 07:59:21.240118014 +0000 UTC m=+0.300676575 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git) Nov 23 02:59:21 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 02:59:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 02:59:52 localhost systemd[1]: tmp-crun.fhscWl.mount: Deactivated successfully. 
Nov 23 02:59:52 localhost podman[55469]: 2025-11-23 07:59:52.012752839 +0000 UTC m=+0.068946119 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Nov 23 02:59:52 localhost podman[55469]: 2025-11-23 07:59:52.228979209 +0000 UTC m=+0.285172439 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, container_name=metrics_qdr, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Nov 23 02:59:52 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 03:00:01 localhost ceph-osd[32858]: osd.3 pg_epoch: 19 pg[2.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [2,1,3] r=2 lpr=19 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Nov 23 03:00:04 localhost ceph-osd[31905]: osd.0 pg_epoch: 21 pg[3.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [1,2,0] r=2 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Nov 23 03:00:04 localhost ceph-osd[32858]: osd.3 pg_epoch: 23 pg[4.0( empty local-lis/les=0/0 n=0 ec=23/23 lis/c=0/0 les/c/f=0/0/0 sis=23) [3,5,1] r=0 lpr=23 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:05 localhost ceph-osd[32858]: osd.3 pg_epoch: 24 pg[4.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=0/0 les/c/f=0/0/0 sis=23) [3,5,1] r=0 lpr=23 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 23 03:00:07 localhost ceph-osd[32858]: osd.3 pg_epoch: 25 pg[5.0( empty local-lis/les=0/0 n=0 ec=25/25 lis/c=0/0 les/c/f=0/0/0 sis=25) [4,3,2] r=1 lpr=25 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Nov 23 03:00:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 03:00:23 localhost podman[55576]: 2025-11-23 08:00:23.017845811 +0000 UTC m=+0.079096078 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 03:00:23 localhost ceph-osd[31905]: osd.0 pg_epoch: 31 pg[6.0( empty local-lis/les=0/0 n=0 ec=31/31 lis/c=0/0 les/c/f=0/0/0 sis=31) [0,5,1] r=0 lpr=31 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:23 localhost podman[55576]: 2025-11-23 08:00:23.239622815 +0000 UTC m=+0.300873032 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, distribution-scope=public, container_name=metrics_qdr, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 23 03:00:23 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 03:00:23 localhost ceph-osd[31905]: osd.0 pg_epoch: 32 pg[6.0( empty local-lis/les=31/32 n=0 ec=31/31 lis/c=0/0 les/c/f=0/0/0 sis=31) [0,5,1] r=0 lpr=31 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 23 03:00:25 localhost ceph-osd[32858]: osd.3 pg_epoch: 33 pg[7.0( empty local-lis/les=0/0 n=0 ec=33/33 lis/c=0/0 les/c/f=0/0/0 sis=33) [5,1,3] r=2 lpr=33 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Nov 23 03:00:33 localhost ceph-osd[32858]: osd.3 pg_epoch: 38 pg[2.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=38 pruub=8.044471741s) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 active pruub 1116.652099609s@ mbc={}] start_peering_interval up [2,1,3] -> [2,1,3], acting [2,1,3] -> [2,1,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:00:33 localhost ceph-osd[32858]: osd.3 pg_epoch: 38 pg[2.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=38 pruub=8.041690826s) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1116.652099609s@ mbc={}] state: transitioning to Stray
Nov 23 03:00:33 localhost ceph-osd[31905]: osd.0 pg_epoch: 38 pg[3.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=38 pruub=10.719461441s) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 active pruub 1123.247802734s@ mbc={}] start_peering_interval up [1,2,0] -> [1,2,0], acting [1,2,0] -> [1,2,0], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:00:33 localhost ceph-osd[31905]: osd.0 pg_epoch: 38 pg[3.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=38 pruub=10.716773033s) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.247802734s@ mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.16( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.15( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.17( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.18( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.14( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.13( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.12( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.11( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.f( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.10( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.d( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.e( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.c( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.b( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.3( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.a( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.7( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.1( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.2( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.4( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.19( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.1f( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.1e( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.1d( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.1c( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.1b( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.1a( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.8( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.5( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.6( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[32858]: osd.3 pg_epoch: 39 pg[2.9( empty local-lis/les=19/20 n=0 ec=38/19 lis/c=19/19 les/c/f=20/20/0 sis=38) [2,1,3] r=2 lpr=38 pi=[19,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.1b( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.1a( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.18( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.19( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.15( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.16( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.14( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.13( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.12( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.11( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.f( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.10( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.e( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.c( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.d( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.5( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.1( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.3( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.2( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.4( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.6( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.7( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.8( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.9( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.a( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.b( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.1d( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.1e( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.1f( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.1c( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:34 localhost ceph-osd[31905]: osd.0 pg_epoch: 39 pg[3.17( empty local-lis/les=21/22 n=0 ec=38/21 lis/c=21/21 les/c/f=22/22/0 sis=38) [1,2,0] r=2 lpr=38 pi=[21,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:35 localhost ceph-osd[32858]: osd.3 pg_epoch: 40 pg[5.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=40 pruub=12.422865868s) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 active pruub 1123.047607422s@ mbc={}] start_peering_interval up [4,3,2] -> [4,3,2], acting [4,3,2] -> [4,3,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:00:35 localhost ceph-osd[32858]: osd.3 pg_epoch: 40 pg[4.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=40 pruub=10.230877876s) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active pruub 1120.855712891s@ mbc={}] start_peering_interval up [3,5,1] -> [3,5,1], acting [3,5,1] -> [3,5,1], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:00:35 localhost ceph-osd[32858]: osd.3 pg_epoch: 40 pg[5.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=40 pruub=12.419507980s) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.047607422s@ mbc={}] state: transitioning to Stray
Nov 23 03:00:35 localhost ceph-osd[32858]: osd.3 pg_epoch: 40 pg[4.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=40 pruub=10.230877876s) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1120.855712891s@ mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1e( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.1f( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.11( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.10( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.11( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.13( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.13( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.12( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.14( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.15( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.12( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.17( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.17( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.16( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.15( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.8( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.10( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.9( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.14( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.16( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.8( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.a( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.b( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.b( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.a( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.c( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.d( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.d( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.c( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.e( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.f( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.4( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.5( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.7( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.6( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.1( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state:
transitioning to Primary Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.2( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.3( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.9( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.6( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.7( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.5( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.2( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.4( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 
03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.f( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.1e( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.e( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1f( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.1d( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.3( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.1c( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1c( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:36 localhost 
ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1d( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.1b( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.1a( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1b( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1a( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.19( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.18( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[5.18( empty local-lis/les=25/26 n=0 ec=40/25 lis/c=25/25 les/c/f=26/26/0 sis=40) [4,3,2] r=1 lpr=40 pi=[25,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 
pg_epoch: 41 pg[4.19( empty local-lis/les=23/24 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.0( empty local-lis/les=40/41 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.2( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.e( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.c( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.d( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.f( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.17( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: 
react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.12( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.15( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.10( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.13( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.14( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.11( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.18( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.9( empty local-lis/les=40/41 
n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.5( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.19( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1a( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.16( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.4( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.6( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.3( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated 
Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1e( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1b( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.8( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1f( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.7( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1c( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.1d( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 
les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.a( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: osd.3 pg_epoch: 41 pg[4.b( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=23/23 les/c/f=24/24/0 sis=40) [3,5,1] r=0 lpr=40 pi=[23,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:36 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.0 scrub starts Nov 23 03:00:36 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.0 scrub ok Nov 23 03:00:37 localhost ceph-osd[32858]: osd.3 pg_epoch: 42 pg[7.0( v 35'39 (0'0,35'39] local-lis/les=33/34 n=22 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=42 pruub=12.373711586s) [5,1,3] r=2 lpr=42 pi=[33,42)/1 luod=0'0 lua=35'37 crt=35'39 lcod 35'38 mlcod 0'0 active pruub 1125.082519531s@ mbc={}] start_peering_interval up [5,1,3] -> [5,1,3], acting [5,1,3] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:37 localhost ceph-osd[32858]: osd.3 pg_epoch: 42 pg[7.0( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=42 pruub=12.371913910s) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 lcod 35'38 mlcod 0'0 unknown NOTIFY pruub 1125.082519531s@ mbc={}] state: transitioning to Stray Nov 23 03:00:37 localhost ceph-osd[31905]: osd.0 pg_epoch: 42 pg[6.0( empty local-lis/les=31/32 n=0 ec=31/31 lis/c=31/31 les/c/f=32/32/0 sis=42 pruub=10.341920853s) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active pruub 1126.960327148s@ mbc={}] start_peering_interval up [0,5,1] -> [0,5,1], acting [0,5,1] -> [0,5,1], 
acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:37 localhost ceph-osd[31905]: osd.0 pg_epoch: 42 pg[6.0( empty local-lis/les=31/32 n=0 ec=31/31 lis/c=31/31 les/c/f=32/32/0 sis=42 pruub=10.341920853s) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown pruub 1126.960327148s@ mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1f( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1e( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1d( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.12( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1c( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.13( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.10( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 
les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.11( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.16( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.15( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.14( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.a( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.b( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.17( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.8( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 
pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.9( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.5( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.4( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.6( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.7( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.2( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.3( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: 
transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.d( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.c( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.f( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.e( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.19( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.18( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1b( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1a( empty local-lis/les=31/32 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:38 localhost 
ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.b( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.a( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.8( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.9( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.e( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.f( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.c( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.6( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 
ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.5( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.3( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.2( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.4( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.7( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.1( v 35'39 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[32858]: osd.3 pg_epoch: 43 pg[7.d( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,1,3] r=2 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 
unknown NOTIFY mbc={}] state: transitioning to Stray Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.0( empty local-lis/les=42/43 n=0 ec=31/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.10( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.13( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.12( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.16( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.14( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.9( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.17( empty 
local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.f( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.8( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.b( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.d( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.2( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.e( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.a( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react 
AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.15( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1c( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.3( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1d( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.5( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.c( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.4( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.7( empty local-lis/les=42/43 n=0 
ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1e( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.18( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.6( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1f( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1b( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.1a( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated 
Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.19( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:38 localhost ceph-osd[31905]: osd.0 pg_epoch: 43 pg[6.11( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=31/31 les/c/f=32/32/0 sis=42) [0,5,1] r=0 lpr=42 pi=[31,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:40 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.0 scrub starts Nov 23 03:00:40 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.0 scrub ok Nov 23 03:00:42 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.e scrub starts Nov 23 03:00:42 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.e scrub ok Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1b( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.835800171s) [4,5,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.540893555s@ mbc={}] start_peering_interval up [1,2,0] -> [4,5,3], acting [1,2,0] -> [4,5,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1c( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937299728s) [1,3,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.642456055s@ mbc={}] start_peering_interval up [0,5,1] -> [1,3,5], acting [0,5,1] -> [1,3,5], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1a( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.835736275s) [4,3,2] r=-1 lpr=44 
pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541015625s@ mbc={}] start_peering_interval up [1,2,0] -> [4,3,2], acting [1,2,0] -> [4,3,2], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1e( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.938068390s) [4,5,3] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.643432617s@ mbc={}] start_peering_interval up [0,5,1] -> [4,5,3], acting [0,5,1] -> [4,5,3], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1b( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.835564613s) [4,5,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.540893555s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1a( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.835627556s) [4,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541015625s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1e( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937961578s) [4,5,3] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.643432617s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.19( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834791183s) [0,2,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.540405273s@ mbc={}] start_peering_interval up [1,2,0] -> [0,2,1], acting [1,2,0] -> [0,2,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 0, features acting 
4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.19( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834791183s) [0,2,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.540405273s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1f( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.942682266s) [3,4,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.648803711s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,5], acting [0,5,1] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1d( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.936487198s) [3,4,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.642700195s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,5], acting [0,5,1] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1f( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.942566872s) [3,4,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.648803711s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1d( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.936418533s) [3,4,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.642700195s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.18( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834320068s) [3,1,5] r=-1 
lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.540649414s@ mbc={}] start_peering_interval up [1,2,0] -> [3,1,5], acting [1,2,0] -> [3,1,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1c( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.936368942s) [1,3,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.642456055s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.18( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834265709s) [3,1,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.540649414s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.17( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.845931053s) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.552246094s@ mbc={}] start_peering_interval up [1,2,0] -> [0,5,1], acting [1,2,0] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.17( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.845931053s) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.552246094s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.12( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934622765s) [4,2,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641113281s@ mbc={}] start_peering_interval up [0,5,1] -> [4,2,0], acting [0,5,1] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 
4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.12( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934531212s) [4,2,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641113281s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.13( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934263229s) [3,4,2] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641113281s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,2], acting [0,5,1] -> [3,4,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.14( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834345818s) [2,4,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541259766s@ mbc={}] start_peering_interval up [1,2,0] -> [2,4,0], acting [1,2,0] -> [2,4,0], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.16( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834661484s) [2,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541503906s@ mbc={}] start_peering_interval up [1,2,0] -> [2,3,4], acting [1,2,0] -> [2,3,4], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.13( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934223175s) [3,4,2] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641113281s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 
localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.14( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834317207s) [2,4,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541259766s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.16( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834564209s) [2,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541503906s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.11( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.942143440s) [3,1,2] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.649047852s@ mbc={}] start_peering_interval up [0,5,1] -> [3,1,2], acting [0,5,1] -> [3,1,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.13( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834403992s) [2,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541381836s@ mbc={}] start_peering_interval up [1,2,0] -> [2,3,1], acting [1,2,0] -> [2,3,1], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.16( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934181213s) [0,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641235352s@ mbc={}] start_peering_interval up [0,5,1] -> [0,5,4], acting [0,5,1] -> [0,5,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.13( empty 
local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834381104s) [2,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541381836s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.16( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934181213s) [0,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.641235352s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.11( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.942028999s) [3,1,2] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.649047852s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.12( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833991051s) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541259766s@ mbc={}] start_peering_interval up [1,2,0] -> [0,5,1], acting [1,2,0] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.17( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934215546s) [1,0,2] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641479492s@ mbc={}] start_peering_interval up [0,5,1] -> [1,0,2], acting [0,5,1] -> [1,0,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.10( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933979988s) [0,1,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641113281s@ mbc={}] start_peering_interval up [0,5,1] -> [0,1,2], acting 
[0,5,1] -> [0,1,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.17( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934185028s) [1,0,2] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641479492s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.12( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833991051s) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.541259766s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.14( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933952332s) [3,5,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641357422s@ mbc={}] start_peering_interval up [0,5,1] -> [3,5,4], acting [0,5,1] -> [3,5,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.11( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834877014s) [4,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.542358398s@ mbc={}] start_peering_interval up [1,2,0] -> [4,5,0], acting [1,2,0] -> [4,5,0], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.10( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933979988s) [0,1,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.641113281s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.14( empty local-lis/les=42/43 
n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933888435s) [3,5,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641357422s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.11( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834819794s) [4,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.542358398s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.10( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834176064s) [5,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541625977s@ mbc={}] start_peering_interval up [1,2,0] -> [5,1,3], acting [1,2,0] -> [5,1,3], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.15( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934815407s) [2,4,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.642333984s@ mbc={}] start_peering_interval up [0,5,1] -> [2,4,0], acting [0,5,1] -> [2,4,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.10( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834150314s) [5,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541625977s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.f( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834087372s) [2,1,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541748047s@ mbc={}] start_peering_interval up [1,2,0] -> [2,1,0], acting [1,2,0] -> 
[2,1,0], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.15( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934787750s) [2,4,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.642333984s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.f( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834007263s) [2,1,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541748047s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.e( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833943367s) [1,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541748047s@ mbc={}] start_peering_interval up [1,2,0] -> [1,5,0], acting [1,2,0] -> [1,5,0], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.b( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933747292s) [3,2,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641601562s@ mbc={}] start_peering_interval up [0,5,1] -> [3,2,1], acting [0,5,1] -> [3,2,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.e( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833916664s) [1,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541748047s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.a( empty local-lis/les=42/43 n=0 
ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934185028s) [5,0,4] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.642089844s@ mbc={}] start_peering_interval up [0,5,1] -> [5,0,4], acting [0,5,1] -> [5,0,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.b( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933710098s) [3,2,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641601562s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.c( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833460808s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.541748047s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,1], acting [1,2,0] -> [5,3,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.d( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.834133148s) [5,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.542114258s@ mbc={}] start_peering_interval up [1,2,0] -> [5,1,3], acting [1,2,0] -> [5,1,3], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.a( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.934127808s) [5,0,4] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.642089844s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.c( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833368301s) [5,3,1] 
r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.541748047s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.8( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933013916s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641601562s@ mbc={}] start_peering_interval up [0,5,1] -> [2,1,3], acting [0,5,1] -> [2,1,3], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.8( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932913780s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641601562s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.5( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933200836s) [5,1,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.642089844s@ mbc={}] start_peering_interval up [0,5,1] -> [5,1,0], acting [0,5,1] -> [5,1,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.9( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932395935s) [0,1,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641479492s@ mbc={}] start_peering_interval up [0,5,1] -> [0,1,5], acting [0,5,1] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.5( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933135986s) [5,1,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 
1133.642089844s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.9( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932395935s) [0,1,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.641479492s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.5( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833307266s) [5,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.542602539s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,4], acting [1,2,0] -> [5,3,4], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.2( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841951370s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.551269531s@ mbc={}] start_peering_interval up [1,2,0] -> [3,4,5], acting [1,2,0] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.832960129s) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.542358398s@ mbc={}] start_peering_interval up [1,2,0] -> [0,2,4], acting [1,2,0] -> [0,2,4], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.4( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933698654s) [3,2,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.643066406s@ mbc={}] start_peering_interval up [0,5,1] -> [3,2,1], acting [0,5,1] 
-> [3,2,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.5( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833233833s) [5,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.542602539s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.d( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.833282471s) [5,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.542114258s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.2( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841876030s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.551269531s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.4( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933634758s) [3,2,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.643066406s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.832960129s) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.542358398s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.7( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933800697s) [5,3,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.643310547s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,4], acting [0,5,1] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 
4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.7( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933767319s) [5,3,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.643310547s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.4( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841523170s) [3,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.551269531s@ mbc={}] start_peering_interval up [1,2,0] -> [3,1,2], acting [1,2,0] -> [3,1,2], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.3( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.832893372s) [2,0,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.542602539s@ mbc={}] start_peering_interval up [1,2,0] -> [2,0,4], acting [1,2,0] -> [2,0,4], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933731079s) [1,5,3] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.643432617s@ mbc={}] start_peering_interval up [0,5,1] -> [1,5,3], acting [0,5,1] -> [1,5,3], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.6( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933674812s) [3,5,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.643432617s@ mbc={}] start_peering_interval up [0,5,1] -> [3,5,1], acting [0,5,1] 
-> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933703423s) [1,5,3] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.643432617s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.3( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.832807541s) [2,0,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.542602539s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.6( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.933616638s) [3,5,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.643432617s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.6( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.832571030s) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.542602539s@ mbc={}] start_peering_interval up [1,2,0] -> [0,2,4], acting [1,2,0] -> [0,2,4], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.3( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932530403s) [5,4,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.642456055s@ mbc={}] start_peering_interval up [0,5,1] -> [5,4,0], acting [0,5,1] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.6( empty local-lis/les=38/39 n=0 
ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.832571030s) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.542602539s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.7( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841611862s) [3,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.551635742s@ mbc={}] start_peering_interval up [1,2,0] -> [3,1,2], acting [1,2,0] -> [3,1,2], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.3( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932500839s) [5,4,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.642456055s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.2( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.931669235s) [5,3,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641845703s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,4], acting [0,5,1] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.2( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.931645393s) [5,3,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641845703s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.8( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841684341s) [4,0,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.552001953s@ mbc={}] start_peering_interval up [1,2,0] -> [4,0,5], acting [1,2,0] -> [4,0,5], 
acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.4( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840876579s) [3,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.551269531s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.8( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841662407s) [4,0,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.552001953s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.7( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841556549s) [3,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.551635742s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.c( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932093620s) [3,2,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.642700195s@ mbc={}] start_peering_interval up [0,5,1] -> [3,2,4], acting [0,5,1] -> [3,2,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.a( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840692520s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.551269531s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,1], acting [1,2,0] -> [5,3,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.c( empty local-lis/les=42/43 n=0 ec=42/31 
lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932069778s) [3,2,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.642700195s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.d( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.931115150s) [2,3,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641723633s@ mbc={}] start_peering_interval up [0,5,1] -> [2,3,1], acting [0,5,1] -> [2,3,1], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.9( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840462685s) [4,2,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.551269531s@ mbc={}] start_peering_interval up [1,2,0] -> [4,2,3], acting [1,2,0] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.a( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840587616s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.551269531s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.f( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.930770874s) [3,4,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641479492s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,5], acting [0,5,1] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.b( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841069221s) [3,5,1] r=-1 
lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.551879883s@ mbc={}] start_peering_interval up [1,2,0] -> [3,5,1], acting [1,2,0] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.f( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.930669785s) [3,4,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641479492s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.b( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840985298s) [3,5,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.551879883s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.d( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.930966377s) [2,3,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641723633s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.9( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840362549s) [4,2,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.551269531s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.e( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.930923462s) [5,3,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641845703s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,4], acting [0,5,1] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1c( empty local-lis/les=38/39 n=0 ec=38/21 
lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841176033s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.552246094s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,1], acting [1,2,0] -> [5,3,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.e( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.930890083s) [5,3,4] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641845703s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1c( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.841149330s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.552246094s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.19( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937814713s) [5,3,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.648925781s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,1], acting [0,5,1] -> [5,3,1], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1d( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840806961s) [1,2,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.552001953s@ mbc={}] start_peering_interval up [1,2,0] -> [1,2,3], acting [1,2,0] -> [1,2,3], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1b( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937576294s) [1,2,0] r=2 
lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.648803711s@ mbc={}] start_peering_interval up [0,5,1] -> [1,2,0], acting [0,5,1] -> [1,2,0], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1e( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840649605s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.551879883s@ mbc={}] start_peering_interval up [1,2,0] -> [3,4,5], acting [1,2,0] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1b( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937548637s) [1,2,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.648803711s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.18( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932242393s) [0,2,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.643432617s@ mbc={}] start_peering_interval up [0,5,1] -> [0,2,4], acting [0,5,1] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1d( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840758324s) [1,2,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.552001953s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1e( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840600014s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 
1137.551879883s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.19( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937674522s) [5,3,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.648925781s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.18( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.932242393s) [0,2,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.643432617s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1f( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840812683s) [0,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1137.552246094s@ mbc={}] start_peering_interval up [1,2,0] -> [0,5,4], acting [1,2,0] -> [0,5,4], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[3.1f( empty local-lis/les=38/39 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.840812683s) [0,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.552246094s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1a( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937287331s) [5,4,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.648803711s@ mbc={}] start_peering_interval up [0,5,1] -> [5,4,0], acting [0,5,1] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.1f( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 
mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[6.1a( empty local-lis/les=42/43 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.937143326s) [5,4,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.648803711s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.1e( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.a( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.1d( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.5( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.18( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837604523s) [2,4,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.648315430s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,3], acting [4,3,2] -> [2,4,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> 2, features acting 4540138322906710015 upacting 
4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1e( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.851902008s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662719727s@ mbc={}] start_peering_interval up [3,5,1] -> [0,5,1], acting [3,5,1] -> [0,5,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1e( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.851874352s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662719727s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.18( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.828904152s) [4,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639770508s@ mbc={}] start_peering_interval up [2,1,3] -> [4,2,3], acting [2,1,3] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.18( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.828863144s) [4,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639770508s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.849290848s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660400391s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,0], acting [4,3,2] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 
pg_epoch: 44 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.849265099s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660400391s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.11( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.850394249s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661621094s@ mbc={}] start_peering_interval up [3,5,1] -> [3,5,4], acting [3,5,1] -> [3,5,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.11( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.850394249s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.661621094s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.10( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.845753670s) [3,4,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.657104492s@ mbc={}] start_peering_interval up [3,5,1] -> [3,4,2], acting [3,5,1] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.16( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.828304291s) [5,1,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639770508s@ mbc={}] start_peering_interval up [2,1,3] -> [5,1,0], acting [2,1,3] -> [5,1,0], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.16( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 
les/c/f=39/39/0 sis=44 pruub=14.828281403s) [5,1,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639770508s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.10( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.845753670s) [3,4,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.657104492s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.12( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847958565s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.659545898s@ mbc={}] start_peering_interval up [4,3,2] -> [1,5,3], acting [4,3,2] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.12( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847936630s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.659545898s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.15( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.828187943s) [1,0,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639892578s@ mbc={}] start_peering_interval up [2,1,3] -> [1,0,2], acting [2,1,3] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.15( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.828160286s) [1,0,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639892578s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 
pg[4.13( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.849582672s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661254883s@ mbc={}] start_peering_interval up [3,5,1] -> [2,1,3], acting [3,5,1] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.13( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.849558830s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661254883s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.13( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847456932s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.659301758s@ mbc={}] start_peering_interval up [4,3,2] -> [4,0,5], acting [4,3,2] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.14( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.827984810s) [2,4,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639892578s@ mbc={}] start_peering_interval up [2,1,3] -> [2,4,0], acting [2,1,3] -> [2,4,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.14( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.827827454s) [2,4,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639892578s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.13( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 
les/c/f=41/41/0 sis=44 pruub=8.847299576s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.659301758s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837570190s) [2,4,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.648315430s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.12( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.844738007s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.656982422s@ mbc={}] start_peering_interval up [3,5,1] -> [0,1,2], acting [3,5,1] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.12( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.844702721s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.656982422s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847362518s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.659301758s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,0], acting [4,3,2] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.1e( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.14( empty local-lis/les=40/41 n=0 
ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.846982956s) [3,4,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.659423828s@ mbc={}] start_peering_interval up [4,3,2] -> [3,4,5], acting [4,3,2] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.14( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.846982956s) [3,4,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.659423828s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.13( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.826711655s) [1,5,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639404297s@ mbc={}] start_peering_interval up [2,1,3] -> [1,5,3], acting [2,1,3] -> [1,5,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.13( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.826678276s) [1,5,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639404297s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847678185s) [5,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660400391s@ mbc={}] start_peering_interval up [4,3,2] -> [5,3,1], acting [4,3,2] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.12( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.826443672s) [4,3,2] r=1 
lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639282227s@ mbc={}] start_peering_interval up [2,1,3] -> [4,3,2], acting [2,1,3] -> [4,3,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847647667s) [5,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660400391s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.12( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.12( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.826419830s) [4,3,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639282227s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.846818924s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.659301758s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.14( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848485947s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661376953s@ mbc={}] start_peering_interval up [3,5,1] -> [4,0,5], acting [3,5,1] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.14( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848455429s) 
[4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661376953s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847059250s) [5,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660034180s@ mbc={}] start_peering_interval up [4,3,2] -> [5,3,1], acting [4,3,2] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847031593s) [5,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660034180s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.15( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843997955s) [4,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.657104492s@ mbc={}] start_peering_interval up [3,5,1] -> [4,3,2], acting [3,5,1] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.11( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.826216698s) [5,3,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639404297s@ mbc={}] start_peering_interval up [2,1,3] -> [5,3,4], acting [2,1,3] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.15( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843966484s) [4,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 
1127.657104492s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.11( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.826193810s) [5,3,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639404297s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.13( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.845967293s) [3,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.659423828s@ mbc={}] start_peering_interval up [4,3,2] -> [3,1,5], acting [4,3,2] -> [3,1,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.845967293s) [3,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.659423828s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.17( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.825808525s) [5,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639526367s@ mbc={}] start_peering_interval up [2,1,3] -> [5,1,3], acting [2,1,3] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.16( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848221779s) [0,2,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 
1127.661987305s@ mbc={}] start_peering_interval up [3,5,1] -> [0,2,1], acting [3,5,1] -> [0,2,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.10( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.824745178s) [4,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.638427734s@ mbc={}] start_peering_interval up [2,1,3] -> [4,0,5], acting [2,1,3] -> [4,0,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.17( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843224525s) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.656982422s@ mbc={}] start_peering_interval up [3,5,1] -> [3,2,4], acting [3,5,1] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.16( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848189354s) [0,2,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661987305s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.10( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.824672699s) [4,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.638427734s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.17( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843224525s) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.656982422s@ mbc={}] state: transitioning to Primary Nov 23 
03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.f( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.825201988s) [4,2,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639282227s@ mbc={}] start_peering_interval up [2,1,3] -> [4,2,0], acting [2,1,3] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.f( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.825174332s) [4,2,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639282227s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.925047874s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.739135742s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.925019264s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.739135742s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.846148491s) [5,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660522461s@ mbc={}] start_peering_interval up [4,3,2] -> [5,4,0], acting [4,3,2] -> [5,4,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 
localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.e( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.824141502s) [3,4,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.638427734s@ mbc={}] start_peering_interval up [2,1,3] -> [3,4,2], acting [2,1,3] -> [3,4,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.e( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.824141502s) [3,4,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.638427734s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.846114159s) [5,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660522461s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.16( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.8( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.845313072s) [1,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660034180s@ mbc={}] start_peering_interval up [4,3,2] -> [1,0,5], acting [4,3,2] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.8( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.845283508s) [1,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660034180s@ mbc={}] state: transitioning to Stray Nov 23 
03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.8( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848176956s) [1,2,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662841797s@ mbc={}] start_peering_interval up [3,5,1] -> [1,2,3], acting [3,5,1] -> [1,2,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.17( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.825737000s) [5,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639526367s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.8( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848143578s) [1,2,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662841797s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.844613075s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.659545898s@ mbc={}] start_peering_interval up [4,3,2] -> [0,1,2], acting [4,3,2] -> [0,1,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.844579697s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.659545898s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.9( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.846802711s) [1,0,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 
1127.661865234s@ mbc={}] start_peering_interval up [3,5,1] -> [1,0,2], acting [3,5,1] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.9( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.846767426s) [1,0,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661865234s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.b( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848936081s) [0,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.664062500s@ mbc={}] start_peering_interval up [3,5,1] -> [0,2,4], acting [3,5,1] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.b( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.848896980s) [0,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.664062500s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.9( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.925550461s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.740722656s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.9( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.925510406s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 
1129.740722656s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.11( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,1,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.d( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816704750s) [1,5,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.632202148s@ mbc={}] start_peering_interval up [2,1,3] -> [1,5,3], acting [2,1,3] -> [1,5,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.19( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.d( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816674232s) [1,5,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.632202148s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.c( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.823850632s) [1,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639648438s@ mbc={}] start_peering_interval up [2,1,3] -> [1,0,5], acting [2,1,3] -> [1,0,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.b( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.844244003s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660156250s@ mbc={}] start_peering_interval 
up [4,3,2] -> [4,0,5], acting [4,3,2] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.c( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.823821068s) [1,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639648438s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.a( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847523689s) [2,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.663452148s@ mbc={}] start_peering_interval up [3,5,1] -> [2,0,1], acting [3,5,1] -> [2,0,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.b( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.844212532s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660156250s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.a( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.847485542s) [2,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.663452148s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.b( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.823812485s) [1,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639770508s@ mbc={}] start_peering_interval up [2,1,3] -> [1,5,0], acting [2,1,3] -> [1,5,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 
44 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843764305s) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.659790039s@ mbc={}] start_peering_interval up [4,3,2] -> [3,2,4], acting [4,3,2] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.d( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840842247s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.656860352s@ mbc={}] start_peering_interval up [3,5,1] -> [2,1,3], acting [3,5,1] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.b( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.823677063s) [1,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639770508s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.922594070s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.738769531s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.d( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840815544s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.656860352s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.f( v 35'39 (0'0,35'39] 
local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.922490120s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.738769531s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.d( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843998909s) [4,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660400391s@ mbc={}] start_peering_interval up [4,3,2] -> [4,5,0], acting [4,3,2] -> [4,5,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.a( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.823447227s) [1,3,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.639892578s@ mbc={}] start_peering_interval up [2,1,3] -> [1,3,2], acting [2,1,3] -> [1,3,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.d( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843973160s) [4,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660400391s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.a( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.823421478s) [1,3,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.639892578s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.14( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 
pg_epoch: 44 pg[2.9( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.824110985s) [3,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641113281s@ mbc={}] start_peering_interval up [2,1,3] -> [3,5,4], acting [2,1,3] -> [3,5,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.f( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840005875s) [3,4,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.656982422s@ mbc={}] start_peering_interval up [3,5,1] -> [3,4,5], acting [3,5,1] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.e( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843177795s) [4,0,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660156250s@ mbc={}] start_peering_interval up [4,3,2] -> [4,0,2], acting [4,3,2] -> [4,0,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.9( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.824110985s) [3,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.641113281s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.f( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840005875s) [3,4,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.656982422s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.e( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 
sis=44 pruub=8.843148232s) [4,0,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660156250s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.c( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839732170s) [5,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.656860352s@ mbc={}] start_peering_interval up [3,5,1] -> [5,3,1], acting [3,5,1] -> [5,3,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.c( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839672089s) [5,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.656860352s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.b( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,2,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.b( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.4( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.842562675s) [1,3,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660400391s@ mbc={}] start_peering_interval up [4,3,2] -> [1,3,5], acting [4,3,2] -> [1,3,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.3( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.822620392s) [5,3,1] r=1 lpr=44 
pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640502930s@ mbc={}] start_peering_interval up [2,1,3] -> [5,3,1], acting [2,1,3] -> [5,3,1], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.4( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.842535019s) [1,3,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660400391s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.3( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.822591782s) [5,3,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.640502930s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.3( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.5( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.922136307s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.740722656s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.5( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843249321s) [5,1,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661865234s@ mbc={}] start_peering_interval up [3,5,1] -> [5,1,0], acting [3,5,1] -> [5,1,0], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 
4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.5( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.922107697s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.740722656s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.2( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.5( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843209267s) [5,1,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661865234s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.841743469s) [5,3,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660522461s@ mbc={}] start_peering_interval up [4,3,2] -> [5,3,4], acting [4,3,2] -> [5,3,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.841720581s) [5,3,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660522461s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.6( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843566895s) [4,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662475586s@ mbc={}] start_peering_interval up [3,5,1] -> [4,3,2], acting [3,5,1] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 
4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.3( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.922024727s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.740844727s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.6( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843536377s) [4,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662475586s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.841497421s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660522461s@ mbc={}] start_peering_interval up [4,3,2] -> [2,3,1], acting [4,3,2] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.4( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,2,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.3( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.922002792s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.740844727s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.841475487s) [2,3,1] r=1 
lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660522461s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.6( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.821928024s) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641357422s@ mbc={}] start_peering_interval up [2,1,3] -> [3,1,5], acting [2,1,3] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.6( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.821928024s) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.641357422s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.7( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.820111275s) [5,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640014648s@ mbc={}] start_peering_interval up [2,1,3] -> [5,1,3], acting [2,1,3] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.842794418s) [4,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662597656s@ mbc={}] start_peering_interval up [3,5,1] -> [4,2,0], acting [3,5,1] -> [4,2,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.7( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.820067406s) [5,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.640014648s@ mbc={}] 
state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.6( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,5,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.842749596s) [4,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662597656s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.843764305s) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.659790039s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.2( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840354919s) [5,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660766602s@ mbc={}] start_peering_interval up [4,3,2] -> [5,0,1], acting [4,3,2] -> [5,0,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.5( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.820435524s) [1,0,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640869141s@ mbc={}] start_peering_interval up [2,1,3] -> [1,0,2], acting [2,1,3] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.2( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840326309s) [5,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660766602s@ 
mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.3( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.842167854s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662597656s@ mbc={}] start_peering_interval up [3,5,1] -> [1,5,3], acting [3,5,1] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.5( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.820395470s) [1,0,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.640869141s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.3( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.842133522s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662597656s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.4( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.819142342s) [3,4,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640014648s@ mbc={}] start_peering_interval up [2,1,3] -> [3,4,5], acting [2,1,3] -> [3,4,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839834213s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660766602s@ 
mbc={}] start_peering_interval up [4,3,2] -> [3,5,4], acting [4,3,2] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.7( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.842031479s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662963867s@ mbc={}] start_peering_interval up [3,5,1] -> [0,1,2], acting [3,5,1] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.819142342s) [3,4,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.640014648s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839834213s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.660766602s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.7( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.841982841s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662963867s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.919782639s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.740966797s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 
03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839818001s) [0,2,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661132812s@ mbc={}] start_peering_interval up [4,3,2] -> [0,2,1], acting [4,3,2] -> [0,2,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.919662476s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.740966797s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839791298s) [0,2,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661132812s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.2( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.818497658s) [5,0,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640014648s@ mbc={}] start_peering_interval up [2,1,3] -> [5,0,4], acting [2,1,3] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.2( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.818475723s) [5,0,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.640014648s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.4( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840746880s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 
0'0 active pruub 1127.662353516s@ mbc={}] start_peering_interval up [3,5,1] -> [0,1,2], acting [3,5,1] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.4( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840718269s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662353516s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.1f( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839011192s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660888672s@ mbc={}] start_peering_interval up [4,3,2] -> [0,5,1], acting [4,3,2] -> [0,5,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.838981628s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660888672s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.4( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.818644524s) [3,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640502930s@ mbc={}] start_peering_interval up [2,1,3] -> [3,1,2], acting [2,1,3] -> [3,1,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: 
osd.3 pg_epoch: 44 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.918968201s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.740844727s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.1( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.916552544s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.738525391s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.4( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.818644524s) [3,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.640502930s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.918909073s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.740844727s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.7( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.838333130s) [5,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 
1127.660644531s@ mbc={}] start_peering_interval up [4,3,2] -> [5,1,3], acting [4,3,2] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.838297844s) [5,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660644531s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.2( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.834403992s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.656494141s@ mbc={}] start_peering_interval up [3,5,1] -> [1,5,3], acting [3,5,1] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.8( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.818717957s) [2,1,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641235352s@ mbc={}] start_peering_interval up [2,1,3] -> [2,1,0], acting [2,1,3] -> [2,1,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[7.1( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=10.916527748s) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1129.738525391s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.e( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.834202766s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.656860352s@ mbc={}] start_peering_interval up 
[3,5,1] -> [2,4,0], acting [3,5,1] -> [2,4,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.e( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.834152222s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.656860352s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1f( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.840023041s) [4,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662719727s@ mbc={}] start_peering_interval up [3,5,1] -> [4,5,3], acting [3,5,1] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1f( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839991570s) [4,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662719727s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.19( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.817165375s) [3,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640014648s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.19( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.817165375s) [3,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.640014648s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 
pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837698936s) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660766602s@ mbc={}] start_peering_interval up [4,3,2] -> [3,2,4], acting [4,3,2] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837549210s) [0,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660644531s@ mbc={}] start_peering_interval up [4,3,2] -> [0,2,4], acting [4,3,2] -> [0,2,4], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1c( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839958191s) [1,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662963867s@ mbc={}] start_peering_interval up [3,5,1] -> [1,3,2], acting [3,5,1] -> [1,3,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1c( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839901924s) [1,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662963867s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.8( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.818294525s) [2,1,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641235352s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 
sis=44 pruub=8.837414742s) [0,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660644531s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1a( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.817717552s) [2,4,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641113281s@ mbc={}] start_peering_interval up [2,1,3] -> [2,4,3], acting [2,1,3] -> [2,4,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1a( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.817695618s) [2,4,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641113281s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837395668s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660766602s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,0], acting [4,3,2] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1b( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.817572594s) [1,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.641113281s@ mbc={}] start_peering_interval up [2,1,3] -> [1,2,3], acting [2,1,3] -> [1,2,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1d( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839689255s) [4,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 
0'0 active pruub 1127.663085938s@ mbc={}] start_peering_interval up [3,5,1] -> [4,5,3], acting [3,5,1] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837361336s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660766602s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1b( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.817535400s) [1,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.641113281s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1d( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.839632034s) [4,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.663085938s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837226868s) [2,0,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660888672s@ mbc={}] start_peering_interval up [4,3,2] -> [2,0,4], acting [4,3,2] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1a( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.838188171s) [2,3,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661987305s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,4], acting [3,5,1] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 
4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1c( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816642761s) [4,2,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640380859s@ mbc={}] start_peering_interval up [2,1,3] -> [4,2,0], acting [2,1,3] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1a( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.838162422s) [2,3,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661987305s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837698936s) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1127.660766602s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1c( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816599846s) [4,2,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.640380859s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1a( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.836881638s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660766602s@ mbc={}] start_peering_interval up [4,3,2] -> [1,5,3], acting [4,3,2] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1a( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.836853981s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 
unknown NOTIFY pruub 1127.660766602s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1b( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.838805199s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.662719727s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,1], acting [3,5,1] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.836980820s) [2,0,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.660888672s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.1b( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.838712692s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.662719727s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.836898804s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661010742s@ mbc={}] start_peering_interval up [4,3,2] -> [0,5,1], acting [4,3,2] -> [0,5,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.836870193s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661010742s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1e( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 
pruub=14.816055298s) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640258789s@ mbc={}] start_peering_interval up [2,1,3] -> [3,1,5], acting [2,1,3] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.18( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837521553s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661743164s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,1], acting [3,5,1] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1e( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816055298s) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.640258789s@ mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.18( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837487221s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661743164s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.836463928s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.660888672s@ mbc={}] start_peering_interval up [4,3,2] -> [2,1,3], acting [4,3,2] -> [2,1,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.836436272s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown 
NOTIFY pruub 1127.660888672s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.19( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837371826s) [1,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1127.661865234s@ mbc={}] start_peering_interval up [3,5,1] -> [1,3,2], acting [3,5,1] -> [1,3,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.19( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.837341309s) [1,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.661865234s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1f( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.815620422s) [0,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640136719s@ mbc={}] start_peering_interval up [2,1,3] -> [0,2,4], acting [2,1,3] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1d( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816117287s) [4,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1133.640869141s@ mbc={}] start_peering_interval up [2,1,3] -> [4,5,0], acting [2,1,3] -> [4,5,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:43 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.4( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 
pg_epoch: 44 pg[2.1d( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.816083908s) [4,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.640869141s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.7( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[2.1f( empty local-lis/les=38/39 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=14.815547943s) [0,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.640136719s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.c( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,2,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.b( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[4.2( empty local-lis/les=40/41 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=8.831289291s) [1,5,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.656494141s@ mbc={}] state: transitioning to Stray Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.f( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:00:43 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.1e( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown 
mbc={}] state: transitioning to Primary Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.16( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,3,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.1b( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.1e( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [4,5,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.1c( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [1,3,5] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.1b( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,0,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.1c( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.15( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,0,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.10( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost 
ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.a( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.8( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,0,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.c( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.9( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,0,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.e( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.e( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [5,3,4] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.c( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,0,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.9( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,4,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.2( empty local-lis/les=0/0 n=0 ec=42/31 
lis/c=42/42 les/c/f=43/43/0 sis=44) [5,3,4] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.5( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,1,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.7( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [5,3,4] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.2( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,0,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.2( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,0,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.5( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.16( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,1,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.1c( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.b( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 
unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.5( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,0,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.19( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [5,3,1] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.d( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.1( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [1,5,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.10( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,0,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.14( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,0,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.13( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,0,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.e( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,0,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost 
ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.1c( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,2,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.1d( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.1d( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.d( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,5,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[3.17( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.b( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,0,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.13( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,3,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[4.1e( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 
pg_epoch: 45 pg[3.12( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[6.9( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [0,1,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.1( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,2,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[5.3( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.8( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,1,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.f( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,2,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[5.19( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[4.a( empty local-lis/les=0/0 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,0,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost 
ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.8( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[6.16( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [0,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[3.1f( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[2.14( empty local-lis/les=0/0 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,4,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.11( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 44 pg[5.10( empty local-lis/les=0/0 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[5.1e( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[3.19( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 
03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[6.10( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [0,1,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[4.12( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[4.16( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[5.a( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[4.7( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[4.b( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[3.1( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[5.17( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,1,5] 
r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[3.b( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[2.6( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.6( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,5,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[3.18( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[2.1e( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[4.4( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[5.5( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost 
ceph-osd[31905]: osd.0 pg_epoch: 45 pg[3.6( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[2.1f( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[31905]: osd.0 pg_epoch: 45 pg[6.18( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [0,2,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[6.d( empty local-lis/les=0/0 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,3,1] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.b( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,2,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[3.7( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.11( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,1,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.4( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,2,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 
active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[3.4( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[2.4( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.9( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 44 pg[3.1a( empty local-lis/les=0/0 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,3,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.f( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.14( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.13( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.c( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 
les/c/f=43/43/0 sis=44) [3,2,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[3.2( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.1d( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[3.1e( empty local-lis/les=44/45 n=0 ec=38/21 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[6.1f( empty local-lis/les=44/45 n=0 ec=42/31 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[5.1d( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[2.19( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[5.6( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 
23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[4.f( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,4,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[2.1( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[5.c( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[2.e( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[4.10( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,4,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[4.17( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[2.9( empty local-lis/les=44/45 n=0 ec=38/19 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[5.14( empty local-lis/les=44/45 n=0 ec=40/25 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,4,5] 
r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:44 localhost ceph-osd[32858]: osd.3 pg_epoch: 45 pg[4.11( empty local-lis/les=44/45 n=0 ec=40/23 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:45 localhost ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.2( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.886891365s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.738647461s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:45 localhost ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.2( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.886891365s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1129.738647461s@ mbc={}] state: transitioning to Primary Nov 23 03:00:45 localhost ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.6( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.886839867s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.738647461s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:45 localhost ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.6( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.886839867s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1129.738647461s@ mbc={}] state: transitioning to Primary Nov 23 03:00:45 localhost ceph-osd[32858]: 
osd.3 pg_epoch: 46 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.886330605s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.738891602s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:45 localhost ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.886330605s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1129.738891602s@ mbc={}] state: transitioning to Primary Nov 23 03:00:45 localhost ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.885553360s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1129.738525391s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:00:45 localhost ceph-osd[32858]: osd.3 pg_epoch: 46 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=8.885553360s) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1129.738525391s@ mbc={}] state: transitioning to Primary Nov 23 03:00:46 localhost ceph-osd[32858]: osd.3 pg_epoch: 47 pg[7.2( v 35'39 (0'0,35'39] local-lis/les=46/47 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:46 localhost ceph-osd[32858]: osd.3 pg_epoch: 47 pg[7.6( v 35'39 (0'0,35'39] local-lis/les=46/47 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=0 
lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:46 localhost ceph-osd[32858]: osd.3 pg_epoch: 47 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:46 localhost ceph-osd[32858]: osd.3 pg_epoch: 47 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=0 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 23 03:00:48 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.19 scrub starts Nov 23 03:00:50 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.1 scrub starts Nov 23 03:00:52 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.6 deep-scrub starts Nov 23 03:00:52 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.19 scrub starts Nov 23 03:00:53 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.e deep-scrub starts Nov 23 03:00:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:00:54 localhost podman[55651]: 2025-11-23 08:00:54.002010845 +0000 UTC m=+0.065936750 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step1, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd)
Nov 23 03:00:54 localhost podman[55651]: 2025-11-23 08:00:54.226085776 +0000 UTC m=+0.290011701 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Nov 23 03:00:54 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 03:00:54 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.6 deep-scrub ok
Nov 23 03:00:55 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.c deep-scrub starts
Nov 23 03:00:55 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.c deep-scrub ok
Nov 23 03:00:55 localhost ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.981581688s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1143.866577148s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:00:55 localhost ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.981581688s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1143.866577148s@ mbc={}] state: transitioning to Primary
Nov 23 03:00:55 localhost ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.3( v 35'39 (0'0,35'39] local-lis/les=44/45 n=2 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.978737831s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1143.864257812s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:00:55 localhost ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.3( v 35'39 (0'0,35'39] local-lis/les=44/45 n=2 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.978737831s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1143.864257812s@ mbc={}] state: transitioning to Primary
Nov 23 03:00:55 localhost ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.980515480s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1143.866577148s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:00:55 localhost ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.980515480s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1143.866577148s@ mbc={}] state: transitioning to Primary
Nov 23 03:00:55 localhost ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.980977058s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1143.867309570s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:00:55 localhost ceph-osd[32858]: osd.3 pg_epoch: 48 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48 pruub=12.980977058s) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1143.867309570s@ mbc={}] state: transitioning to Primary
Nov 23 03:00:56 localhost ceph-osd[32858]: osd.3 pg_epoch: 49 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=1}}] state: react AllReplicasActivated Activating complete
Nov 23 03:00:56 localhost ceph-osd[32858]: osd.3 pg_epoch: 49 pg[7.3( v 35'39 (0'0,35'39] local-lis/les=48/49 n=2 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=2}}] state: react AllReplicasActivated Activating complete
Nov 23 03:00:56 localhost ceph-osd[32858]: osd.3 pg_epoch: 49 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=1}}] state: react AllReplicasActivated Activating complete
Nov 23 03:00:56 localhost ceph-osd[32858]: osd.3 pg_epoch: 49 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=48) [3,2,4] r=0 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=3}}] state: react AllReplicasActivated Activating complete
Nov 23 03:00:57 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Nov 23 03:00:57 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:72:a3:51 MACPROTO=0800 SRC=193.163.125.189 DST=38.102.83.198 LEN=44 TOS=0x00 PREC=0x00 TTL=244 ID=10013 PROTO=TCP SPT=34503 DPT=9090 SEQ=2446406591 ACK=0 WINDOW=14600 RES=0x00 SYN URGP=0 OPT (020405B4)
Nov 23 03:00:58 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Nov 23 03:00:58 localhost python3[55696]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:00:59 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Nov 23 03:00:59 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Nov 23 03:00:59 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Nov 23 03:00:59 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Nov 23 03:01:00 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.16 scrub starts
Nov 23 03:01:00 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.16 scrub ok
Nov 23 03:01:00 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Nov 23 03:01:00 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Nov 23 03:01:00 localhost python3[55712]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:01:02 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Nov 23 03:01:02 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Nov 23 03:01:02 localhost python3[55739]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:01:02 localhost ceph-osd[32858]: osd.3 pg_epoch: 50 pg[7.4( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.724631310s) [0,5,4] r=-1 lpr=50 pi=[42,50)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1153.739013672s@ mbc={}] start_peering_interval up [5,1,3] -> [0,5,4], acting [5,1,3] -> [0,5,4], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:01:02 localhost ceph-osd[32858]: osd.3 pg_epoch: 50 pg[7.4( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.724288940s) [0,5,4] r=-1 lpr=50 pi=[42,50)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1153.739013672s@ mbc={}] state: transitioning to Stray
Nov 23 03:01:02 localhost ceph-osd[32858]: osd.3 pg_epoch: 50 pg[7.c( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.723724365s) [0,5,4] r=-1 lpr=50 pi=[42,50)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1153.738647461s@ mbc={}] start_peering_interval up [5,1,3] -> [0,5,4], acting [5,1,3] -> [0,5,4], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:01:02 localhost ceph-osd[32858]: osd.3 pg_epoch: 50 pg[7.c( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.723650932s) [0,5,4] r=-1 lpr=50 pi=[42,50)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1153.738647461s@ mbc={}] state: transitioning to Stray
Nov 23 03:01:02 localhost ceph-osd[31905]: osd.0 pg_epoch: 50 pg[7.c( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50) [0,5,4] r=0 lpr=50 pi=[42,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:01:02 localhost ceph-osd[31905]: osd.0 pg_epoch: 50 pg[7.4( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50) [0,5,4] r=0 lpr=50 pi=[42,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:01:03 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Nov 23 03:01:03 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Nov 23 03:01:03 localhost ceph-osd[31905]: osd.0 pg_epoch: 51 pg[7.c( v 35'39 lc 35'16 (0'0,35'39] local-lis/les=50/51 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50) [0,5,4] r=0 lpr=50 pi=[42,50)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state: react AllReplicasActivated Activating complete
Nov 23 03:01:03 localhost ceph-osd[31905]: osd.0 pg_epoch: 51 pg[7.4( v 35'39 lc 35'15 (0'0,35'39] local-lis/les=50/51 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=50) [0,5,4] r=0 lpr=50 pi=[42,50)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded m=4 mbc={255={(1+2)=4}}] state: react AllReplicasActivated Activating complete
Nov 23 03:01:04 localhost ceph-osd[32858]: osd.3 pg_epoch: 52 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=52 pruub=11.803336143s) [4,0,2] r=-1 lpr=52 pi=[44,52)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1151.865112305s@ mbc={}] start_peering_interval up [2,1,3] -> [4,0,2], acting [2,1,3] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:01:04 localhost ceph-osd[32858]: osd.3 pg_epoch: 52 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=52 pruub=11.803246498s) [4,0,2] r=-1 lpr=52 pi=[44,52)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1151.865112305s@ mbc={}] state: transitioning to Stray
Nov 23 03:01:04 localhost ceph-osd[32858]: osd.3 pg_epoch: 52 pg[7.5( v 35'39 (0'0,35'39] local-lis/les=44/45 n=2 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=52 pruub=11.803723335s) [4,0,2] r=-1 lpr=52 pi=[44,52)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1151.866088867s@ mbc={}] start_peering_interval up [2,1,3] -> [4,0,2], acting [2,1,3] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:01:04 localhost ceph-osd[32858]: osd.3 pg_epoch: 52 pg[7.5( v 35'39 (0'0,35'39] local-lis/les=44/45 n=2 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=52 pruub=11.803501129s) [4,0,2] r=-1 lpr=52 pi=[44,52)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1151.866088867s@ mbc={}] state: transitioning to Stray
Nov 23 03:01:05 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.10 deep-scrub starts
Nov 23 03:01:05 localhost python3[55787]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:01:05 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.10 deep-scrub ok
Nov 23 03:01:05 localhost python3[55830]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884864.877191-92282-189973165291993/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=5f137984986c8cf5df5aec7749430e0dc129d0db backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:01:05 localhost ceph-osd[31905]: osd.0 pg_epoch: 52 pg[7.d( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=52) [4,0,2] r=1 lpr=52 pi=[44,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Nov 23 03:01:05 localhost ceph-osd[31905]: osd.0 pg_epoch: 52 pg[7.5( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=44/44 les/c/f=45/47/0 sis=52) [4,0,2] r=1 lpr=52 pi=[44,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Nov 23 03:01:06 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.17 scrub starts
Nov 23 03:01:06 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.17 scrub ok
Nov 23 03:01:07 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.1f deep-scrub starts
Nov 23 03:01:07 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.1f deep-scrub ok
Nov 23 03:01:07 localhost ceph-osd[32858]: osd.3 pg_epoch: 54 pg[7.6( v 35'39 (0'0,35'39] local-lis/les=46/47 n=2 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=11.279025078s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1153.944335938s@ mbc={}] start_peering_interval up [3,1,5] -> [0,2,4], acting [3,1,5] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:01:07 localhost ceph-osd[32858]: osd.3 pg_epoch: 54 pg[7.6( v 35'39 (0'0,35'39] local-lis/les=46/47 n=2 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=11.278964996s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1153.944335938s@ mbc={}] state: transitioning to Stray
Nov 23 03:01:07 localhost ceph-osd[32858]: osd.3 pg_epoch: 54 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=11.355095863s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1154.021118164s@ mbc={}] start_peering_interval up [3,1,5] -> [0,2,4], acting [3,1,5] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:01:07 localhost ceph-osd[32858]: osd.3 pg_epoch: 54 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=11.354991913s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1154.021118164s@ mbc={}] state: transitioning to Stray
Nov 23 03:01:07 localhost ceph-osd[31905]: osd.0 pg_epoch: 54 pg[7.e( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54) [0,2,4] r=0 lpr=54 pi=[46,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:01:07 localhost ceph-osd[31905]: osd.0 pg_epoch: 54 pg[7.6( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54) [0,2,4] r=0 lpr=54 pi=[46,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:01:08 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.17 deep-scrub starts
Nov 23 03:01:08 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.17 deep-scrub ok
Nov 23 03:01:08 localhost ceph-osd[31905]: osd.0 pg_epoch: 55 pg[7.6( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=54/55 n=2 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54) [0,2,4] r=0 lpr=54 pi=[46,54)/1 crt=35'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+3)=1}}] state: react AllReplicasActivated Activating complete
Nov 23 03:01:08 localhost ceph-osd[31905]: osd.0 pg_epoch: 55 pg[7.e( v 35'39 lc 35'17 (0'0,35'39] local-lis/les=54/55 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=54) [0,2,4] r=0 lpr=54 pi=[46,54)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+3)=1}}] state: react AllReplicasActivated Activating complete
Nov 23 03:01:09 localhost ceph-osd[32858]: osd.3 pg_epoch: 56 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=11.385271072s) [2,1,3] r=2 lpr=56 pi=[48,56)/1 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1156.089965820s@ mbc={255={}}] start_peering_interval up [3,2,4] -> [2,1,3], acting [3,2,4] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:01:09 localhost ceph-osd[32858]: osd.3 pg_epoch: 56 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=11.385170937s) [2,1,3] r=2 lpr=56 pi=[48,56)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1156.089965820s@ mbc={}] state: transitioning to Stray
Nov 23 03:01:09 localhost ceph-osd[32858]: osd.3 pg_epoch: 56 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=11.384820938s) [2,1,3] r=2 lpr=56 pi=[48,56)/1 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1156.089965820s@ mbc={255={}}] start_peering_interval up [3,2,4] -> [2,1,3], acting [3,2,4] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:01:09 localhost ceph-osd[32858]: osd.3 pg_epoch: 56 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=11.384694099s) [2,1,3] r=2 lpr=56 pi=[48,56)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1156.089965820s@ mbc={}] state: transitioning to Stray
Nov 23 03:01:10 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Nov 23 03:01:10 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Nov 23 03:01:10 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.14 deep-scrub starts
Nov 23 03:01:10 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.14 deep-scrub ok
Nov 23 03:01:10 localhost python3[55892]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:01:10 localhost python3[55935]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884870.2241414-92282-101044522076672/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=8a18e979d41caf333cb312628abb5051e6d0049c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:01:11 localhost ceph-osd[32858]: osd.3 pg_epoch: 58 pg[7.8( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=58 pruub=15.010090828s) [3,2,1] r=0 lpr=58 pi=[42,58)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1161.739379883s@ mbc={}] start_peering_interval up [5,1,3] -> [3,2,1], acting [5,1,3] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:01:11 localhost ceph-osd[32858]: osd.3 pg_epoch: 58 pg[7.8( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=58 pruub=15.010090828s) [3,2,1] r=0 lpr=58 pi=[42,58)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1161.739379883s@ mbc={}] state: transitioning to Primary
Nov 23 03:01:12 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Nov 23 03:01:12 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Nov 23 03:01:12 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.14 deep-scrub starts
Nov 23 03:01:12 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.14 deep-scrub ok
Nov 23 03:01:12 localhost ceph-osd[32858]: osd.3 pg_epoch: 59 pg[7.8( v 35'39 (0'0,35'39] local-lis/les=58/59 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=58) [3,2,1] r=0 lpr=58 pi=[42,58)/1 crt=35'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 23 03:01:13 localhost ceph-osd[32858]: osd.3 pg_epoch: 60 pg[7.9( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=60 pruub=11.104165077s) [0,4,2] r=-1 lpr=60 pi=[44,60)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1159.868652344s@ mbc={}] start_peering_interval up [2,1,3] -> [0,4,2], acting [2,1,3] -> [0,4,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:01:13 localhost ceph-osd[32858]: osd.3 pg_epoch: 60 pg[7.9( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=60 pruub=11.103945732s) [0,4,2] r=-1 lpr=60 pi=[44,60)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1159.868652344s@ mbc={}] state: transitioning to Stray
Nov 23 03:01:13 localhost ceph-osd[31905]: osd.0 pg_epoch: 60 pg[7.9( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=60) [0,4,2] r=0 lpr=60 pi=[44,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 23 03:01:14 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Nov 23 03:01:14 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Nov 23 03:01:14 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Nov 23 03:01:14 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Nov 23 03:01:14 localhost ceph-osd[31905]: osd.0 pg_epoch: 61 pg[7.9( v 35'39 (0'0,35'39] local-lis/les=60/61 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=60) [0,4,2] r=0 lpr=60 pi=[44,60)/1 crt=35'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 23 03:01:15 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.9 deep-scrub starts
Nov 23 03:01:15 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.9 deep-scrub ok
Nov 23 03:01:15 localhost ceph-osd[32858]: osd.3 pg_epoch: 62 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=62 pruub=11.047682762s) [4,0,5] r=-1 lpr=62 pi=[46,62)/1 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1161.944580078s@ mbc={}] start_peering_interval up [3,1,5] -> [4,0,5], acting [3,1,5] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:01:15 localhost ceph-osd[32858]: osd.3 pg_epoch: 62 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=62 pruub=11.047437668s) [4,0,5] r=-1 lpr=62 pi=[46,62)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1161.944580078s@ mbc={}] state: transitioning to Stray
Nov 23 03:01:15 localhost python3[55998]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:01:16 localhost python3[56041]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884875.3758018-92282-218969318650132/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=ae43e71821d6a319ccba3331b262b98567ce770b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:01:16 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.11 scrub starts
Nov 23 03:01:16 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.11 scrub ok
Nov 23 03:01:17 localhost ceph-osd[31905]: osd.0 pg_epoch: 62 pg[7.a( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=62) [4,0,5] r=1 lpr=62 pi=[46,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Nov 23 03:01:18 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.1e deep-scrub starts
Nov 23 03:01:18 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.1e deep-scrub ok
Nov 23 03:01:19 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.b scrub starts
Nov 23 03:01:19 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.b scrub ok
Nov 23 03:01:20 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.18 deep-scrub starts
Nov 23 03:01:20 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 6.18 deep-scrub ok
Nov 23 03:01:21 localhost python3[56103]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:01:21 localhost python3[56148]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884881.2254841-92640-206212653887107/source _original_basename=tmpj1qt_tov follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:01:22 localhost python3[56210]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:01:23 localhost python3[56253]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884882.685829-92819-36251608125809/source _original_basename=tmp0gqx0hk1 follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:01:23 localhost ceph-osd[31905]: osd.0 pg_epoch: 64 pg[7.c( v 35'39 (0'0,35'39] local-lis/les=50/51 n=1 ec=42/33 lis/c=50/50 les/c/f=51/51/0 sis=64 pruub=12.218671799s) [2,3,4] r=-1 lpr=64 pi=[50,64)/1 crt=35'39 mlcod 0'0 active pruub 1174.969726562s@ mbc={255={}}] start_peering_interval up [0,5,4] -> [2,3,4], acting [0,5,4] -> [2,3,4], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 23 03:01:23 localhost ceph-osd[31905]: osd.0 pg_epoch: 64 pg[7.c( v 35'39 (0'0,35'39] local-lis/les=50/51 n=1 ec=42/33 lis/c=50/50 les/c/f=51/51/0 sis=64 pruub=12.218530655s) [2,3,4] r=-1 lpr=64 pi=[50,64)/1 crt=35'39 mlcod 0'0 unknown NOTIFY pruub 1174.969726562s@ mbc={}] state: transitioning to Stray
Nov 23 03:01:23 localhost python3[56283]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None
Nov 23 03:01:24 localhost python3[56301]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 03:01:24 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Nov 23 03:01:24 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Nov 23 03:01:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 03:01:24 localhost podman[56351]: 2025-11-23 08:01:24.605757137 +0000 UTC m=+0.075081507 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-qdrouterd)
Nov 23 03:01:24 localhost ceph-osd[32858]: osd.3 pg_epoch: 64 pg[7.c( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=50/50 les/c/f=51/51/0 sis=64) [2,3,4] r=1 lpr=64 pi=[50,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Nov 23 03:01:24 localhost podman[56351]: 2025-11-23 08:01:24.773954578 +0000 UTC m=+0.243278918 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, container_name=metrics_qdr, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 03:01:24 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 03:01:25 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Nov 23 03:01:25 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Nov 23 03:01:25 localhost ansible-async_wrapper.py[56503]: Invoked with 660982539111 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763884885.1060352-92926-116155042352680/AnsiballZ_command.py _
Nov 23 03:01:25 localhost ansible-async_wrapper.py[56506]: Starting module and watcher
Nov 23 03:01:25 localhost ansible-async_wrapper.py[56506]: Start watching 56507 (3600)
Nov 23 03:01:25 localhost ansible-async_wrapper.py[56507]: Start module (56507)
Nov 23 03:01:25 localhost ansible-async_wrapper.py[56503]: Return async_wrapper task started.
Nov 23 03:01:25 localhost ceph-osd[31905]: osd.0 pg_epoch: 66 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=52/53 n=1 ec=42/33 lis/c=52/52 les/c/f=53/53/0 sis=66 pruub=12.301605225s) [2,3,1] r=-1 lpr=66 pi=[52,66)/1 luod=0'0 crt=35'39 mlcod 0'0 active pruub 1177.067016602s@ mbc={}] start_peering_interval up [4,0,2] -> [2,3,1], acting [4,0,2] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:01:25 localhost ceph-osd[31905]: osd.0 pg_epoch: 66 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=52/53 n=1 ec=42/33 lis/c=52/52 les/c/f=53/53/0 sis=66 pruub=12.301326752s) [2,3,1] r=-1 lpr=66 pi=[52,66)/1 crt=35'39 mlcod 0'0 unknown NOTIFY pruub 1177.067016602s@ mbc={}] state: transitioning to Stray Nov 23 03:01:25 localhost python3[56527]: ansible-ansible.legacy.async_status Invoked with jid=660982539111.56503 mode=status _async_dir=/tmp/.ansible_async Nov 23 03:01:26 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.f scrub starts Nov 23 03:01:26 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 4.f scrub ok Nov 23 03:01:26 localhost ceph-osd[32858]: osd.3 pg_epoch: 66 pg[7.d( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=52/52 les/c/f=53/53/0 sis=66) [2,3,1] r=1 lpr=66 pi=[52,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 23 03:01:27 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.4 scrub starts Nov 23 03:01:27 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.4 scrub ok Nov 23 03:01:27 localhost ceph-osd[32858]: osd.3 pg_epoch: 68 pg[7.e( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=54/54 les/c/f=55/55/0 sis=68) [3,1,5] r=0 lpr=68 pi=[54,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:01:27 localhost ceph-osd[31905]: osd.0 pg_epoch: 68 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=54/55 n=1 ec=42/33 lis/c=54/54 les/c/f=55/55/0 sis=68 pruub=12.778108597s) [3,1,5] r=-1 lpr=68 pi=[54,68)/1 
crt=35'39 mlcod 35'39 active pruub 1179.634155273s@ mbc={255={}}] start_peering_interval up [0,2,4] -> [3,1,5], acting [0,2,4] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:01:27 localhost ceph-osd[31905]: osd.0 pg_epoch: 68 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=54/55 n=1 ec=42/33 lis/c=54/54 les/c/f=55/55/0 sis=68 pruub=12.778008461s) [3,1,5] r=-1 lpr=68 pi=[54,68)/1 crt=35'39 mlcod 0'0 unknown NOTIFY pruub 1179.634155273s@ mbc={}] state: transitioning to Stray Nov 23 03:01:28 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.7 scrub starts Nov 23 03:01:28 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.7 scrub ok Nov 23 03:01:28 localhost ceph-osd[32858]: osd.3 pg_epoch: 69 pg[7.e( v 35'39 lc 35'17 (0'0,35'39] local-lis/les=68/69 n=1 ec=42/33 lis/c=54/54 les/c/f=55/55/0 sis=68) [3,1,5] r=0 lpr=68 pi=[54,68)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+3)=1}}] state: react AllReplicasActivated Activating complete Nov 23 03:01:29 localhost puppet-user[56525]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 23 03:01:29 localhost puppet-user[56525]: (file: /etc/puppet/hiera.yaml) Nov 23 03:01:29 localhost puppet-user[56525]: Warning: Undefined variable '::deploy_config_name'; Nov 23 03:01:29 localhost puppet-user[56525]: (file & line not available) Nov 23 03:01:29 localhost puppet-user[56525]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 23 03:01:29 localhost puppet-user[56525]: (file & line not available) Nov 23 03:01:29 localhost puppet-user[56525]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Nov 23 03:01:29 localhost puppet-user[56525]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Nov 23 03:01:29 localhost puppet-user[56525]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.12 seconds Nov 23 03:01:29 localhost puppet-user[56525]: Notice: Applied catalog in 0.03 seconds Nov 23 03:01:29 localhost puppet-user[56525]: Application: Nov 23 03:01:29 localhost puppet-user[56525]: Initial environment: production Nov 23 03:01:29 localhost puppet-user[56525]: Converged environment: production Nov 23 03:01:29 localhost puppet-user[56525]: Run mode: user Nov 23 03:01:29 localhost puppet-user[56525]: Changes: Nov 23 03:01:29 localhost puppet-user[56525]: Events: Nov 23 03:01:29 localhost puppet-user[56525]: Resources: Nov 23 03:01:29 localhost puppet-user[56525]: Total: 10 Nov 23 03:01:29 localhost puppet-user[56525]: Time: Nov 23 03:01:29 localhost puppet-user[56525]: Schedule: 0.00 Nov 23 03:01:29 localhost puppet-user[56525]: File: 0.00 Nov 23 03:01:29 localhost puppet-user[56525]: Exec: 0.01 Nov 23 03:01:29 localhost puppet-user[56525]: Augeas: 0.01 Nov 23 03:01:29 localhost puppet-user[56525]: Transaction evaluation: 0.03 Nov 23 03:01:29 localhost puppet-user[56525]: Catalog application: 0.03 Nov 23 03:01:29 localhost puppet-user[56525]: Config retrieval: 0.15 Nov 23 03:01:29 localhost puppet-user[56525]: Last run: 1763884889 Nov 23 03:01:29 localhost puppet-user[56525]: Filebucket: 0.00 Nov 23 03:01:29 localhost puppet-user[56525]: Total: 0.04 Nov 23 03:01:29 localhost puppet-user[56525]: Version: Nov 23 03:01:29 localhost puppet-user[56525]: Config: 1763884889 Nov 23 03:01:29 localhost puppet-user[56525]: Puppet: 7.10.0 Nov 23 03:01:29 localhost ansible-async_wrapper.py[56507]: Module complete (56507) Nov 23 03:01:29 localhost 
ceph-osd[32858]: osd.3 pg_epoch: 70 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=56/57 n=1 ec=42/33 lis/c=56/56 les/c/f=57/57/0 sis=70 pruub=13.056187630s) [0,4,5] r=-1 lpr=70 pi=[56,70)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1178.083862305s@ mbc={}] start_peering_interval up [2,1,3] -> [0,4,5], acting [2,1,3] -> [0,4,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 23 03:01:29 localhost ceph-osd[32858]: osd.3 pg_epoch: 70 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=56/57 n=1 ec=42/33 lis/c=56/56 les/c/f=57/57/0 sis=70 pruub=13.055901527s) [0,4,5] r=-1 lpr=70 pi=[56,70)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1178.083862305s@ mbc={}] state: transitioning to Stray Nov 23 03:01:29 localhost ceph-osd[31905]: osd.0 pg_epoch: 70 pg[7.f( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=56/56 les/c/f=57/57/0 sis=70) [0,4,5] r=0 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 23 03:01:30 localhost ansible-async_wrapper.py[56506]: Done in kid B. 
Nov 23 03:01:30 localhost ceph-osd[31905]: osd.0 pg_epoch: 71 pg[7.f( v 35'39 lc 35'1 (0'0,35'39] local-lis/les=70/71 n=1 ec=42/33 lis/c=56/56 les/c/f=57/57/0 sis=70) [0,4,5] r=0 lpr=70 pi=[56,70)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+3)=3}}] state: react AllReplicasActivated Activating complete Nov 23 03:01:31 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.1e scrub starts Nov 23 03:01:31 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 4.1e scrub ok Nov 23 03:01:32 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.1e scrub starts Nov 23 03:01:32 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.1e scrub ok Nov 23 03:01:33 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.6 scrub starts Nov 23 03:01:35 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.f deep-scrub starts Nov 23 03:01:35 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.f deep-scrub ok Nov 23 03:01:36 localhost python3[56730]: ansible-ansible.legacy.async_status Invoked with jid=660982539111.56503 mode=status _async_dir=/tmp/.ansible_async Nov 23 03:01:36 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.b deep-scrub starts Nov 23 03:01:36 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.b deep-scrub ok Nov 23 03:01:36 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.a scrub starts Nov 23 03:01:36 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.a scrub ok Nov 23 03:01:36 localhost python3[56746]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 23 03:01:37 localhost 
ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.13 scrub starts Nov 23 03:01:37 localhost python3[56762]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:01:37 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.5 scrub starts Nov 23 03:01:37 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.13 scrub ok Nov 23 03:01:37 localhost python3[56812]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:01:38 localhost python3[56830]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpnb1s9nrx recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 23 03:01:38 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.19 deep-scrub starts Nov 23 03:01:38 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.19 deep-scrub ok Nov 23 03:01:38 localhost python3[56860]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:01:39 localhost python3[56964]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ 
dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Nov 23 03:01:40 localhost python3[56983]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:01:41 localhost python3[57015]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:01:41 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.c scrub starts Nov 23 03:01:41 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.c scrub ok Nov 23 03:01:41 localhost python3[57065]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:01:41 localhost python3[57083]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None 
modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:01:42 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.1e scrub starts Nov 23 03:01:42 localhost python3[57145]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:01:42 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.1e scrub ok Nov 23 03:01:42 localhost python3[57163]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:01:43 localhost python3[57225]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:01:43 localhost python3[57243]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:01:43 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.3 scrub starts Nov 23 03:01:43 localhost 
ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.3 scrub ok Nov 23 03:01:43 localhost python3[57305]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:01:44 localhost python3[57323]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:01:44 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.2 scrub starts Nov 23 03:01:44 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.2 scrub ok Nov 23 03:01:44 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.4 deep-scrub starts Nov 23 03:01:44 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.4 deep-scrub ok Nov 23 03:01:44 localhost python3[57353]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:01:44 localhost systemd[1]: Reloading. Nov 23 03:01:44 localhost systemd-rc-local-generator[57375]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:01:44 localhost systemd-sysv-generator[57381]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 03:01:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:01:45 localhost python3[57438]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:01:45 localhost python3[57456]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:01:46 localhost python3[57518]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:01:46 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.6 scrub starts Nov 23 03:01:46 localhost python3[57536]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:01:46 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.6 scrub ok Nov 23 03:01:46 localhost 
python3[57566]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:01:46 localhost systemd[1]: Reloading. Nov 23 03:01:47 localhost systemd-rc-local-generator[57587]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:01:47 localhost systemd-sysv-generator[57590]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:01:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:01:47 localhost systemd[1]: Starting Create netns directory... Nov 23 03:01:47 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 23 03:01:47 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 23 03:01:47 localhost systemd[1]: Finished Create netns directory. 
Nov 23 03:01:47 localhost python3[57623]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Nov 23 03:01:48 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.6 scrub starts Nov 23 03:01:48 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.6 scrub ok Nov 23 03:01:48 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.9 scrub starts Nov 23 03:01:48 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.9 scrub ok Nov 23 03:01:49 localhost python3[57682]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Nov 23 03:01:49 localhost podman[57754]: 2025-11-23 08:01:49.467780956 +0000 UTC m=+0.075042696 container create 162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute_init_log, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
name=rhosp17/openstack-nova-compute, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4) Nov 23 03:01:49 localhost podman[57761]: 2025-11-23 08:01:49.500796988 +0000 UTC m=+0.096265255 container create 6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, config_id=tripleo_step2, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-libvirt, release=1761123044, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=nova_virtqemud_init_logs, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1) Nov 23 03:01:49 localhost systemd[1]: Started libpod-conmon-162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175.scope. Nov 23 03:01:49 localhost systemd[1]: Started libpod-conmon-6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a.scope. Nov 23 03:01:49 localhost podman[57754]: 2025-11-23 08:01:49.432494988 +0000 UTC m=+0.039756728 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 23 03:01:49 localhost podman[57761]: 2025-11-23 08:01:49.43635206 +0000 UTC m=+0.031820317 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:01:49 localhost systemd[1]: Started libcrun container. Nov 23 03:01:49 localhost systemd[1]: Started libcrun container. 
Nov 23 03:01:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff643707b40d1f4d028ddd9677b38d4c09285d4aaee2a2759e81caea717feb00/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Nov 23 03:01:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fdd0c049f9b5a6a5bc490e6d508b34f697b7d4ec5cca2012ac4af136c1e57a22/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Nov 23 03:01:49 localhost podman[57754]: 2025-11-23 08:01:49.55544624 +0000 UTC m=+0.162707980 container init 162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, config_id=tripleo_step2, container_name=nova_compute_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container) Nov 23 03:01:49 localhost podman[57754]: 2025-11-23 08:01:49.564151263 +0000 UTC m=+0.171413003 container start 162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:36:58Z, container_name=nova_compute_init_log, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public) Nov 23 03:01:49 localhost python3[57682]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1763883761 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova Nov 23 03:01:49 localhost systemd[1]: libpod-162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175.scope: Deactivated successfully. 
Nov 23 03:01:49 localhost podman[57761]: 2025-11-23 08:01:49.607105934 +0000 UTC m=+0.202574191 container init 6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, tcib_managed=true, container_name=nova_virtqemud_init_logs, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step2, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044) Nov 23 03:01:49 localhost podman[57761]: 
2025-11-23 08:01:49.614101469 +0000 UTC m=+0.209569756 container start 6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step2, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtqemud_init_logs, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:01:49 localhost python3[57682]: ansible-tripleo_container_manage 
PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1763883761 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm Nov 23 03:01:49 localhost systemd[1]: libpod-6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a.scope: Deactivated successfully. 
Nov 23 03:01:49 localhost podman[57790]: 2025-11-23 08:01:49.640986771 +0000 UTC m=+0.059970207 container died 162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, container_name=nova_compute_init_log, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step2, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:01:49 localhost podman[57814]: 2025-11-23 08:01:49.679991507 +0000 UTC m=+0.043124386 container died 
6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, config_id=tripleo_step2, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, container_name=nova_virtqemud_init_logs, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}) Nov 23 03:01:49 localhost podman[57791]: 2025-11-23 08:01:49.710377112 +0000 UTC m=+0.127374610 container cleanup 
162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vendor=Red Hat, Inc., config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_compute_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step2, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-19T00:36:58Z) Nov 23 03:01:49 localhost systemd[1]: libpod-conmon-162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175.scope: Deactivated successfully. 
Nov 23 03:01:49 localhost podman[57814]: 2025-11-23 08:01:49.750307535 +0000 UTC m=+0.113440374 container cleanup 6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step2, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, container_name=nova_virtqemud_init_logs, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com) Nov 23 03:01:49 localhost systemd[1]: 
libpod-conmon-6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a.scope: Deactivated successfully. Nov 23 03:01:50 localhost podman[57932]: 2025-11-23 08:01:50.036197453 +0000 UTC m=+0.058274319 container create 1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=create_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container) Nov 23 03:01:50 localhost systemd[1]: Started libpod-conmon-1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f.scope. Nov 23 03:01:50 localhost podman[57947]: 2025-11-23 08:01:50.072905602 +0000 UTC m=+0.070193286 container create 76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, version=17.1.12, container_name=create_haproxy_wrapper, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.buildah.version=1.41.4) Nov 23 03:01:50 localhost systemd[1]: Started libcrun container. 
Nov 23 03:01:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1cc40e2e89ec91462a29529e884582c68bbf460125466873b1904744d0a186a6/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff) Nov 23 03:01:50 localhost podman[57932]: 2025-11-23 08:01:50.090538506 +0000 UTC m=+0.112615382 container init 1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 
nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step2, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 23 03:01:50 localhost systemd[1]: Started libpod-conmon-76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218.scope. 
Nov 23 03:01:50 localhost podman[57932]: 2025-11-23 08:01:50.098643242 +0000 UTC m=+0.120720108 container start 1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.buildah.version=1.41.4, container_name=create_virtlogd_wrapper, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1) Nov 23 03:01:50 localhost podman[57932]: 2025-11-23 08:01:50.098757346 +0000 UTC m=+0.120834212 container attach 1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, com.redhat.component=openstack-nova-libvirt-container, container_name=create_virtlogd_wrapper, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step2, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Nov 23 03:01:50 localhost podman[57932]: 2025-11-23 08:01:50.007561789 +0000 UTC m=+0.029638665 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:01:50 localhost systemd[1]: Started libcrun container. 
Nov 23 03:01:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aaa446bba65b974176b0df44f4dd288068deedfe5e98c6c11dbc22e7f9239473/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 03:01:50 localhost podman[57947]: 2025-11-23 08:01:50.128643525 +0000 UTC m=+0.125931209 container init 76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step2, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, vcs-type=git, container_name=create_haproxy_wrapper, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 23 03:01:50 localhost podman[57947]: 2025-11-23 08:01:50.033515714 +0000 UTC m=+0.030803468 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Nov 23 03:01:50 localhost podman[57947]: 2025-11-23 08:01:50.134096464 +0000 UTC m=+0.131384178 container start 76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=create_haproxy_wrapper, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step2, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:01:50 localhost podman[57947]: 2025-11-23 08:01:50.134375592 +0000 UTC m=+0.131663346 container attach 76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, distribution-scope=public, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=create_haproxy_wrapper, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 23 03:01:50 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.f scrub starts Nov 23 03:01:50 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 7.f scrub ok Nov 23 03:01:50 localhost systemd[1]: var-lib-containers-storage-overlay-fdd0c049f9b5a6a5bc490e6d508b34f697b7d4ec5cca2012ac4af136c1e57a22-merged.mount: Deactivated successfully. Nov 23 03:01:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b98f47b7efe3de3c4dba29aea35ff7ab16ef615084c625b230146db7d230b2a-userdata-shm.mount: Deactivated successfully. Nov 23 03:01:50 localhost systemd[1]: var-lib-containers-storage-overlay-ff643707b40d1f4d028ddd9677b38d4c09285d4aaee2a2759e81caea717feb00-merged.mount: Deactivated successfully. Nov 23 03:01:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-162b13369c7fb9a5e855b4d534d104faede8777e1586dce410ed58efc88fa175-userdata-shm.mount: Deactivated successfully. Nov 23 03:01:51 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.19 scrub starts Nov 23 03:01:51 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.19 scrub ok Nov 23 03:01:51 localhost ovs-vsctl[58038]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Nov 23 03:01:52 localhost systemd[1]: libpod-1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f.scope: Deactivated successfully. 
Nov 23 03:01:52 localhost systemd[1]: libpod-1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f.scope: Consumed 2.141s CPU time. Nov 23 03:01:52 localhost podman[57932]: 2025-11-23 08:01:52.246208216 +0000 UTC m=+2.268285142 container died 1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step2, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:35:22Z, container_name=create_virtlogd_wrapper) Nov 23 03:01:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f-userdata-shm.mount: Deactivated successfully. Nov 23 03:01:52 localhost systemd[1]: var-lib-containers-storage-overlay-1cc40e2e89ec91462a29529e884582c68bbf460125466873b1904744d0a186a6-merged.mount: Deactivated successfully. 
Nov 23 03:01:52 localhost podman[58187]: 2025-11-23 08:01:52.349148645 +0000 UTC m=+0.088693205 container cleanup 1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, release=1761123044, container_name=create_virtlogd_wrapper, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:01:52 localhost systemd[1]: libpod-conmon-1babc2617bfb08e576a89388c6c7f451acfa5db14747afa5de6e06c1473dcf9f.scope: Deactivated successfully. Nov 23 03:01:52 localhost python3[57682]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1763883761 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper Nov 23 03:01:52 localhost systemd[1]: libpod-76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218.scope: Deactivated successfully. Nov 23 03:01:53 localhost systemd[1]: libpod-76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218.scope: Consumed 2.149s CPU time. 
Nov 23 03:01:53 localhost podman[57947]: 2025-11-23 08:01:53.001604149 +0000 UTC m=+2.998891893 container died 76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, container_name=create_haproxy_wrapper, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:01:53 localhost podman[58227]: 2025-11-23 08:01:53.09225526 +0000 UTC m=+0.081914787 container cleanup 76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, container_name=create_haproxy_wrapper, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step2, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:01:53 localhost systemd[1]: libpod-conmon-76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218.scope: Deactivated successfully. 
Nov 23 03:01:53 localhost python3[57682]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers Nov 23 03:01:53 localhost systemd[1]: var-lib-containers-storage-overlay-aaa446bba65b974176b0df44f4dd288068deedfe5e98c6c11dbc22e7f9239473-merged.mount: Deactivated successfully. Nov 23 03:01:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-76d3986a329da87f17dfb20727e8d9a5e21d7c96efc5740e0dc97f455efc7218-userdata-shm.mount: Deactivated successfully. Nov 23 03:01:53 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.6 deep-scrub starts Nov 23 03:01:53 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 5.6 deep-scrub ok Nov 23 03:01:53 localhost python3[58283]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:01:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:01:55 localhost systemd[1]: tmp-crun.qR1JJo.mount: Deactivated successfully. 
Nov 23 03:01:55 localhost podman[58389]: 2025-11-23 08:01:55.053402754 +0000 UTC m=+0.102601469 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc.) Nov 23 03:01:55 localhost python3[58421]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005532585 step=2 update_config_hash_only=False Nov 23 03:01:55 localhost podman[58389]: 2025-11-23 08:01:55.254871102 +0000 UTC m=+0.304069757 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 23 03:01:55 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:01:55 localhost python3[58450]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:01:56 localhost python3[58466]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 23 03:01:56 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Nov 23 03:01:56 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Nov 23 03:01:56 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.1 deep-scrub starts
Nov 23 03:01:56 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.1 deep-scrub ok
Nov 23 03:01:57 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Nov 23 03:01:57 localhost ceph-osd[31905]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Nov 23 03:01:57 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Nov 23 03:01:57 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Nov 23 03:01:59 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.b scrub starts
Nov 23 03:01:59 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.b scrub ok
Nov 23 03:02:00 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Nov 23 03:02:00 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Nov 23 03:02:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 03:02:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 4438 writes, 20K keys, 4438 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 4438 writes, 447 syncs, 9.93 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1185 writes, 4110 keys, 1185 commit groups, 1.0 writes per commit group, ingest: 2.02 MB, 0.00 MB/s
Interval WAL: 1185 writes, 305 syncs, 3.89 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0
 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [default] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-0] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-1] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 me
Nov 23 03:02:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 03:02:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 5124 writes, 22K keys, 5124 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 5123 writes, 575 syncs, 8.91 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1737 writes, 6023 keys, 1737 commit groups, 1.0 writes per commit group, ingest: 2.57 MB, 0.00 MB/s
Interval WAL: 1736 writes, 377 syncs, 4.60 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.010 0 0 0.0 0.0
 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.010 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [default] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.010 0 0 0.0 0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-0] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-1] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 me
Nov 23 03:02:07 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.1f scrub starts
Nov 23 03:02:15 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.1d scrub starts
Nov 23 03:02:15 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.1d scrub ok
Nov 23 03:02:24 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Nov 23 03:02:24 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.2 scrub ok
Nov 23 03:02:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 03:02:26 localhost podman[58467]: 2025-11-23 08:02:26.021648679 +0000 UTC m=+0.080233258 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 03:02:26 localhost podman[58467]: 2025-11-23 08:02:26.205266837 +0000 UTC m=+0.263851416 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 23 03:02:26 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 03:02:30 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.b scrub starts
Nov 23 03:02:30 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.b scrub ok
Nov 23 03:02:31 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.3 scrub starts
Nov 23 03:02:31 localhost sshd[58496]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 03:02:31 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.3 scrub ok
Nov 23 03:02:33 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Nov 23 03:02:33 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Nov 23 03:02:37 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.e scrub starts
Nov 23 03:02:37 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 7.e scrub ok
Nov 23 03:02:44 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Nov 23 03:02:44 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Nov 23 03:02:49 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.e scrub starts
Nov 23 03:02:49 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.e scrub ok
Nov 23 03:02:53 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Nov 23 03:02:53 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Nov 23 03:02:54 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.1f scrub starts
Nov 23 03:02:54 localhost ceph-osd[32858]: log_channel(cluster) log [DBG] : 6.1f scrub ok
Nov 23 03:02:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 03:02:57 localhost podman[58625]: 2025-11-23 08:02:57.01799793 +0000 UTC m=+0.071809842 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 23 03:02:57 localhost podman[58625]: 2025-11-23 08:02:57.224495095 +0000 UTC m=+0.278306987 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.openshift.expose-services=)
Nov 23 03:02:57 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 03:03:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 03:03:28 localhost podman[58654]: 2025-11-23 08:03:28.023699633 +0000 UTC m=+0.085225492 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Nov 23 03:03:28 localhost podman[58654]: 2025-11-23 08:03:28.238293827 +0000 UTC m=+0.299819696 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.buildah.version=1.41.4, distribution-scope=public)
Nov 23 03:03:28 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 03:03:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 03:03:59 localhost podman[58759]: 2025-11-23 08:03:59.01972001 +0000 UTC m=+0.081536332 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:49:46Z, tcib_managed=true, distribution-scope=public, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 03:03:59 localhost podman[58759]: 2025-11-23 08:03:59.232460068 +0000 UTC m=+0.294276410 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.12)
Nov 23 03:03:59 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 03:04:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.
Nov 23 03:04:30 localhost systemd[1]: tmp-crun.yoIh0o.mount: Deactivated successfully.
Nov 23 03:04:30 localhost podman[58788]: 2025-11-23 08:04:30.027750836 +0000 UTC m=+0.085415848 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, config_id=tripleo_step1, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:04:30 localhost podman[58788]: 2025-11-23 08:04:30.253264837 +0000 UTC m=+0.310929839 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public) Nov 23 03:04:30 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:04:36 localhost podman[58915]: 2025-11-23 08:04:36.207777544 +0000 UTC m=+0.092108379 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, name=rhceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container) Nov 23 03:04:36 localhost podman[58915]: 2025-11-23 08:04:36.340535007 +0000 UTC m=+0.224865852 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, release=553, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, name=rhceph, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main) Nov 23 03:05:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:05:01 localhost systemd[1]: tmp-crun.w39ePM.mount: Deactivated successfully. Nov 23 03:05:01 localhost podman[59057]: 2025-11-23 08:05:01.035150313 +0000 UTC m=+0.093427777 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z) Nov 23 03:05:01 localhost podman[59057]: 2025-11-23 08:05:01.251375845 +0000 UTC m=+0.309653319 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container) Nov 23 03:05:01 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:05:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:05:32 localhost systemd[1]: tmp-crun.xgp8Kq.mount: Deactivated successfully. Nov 23 03:05:32 localhost podman[59086]: 2025-11-23 08:05:32.024195129 +0000 UTC m=+0.082386967 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4) Nov 23 03:05:32 localhost podman[59086]: 2025-11-23 08:05:32.218211167 +0000 UTC m=+0.276403015 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step1, architecture=x86_64, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1) Nov 23 03:05:32 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:05:32 localhost sshd[59115]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:05:46 localhost sshd[59194]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:06:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:06:03 localhost systemd[1]: tmp-crun.82xQiu.mount: Deactivated successfully. Nov 23 03:06:03 localhost podman[59195]: 2025-11-23 08:06:03.030134502 +0000 UTC m=+0.085942354 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:06:03 localhost podman[59195]: 2025-11-23 08:06:03.221451298 +0000 UTC m=+0.277259080 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044) Nov 23 03:06:03 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:06:29 localhost python3[59271]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:06:29 localhost python3[59316]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885188.984136-98999-32296130414344/source _original_basename=tmp6hnezlw9 follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:06:30 localhost python3[59346]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:06:32 localhost ansible-async_wrapper.py[59518]: Invoked with 70114099573 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885191.729863-99301-101695479229977/AnsiballZ_command.py _ Nov 23 03:06:32 localhost ansible-async_wrapper.py[59521]: Starting module and watcher Nov 23 03:06:32 localhost ansible-async_wrapper.py[59521]: Start watching 59522 (3600) Nov 23 03:06:32 localhost ansible-async_wrapper.py[59522]: Start module (59522) Nov 23 03:06:32 localhost ansible-async_wrapper.py[59518]: Return async_wrapper task started. Nov 23 03:06:32 localhost python3[59542]: ansible-ansible.legacy.async_status Invoked with jid=70114099573.59518 mode=status _async_dir=/tmp/.ansible_async Nov 23 03:06:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:06:33 localhost systemd[1]: tmp-crun.RHE8K2.mount: Deactivated successfully. 
Nov 23 03:06:34 localhost podman[59560]: 2025-11-23 08:06:33.999434325 +0000 UTC m=+0.062064125 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, container_name=metrics_qdr, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:06:34 localhost podman[59560]: 2025-11-23 08:06:34.166799307 +0000 UTC m=+0.229429097 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Nov 23 03:06:34 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully.
Nov 23 03:06:35 localhost puppet-user[59526]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 23 03:06:35 localhost puppet-user[59526]: (file: /etc/puppet/hiera.yaml)
Nov 23 03:06:35 localhost puppet-user[59526]: Warning: Undefined variable '::deploy_config_name';
Nov 23 03:06:35 localhost puppet-user[59526]: (file & line not available)
Nov 23 03:06:35 localhost puppet-user[59526]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'.
See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 23 03:06:35 localhost puppet-user[59526]: (file & line not available)
Nov 23 03:06:35 localhost puppet-user[59526]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 23 03:06:36 localhost puppet-user[59526]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 23 03:06:36 localhost puppet-user[59526]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.11 seconds
Nov 23 03:06:36 localhost puppet-user[59526]: Notice: Applied catalog in 0.03 seconds
Nov 23 03:06:36 localhost puppet-user[59526]: Application:
Nov 23 03:06:36 localhost puppet-user[59526]: Initial environment: production
Nov 23 03:06:36 localhost puppet-user[59526]: Converged environment: production
Nov 23 03:06:36 localhost puppet-user[59526]: Run mode: user
Nov 23 03:06:36 localhost puppet-user[59526]: Changes:
Nov 23 03:06:36 localhost puppet-user[59526]: Events:
Nov 23 03:06:36 localhost puppet-user[59526]: Resources:
Nov 23 03:06:36 localhost puppet-user[59526]: Total: 10
Nov 23 03:06:36 localhost puppet-user[59526]: Time:
Nov 23 03:06:36 localhost puppet-user[59526]: Schedule: 0.00
Nov 23 03:06:36 localhost puppet-user[59526]: File: 0.00
Nov 23 03:06:36 localhost puppet-user[59526]: Exec: 0.01
Nov 23 03:06:36 localhost puppet-user[59526]: Augeas: 0.01
Nov 23 03:06:36 localhost puppet-user[59526]: Transaction evaluation: 0.03
Nov 23 03:06:36 localhost puppet-user[59526]: Catalog application: 0.03
Nov 23 03:06:36 localhost puppet-user[59526]: Config retrieval: 0.15
Nov 23 03:06:36 localhost puppet-user[59526]: Last run: 1763885196
Nov 23 03:06:36 localhost puppet-user[59526]: Filebucket: 0.00
Nov 23 03:06:36 localhost puppet-user[59526]: Total: 0.04
Nov 23 03:06:36 localhost puppet-user[59526]: Version:
Nov 23 03:06:36 localhost puppet-user[59526]: Config: 1763885195
Nov 23 03:06:36 localhost puppet-user[59526]: Puppet: 7.10.0
Nov 23 03:06:36 localhost ansible-async_wrapper.py[59522]: Module complete (59522)
Nov 23 03:06:37 localhost ansible-async_wrapper.py[59521]: Done in kid B.
Nov 23 03:06:42 localhost python3[59776]: ansible-ansible.legacy.async_status Invoked with jid=70114099573.59518 mode=status _async_dir=/tmp/.ansible_async
Nov 23 03:06:43 localhost python3[59792]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 03:06:43 localhost python3[59808]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 03:06:44 localhost python3[59858]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:06:44 localhost python3[59876]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpnq9niry8 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 03:06:45 localhost python3[59906]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent
recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:06:46 localhost python3[60009]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 23 03:06:47 localhost python3[60028]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:06:48 localhost python3[60061]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 03:06:48 localhost python3[60111]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:06:49 localhost python3[60129]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:06:49 localhost python3[60191]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:06:49 localhost python3[60209]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:06:50 localhost python3[60271]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:06:50 localhost python3[60289]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:06:51 localhost python3[60351]:
ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:06:51 localhost python3[60369]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:06:51 localhost python3[60399]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 03:06:51 localhost systemd[1]: Reloading.
Nov 23 03:06:52 localhost systemd-sysv-generator[60428]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:06:52 localhost systemd-rc-local-generator[60423]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:06:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:06:52 localhost python3[60484]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:06:52 localhost python3[60502]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:06:53 localhost python3[60564]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:06:53 localhost python3[60582]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:06:54 localhost python3[60612]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 03:06:54 localhost systemd[1]: Reloading.
Nov 23 03:06:54 localhost systemd-sysv-generator[60636]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:06:54 localhost systemd-rc-local-generator[60632]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:06:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:06:54 localhost systemd[1]: Starting Create netns directory...
Nov 23 03:06:54 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 03:06:54 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 03:06:54 localhost systemd[1]: Finished Create netns directory.
Nov 23 03:06:55 localhost python3[60670]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 23 03:06:56 localhost python3[60728]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Nov 23 03:06:57 localhost podman[60884]: 2025-11-23 08:06:57.174063813 +0000 UTC m=+0.068027434 container create 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044,
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack 
Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4) Nov 23 03:06:57 localhost podman[60913]: 2025-11-23 08:06:57.198972985 +0000 UTC m=+0.066878070 container create 11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:35:22Z, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=nova_virtlogd_wrapper, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, distribution-scope=public) Nov 23 03:06:57 localhost podman[60912]: 2025-11-23 08:06:57.221834365 +0000 UTC m=+0.086982636 container create 4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, container_name=nova_statedir_owner, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, release=1761123044, vcs-type=git, config_id=tripleo_step3, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}) Nov 23 03:06:57 localhost systemd[1]: Started libpod-conmon-6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.scope. 
Nov 23 03:06:57 localhost systemd[1]: Started libpod-conmon-11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11.scope.
Nov 23 03:06:57 localhost systemd[1]: Started libcrun container.
Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c10c17546222fa25908fa407b62ea6cd65af6052be72cbf7a83d74560c0146ad/merged/scripts supports timestamps until 2038 (0x7fffffff)
Nov 23 03:06:57 localhost podman[60914]: 2025-11-23 08:06:57.240722146 +0000 UTC m=+0.099057002 container create 348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_init_log, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, architecture=x86_64, version=17.1.12, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public)
Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c10c17546222fa25908fa407b62ea6cd65af6052be72cbf7a83d74560c0146ad/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff)
Nov 23 03:06:57 localhost systemd[1]: Started libcrun container.
Nov 23 03:06:57 localhost podman[60884]: 2025-11-23 08:06:57.143633685 +0000 UTC m=+0.037597336 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 23 03:06:57 localhost systemd[1]: Started libpod-conmon-4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f.scope.
Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at
/var/lib/containers/storage/overlay/43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:57 localhost podman[60913]: 2025-11-23 08:06:57.252939744 +0000 UTC m=+0.120844839 container init 11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:06:57 
localhost systemd[1]: Started libcrun container. Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3493da2ad5d9347a99f6c3e68d066c1617ca47d6ffc95a23d03eddf6b80d5617/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3493da2ad5d9347a99f6c3e68d066c1617ca47d6ffc95a23d03eddf6b80d5617/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3493da2ad5d9347a99f6c3e68d066c1617ca47d6ffc95a23d03eddf6b80d5617/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:57 localhost podman[60913]: 2025-11-23 08:06:57.260202894 +0000 UTC m=+0.128107979 container start 11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, container_name=nova_virtlogd_wrapper, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, 
com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step3, release=1761123044) Nov 23 03:06:57 localhost podman[60913]: 2025-11-23 08:06:57.166879936 +0000 UTC m=+0.034785041 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:06:57 localhost podman[60912]: 2025-11-23 08:06:57.262676788 +0000 UTC m=+0.127825059 container init 4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_statedir_owner, build-date=2025-11-19T00:36:58Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 
'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Nov 23 03:06:57 localhost systemd[1]: Started libpod-conmon-348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613.scope. Nov 23 03:06:57 localhost python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 
--volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:06:57 localhost podman[60912]: 2025-11-23 08:06:57.1680026 +0000 UTC m=+0.033150891 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 23 03:06:57 localhost podman[60912]: 2025-11-23 08:06:57.269174134 +0000 UTC m=+0.134322395 container start 4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_statedir_owner, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044) Nov 23 03:06:57 localhost podman[60912]: 2025-11-23 08:06:57.26937296 +0000 UTC m=+0.134521261 container attach 4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'command': 
'/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_statedir_owner, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-type=git, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step3) Nov 23 03:06:57 localhost systemd[1]: Started libcrun container. 
Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5d5e4e29a67d68eb763c810cf5dda69eb1a37e523f562c6e80552f33c1fd3c8b/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:57 localhost podman[60914]: 2025-11-23 08:06:57.281358622 +0000 UTC m=+0.139693468 container init 348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=ceilometer_init_log, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step3, build-date=2025-11-19T00:12:45Z) Nov 23 03:06:57 localhost podman[60914]: 2025-11-23 08:06:57.187285933 +0000 UTC m=+0.045620799 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Nov 23 03:06:57 localhost podman[60914]: 2025-11-23 08:06:57.287372693 +0000 UTC m=+0.145707539 container start 348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_id=tripleo_step3, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_init_log, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12) Nov 23 03:06:57 localhost systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring. Nov 23 03:06:57 localhost systemd[1]: libpod-348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613.scope: Deactivated successfully. Nov 23 03:06:57 localhost python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer Nov 23 03:06:57 localhost podman[60933]: 2025-11-23 08:06:57.195339495 +0000 UTC m=+0.041495763 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Nov 23 03:06:57 localhost systemd[1]: Created slice User Slice of UID 0. 
Nov 23 03:06:57 localhost podman[60933]: 2025-11-23 08:06:57.304752768 +0000 UTC m=+0.150909036 container create 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=rsyslog, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Nov 23 03:06:57 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Nov 23 03:06:57 localhost systemd[1]: libpod-4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f.scope: Deactivated successfully. Nov 23 03:06:57 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Nov 23 03:06:57 localhost podman[60912]: 2025-11-23 08:06:57.321719021 +0000 UTC m=+0.186867302 container died 4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=nova_statedir_owner, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, build-date=2025-11-19T00:36:58Z, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12) Nov 23 03:06:57 localhost systemd[1]: Starting User Manager for UID 0... 
Nov 23 03:06:57 localhost podman[60998]: 2025-11-23 08:06:57.348747196 +0000 UTC m=+0.040132932 container died 348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, container_name=ceilometer_init_log, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 23 03:06:57 localhost systemd[1]: Started libpod-conmon-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope. 
Nov 23 03:06:57 localhost systemd[1]: Started libcrun container. Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:57 localhost podman[61002]: 2025-11-23 08:06:57.427003748 +0000 UTC m=+0.118012733 container cleanup 348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, container_name=ceilometer_init_log, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 
'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible) Nov 23 03:06:57 localhost systemd[1]: libpod-conmon-348756262d3b77a71623bf5d81534863c67fff2e51ad6b64ceaad1049c932613.scope: Deactivated successfully. Nov 23 03:06:57 localhost podman[61022]: 2025-11-23 08:06:57.444669531 +0000 UTC m=+0.108048422 container cleanup 4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, container_name=nova_statedir_owner, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:06:57 localhost systemd[1]: libpod-conmon-4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f.scope: Deactivated successfully. Nov 23 03:06:57 localhost systemd[61020]: Queued start job for default target Main User Target. Nov 23 03:06:57 localhost python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1763883761 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt 
path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py Nov 23 03:06:57 localhost systemd[61020]: Created slice User Application Slice. Nov 23 03:06:57 localhost systemd[61020]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Nov 23 03:06:57 localhost systemd[61020]: Started Daily Cleanup of User's Temporary Directories. Nov 23 03:06:57 localhost systemd[61020]: Reached target Paths. Nov 23 03:06:57 localhost systemd[61020]: Reached target Timers. Nov 23 03:06:57 localhost systemd[61020]: Starting D-Bus User Message Bus Socket... Nov 23 03:06:57 localhost systemd[61020]: Starting Create User's Volatile Files and Directories... Nov 23 03:06:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. 
Nov 23 03:06:57 localhost podman[60884]: 2025-11-23 08:06:57.469093719 +0000 UTC m=+0.363057380 container init 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-collectd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vcs-type=git) Nov 23 03:06:57 localhost systemd[61020]: Finished Create User's Volatile Files and Directories. Nov 23 03:06:57 localhost podman[60933]: 2025-11-23 08:06:57.477225234 +0000 UTC m=+0.323381482 container init 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, container_name=rsyslog, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': 
['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc.) Nov 23 03:06:57 localhost systemd[61020]: Listening on D-Bus User Message Bus Socket. Nov 23 03:06:57 localhost systemd[61020]: Reached target Sockets. Nov 23 03:06:57 localhost systemd[61020]: Reached target Basic System. Nov 23 03:06:57 localhost systemd[61020]: Reached target Main User Target. Nov 23 03:06:57 localhost systemd[61020]: Startup finished in 117ms. Nov 23 03:06:57 localhost systemd[1]: Started User Manager for UID 0. Nov 23 03:06:57 localhost systemd[1]: Started Session c1 of User root. 
Nov 23 03:06:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:06:57 localhost systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring. Nov 23 03:06:57 localhost podman[60884]: 2025-11-23 08:06:57.49629591 +0000 UTC m=+0.390259551 container start 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, container_name=collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:06:57 localhost systemd[1]: Started Session c2 of User root. 
Nov 23 03:06:57 localhost python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=4767aaabc3de112d8791c290aa2b669d --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Nov 23 03:06:57 localhost podman[60933]: 2025-11-23 08:06:57.502669653 +0000 UTC m=+0.348825891 container start 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=rsyslog, distribution-scope=public, vcs-type=git, build-date=2025-11-18T22:49:49Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog) Nov 23 03:06:57 localhost python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=7238f2997345c97f4c6ab424e622dc1b --label config_id=tripleo_step3 --label container_name=rsyslog --label 
managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro 
--volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Nov 23 03:06:57 localhost systemd[1]: libpod-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope: Deactivated successfully. Nov 23 03:06:57 localhost systemd[1]: session-c2.scope: Deactivated successfully. Nov 23 03:06:57 localhost systemd[1]: session-c1.scope: Deactivated successfully. Nov 23 03:06:57 localhost podman[61115]: 2025-11-23 08:06:57.596610839 +0000 UTC m=+0.079520602 container died 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, 
name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vendor=Red Hat, Inc., container_name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:06:57 localhost podman[61100]: 2025-11-23 08:06:57.569562472 +0000 UTC m=+0.076323925 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, release=1761123044, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step3, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 
23 03:06:57 localhost podman[61100]: 2025-11-23 08:06:57.698954568 +0000 UTC m=+0.205716021 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, vendor=Red Hat, Inc., container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_id=tripleo_step3, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:06:57 localhost podman[61100]: unhealthy Nov 23 03:06:57 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:06:57 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Failed with result 'exit-code'. 
Nov 23 03:06:57 localhost podman[61146]: 2025-11-23 08:06:57.844737738 +0000 UTC m=+0.281244270 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, batch=17.1_20251118.1, container_name=rsyslog, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12) Nov 23 03:06:57 localhost systemd[1]: libpod-conmon-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope: Deactivated successfully. Nov 23 03:06:57 localhost podman[61250]: 2025-11-23 08:06:57.878980602 +0000 UTC m=+0.099070971 container create 65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.12) Nov 23 03:06:57 localhost systemd[1]: Started libpod-conmon-65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1.scope. Nov 23 03:06:57 localhost systemd[1]: Started libcrun container. Nov 23 03:06:57 localhost podman[61250]: 2025-11-23 08:06:57.828811578 +0000 UTC m=+0.048901967 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69a650d13aa75802cba83b72930ac053de367a681b210a8b1cff6ce21a4c09bf/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69a650d13aa75802cba83b72930ac053de367a681b210a8b1cff6ce21a4c09bf/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69a650d13aa75802cba83b72930ac053de367a681b210a8b1cff6ce21a4c09bf/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69a650d13aa75802cba83b72930ac053de367a681b210a8b1cff6ce21a4c09bf/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:57 localhost podman[61250]: 2025-11-23 08:06:57.945352316 +0000 UTC m=+0.165442655 container init 65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, 
name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, release=1761123044, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 23 03:06:57 localhost podman[61250]: 2025-11-23 08:06:57.954819662 +0000 UTC m=+0.174910031 container start 65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, version=17.1.12, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, 
name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 23 03:06:58 localhost podman[61310]: 2025-11-23 08:06:58.053577453 +0000 UTC m=+0.079314236 container create a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, container_name=nova_virtsecretd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 23 03:06:58 localhost systemd[1]: Started libpod-conmon-a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03.scope. Nov 23 03:06:58 localhost systemd[1]: Started libcrun container. 
Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost podman[61310]: 2025-11-23 08:06:58.110683626 +0000 UTC m=+0.136420429 container init a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, name=rhosp17/openstack-nova-libvirt, vcs-type=git, 
release=1761123044, architecture=x86_64, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtsecretd) Nov 23 03:06:58 localhost podman[61310]: 2025-11-23 08:06:58.017617077 +0000 UTC m=+0.043353920 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:06:58 localhost podman[61310]: 2025-11-23 08:06:58.119287206 +0000 UTC m=+0.145024009 container start a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:35:22Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtsecretd, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 23 03:06:58 localhost python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume 
/run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:06:58 localhost systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring. Nov 23 03:06:58 localhost systemd[1]: Started Session c3 of User root. Nov 23 03:06:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a34ae8fe8dce60dc6783f45be95c48eb83b5281196fa46222bba10da8a7475f-userdata-shm.mount: Deactivated successfully. Nov 23 03:06:58 localhost systemd[1]: session-c3.scope: Deactivated successfully. 
Nov 23 03:06:58 localhost podman[61447]: 2025-11-23 08:06:58.5616601 +0000 UTC m=+0.086066529 container create 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, vcs-type=git, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:06:58 localhost podman[61464]: 2025-11-23 08:06:58.604430171 +0000 UTC m=+0.085113140 container create aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtnodedevd, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 23 03:06:58 localhost systemd[1]: Started 
libpod-conmon-85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.scope. Nov 23 03:06:58 localhost podman[61447]: 2025-11-23 08:06:58.522349813 +0000 UTC m=+0.046756222 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Nov 23 03:06:58 localhost systemd[1]: Started libcrun container. Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b27f47826dfdd9f94283f3471bc6c8f7a332741b941e529e6c99a436b7305250/merged/etc/target supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost systemd[1]: Started libpod-conmon-aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd.scope. Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b27f47826dfdd9f94283f3471bc6c8f7a332741b941e529e6c99a436b7305250/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost podman[61464]: 2025-11-23 08:06:58.551870525 +0000 UTC m=+0.032553534 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:06:58 localhost systemd[1]: Started libcrun container. 
Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:58 localhost podman[61464]: 2025-11-23 08:06:58.690840469 +0000 UTC m=+0.171523438 container init aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=nova_virtnodedevd, vendor=Red Hat, Inc., distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 23 03:06:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:06:58 localhost podman[61447]: 2025-11-23 08:06:58.696361496 +0000 UTC m=+0.220767965 container init 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 23 03:06:58 localhost podman[61464]: 2025-11-23 08:06:58.700137521 +0000 UTC m=+0.180820490 container start aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtnodedevd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible) Nov 23 03:06:58 localhost python3[60728]: ansible-tripleo_container_manage 
PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:06:58 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:06:58 localhost systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring. Nov 23 03:06:58 localhost podman[61447]: 2025-11-23 08:06:58.730214068 +0000 UTC m=+0.254620497 container start 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 23 03:06:58 localhost systemd[1]: Started Session c4 of User root. Nov 23 03:06:58 localhost python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=67452ffc3d9e727585009ffc9989a224 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Nov 23 03:06:58 localhost systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring. Nov 23 03:06:58 localhost systemd[1]: Started Session c5 of User root. Nov 23 03:06:58 localhost systemd[1]: session-c4.scope: Deactivated successfully. Nov 23 03:06:58 localhost kernel: Loading iSCSI transport class v2.0-870. Nov 23 03:06:58 localhost systemd[1]: session-c5.scope: Deactivated successfully. 
Nov 23 03:06:58 localhost podman[61501]: 2025-11-23 08:06:58.878476924 +0000 UTC m=+0.142625187 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, container_name=iscsid, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:06:58 localhost podman[61501]: 2025-11-23 08:06:58.962210441 +0000 UTC m=+0.226358714 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=) Nov 23 03:06:58 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:06:59 localhost podman[61632]: 2025-11-23 08:06:59.228650224 +0000 UTC m=+0.073543491 container create 33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible) Nov 23 03:06:59 localhost systemd[1]: Started libpod-conmon-33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a.scope. Nov 23 03:06:59 localhost podman[61632]: 2025-11-23 08:06:59.185665857 +0000 UTC m=+0.030559144 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:06:59 localhost systemd[1]: Started libcrun container. 
Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost podman[61632]: 2025-11-23 08:06:59.307820385 +0000 UTC m=+0.152713652 container init 33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 
17.1 nova-libvirt, container_name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z) Nov 23 03:06:59 localhost podman[61632]: 2025-11-23 08:06:59.315746103 +0000 UTC m=+0.160639310 container start 33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': 
['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtstoraged, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:06:59 localhost python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume 
/var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:06:59 localhost systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring. Nov 23 03:06:59 localhost systemd[1]: Started Session c6 of User root. Nov 23 03:06:59 localhost systemd[1]: session-c6.scope: Deactivated successfully. Nov 23 03:06:59 localhost podman[61738]: 2025-11-23 08:06:59.708340774 +0000 UTC m=+0.071793887 container create 80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_virtqemud, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack 
Platform 17.1 nova-libvirt) Nov 23 03:06:59 localhost systemd[1]: Started libpod-conmon-80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8.scope. Nov 23 03:06:59 localhost systemd[1]: Started libcrun container. Nov 23 03:06:59 localhost podman[61738]: 2025-11-23 08:06:59.666078489 +0000 UTC m=+0.029531682 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/var/cache/libvirt 
supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 23 03:06:59 localhost podman[61738]: 2025-11-23 08:06:59.772362177 +0000 UTC m=+0.135815290 container init 80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, container_name=nova_virtqemud, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:06:59 localhost podman[61738]: 2025-11-23 08:06:59.78173316 +0000 UTC m=+0.145186273 container start 80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, config_data={'cgroupns': 'host', 'depends_on': 
['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 
nova-libvirt, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, container_name=nova_virtqemud, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:06:59 localhost python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro 
--volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:06:59 localhost systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring. Nov 23 03:06:59 localhost systemd[1]: Started Session c7 of User root. Nov 23 03:06:59 localhost systemd[1]: session-c7.scope: Deactivated successfully. 
Nov 23 03:07:00 localhost podman[61842]: 2025-11-23 08:07:00.142846151 +0000 UTC m=+0.082082079 container create 108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtproxyd, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Nov 23 03:07:00 localhost systemd[1]: Started libpod-conmon-108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08.scope. Nov 23 03:07:00 localhost systemd[1]: Started libcrun container. 
Nov 23 03:07:00 localhost podman[61842]: 2025-11-23 08:07:00.103509614 +0000 UTC m=+0.042745502 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:07:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:00 localhost podman[61842]: 2025-11-23 08:07:00.213422622 +0000 UTC m=+0.152658500 container init 
108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.buildah.version=1.41.4, container_name=nova_virtproxyd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, managed_by=tripleo_ansible) Nov 23 03:07:00 localhost podman[61842]: 2025-11-23 08:07:00.22427988 +0000 UTC m=+0.163515758 container start 108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:35:22Z, architecture=x86_64, 
vcs-type=git, managed_by=tripleo_ansible, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 
17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:07:00 localhost python3[60728]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume 
/run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:07:00 localhost systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring. Nov 23 03:07:00 localhost systemd[1]: Started Session c8 of User root. Nov 23 03:07:00 localhost systemd[1]: session-c8.scope: Deactivated successfully. Nov 23 03:07:00 localhost python3[61921]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:01 localhost python3[61937]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:01 localhost python3[61953]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:01 localhost python3[61969]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:01 localhost python3[61985]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:02 localhost python3[62001]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:02 localhost python3[62017]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None 
setype=None attributes=None Nov 23 03:07:02 localhost python3[62033]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:02 localhost python3[62049]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:03 localhost python3[62065]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:07:03 localhost python3[62081]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:07:03 localhost python3[62097]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:07:03 localhost python3[62113]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:07:04 localhost python3[62129]: ansible-stat Invoked with 
path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:07:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:07:04 localhost systemd[1]: tmp-crun.80Ricb.mount: Deactivated successfully. Nov 23 03:07:04 localhost podman[62146]: 2025-11-23 08:07:04.397515806 +0000 UTC m=+0.091024199 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd) Nov 23 03:07:04 localhost python3[62145]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:07:04 localhost podman[62146]: 2025-11-23 08:07:04.584783559 +0000 UTC m=+0.278291902 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git) Nov 23 03:07:04 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:07:04 localhost python3[62190]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:07:04 localhost python3[62207]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:07:05 localhost python3[62223]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:07:05 localhost python3[62284]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:06 localhost python3[62313]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:06 localhost python3[62342]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False 
force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:07 localhost python3[62371]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:07 localhost python3[62400]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:08 localhost python3[62429]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:08 localhost python3[62458]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source 
dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:09 localhost python3[62487]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:09 localhost python3[62516]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885225.2753856-100428-267011838361084/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:10 localhost python3[62532]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 03:07:10 localhost systemd[1]: Reloading. Nov 23 03:07:10 localhost systemd-rc-local-generator[62557]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:07:10 localhost systemd-sysv-generator[62562]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:07:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:07:10 localhost systemd[1]: Stopping User Manager for UID 0... Nov 23 03:07:10 localhost systemd[61020]: Activating special unit Exit the Session... Nov 23 03:07:10 localhost systemd[61020]: Stopped target Main User Target. Nov 23 03:07:10 localhost systemd[61020]: Stopped target Basic System. Nov 23 03:07:10 localhost systemd[61020]: Stopped target Paths. Nov 23 03:07:10 localhost systemd[61020]: Stopped target Sockets. Nov 23 03:07:10 localhost systemd[61020]: Stopped target Timers. Nov 23 03:07:10 localhost systemd[61020]: Stopped Daily Cleanup of User's Temporary Directories. Nov 23 03:07:10 localhost systemd[61020]: Closed D-Bus User Message Bus Socket. Nov 23 03:07:10 localhost systemd[61020]: Stopped Create User's Volatile Files and Directories. Nov 23 03:07:10 localhost systemd[61020]: Removed slice User Application Slice. Nov 23 03:07:10 localhost systemd[61020]: Reached target Shutdown. Nov 23 03:07:10 localhost systemd[61020]: Finished Exit the Session. Nov 23 03:07:10 localhost systemd[61020]: Reached target Exit the Session. Nov 23 03:07:10 localhost systemd[1]: user@0.service: Deactivated successfully. Nov 23 03:07:10 localhost systemd[1]: Stopped User Manager for UID 0. Nov 23 03:07:10 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Nov 23 03:07:10 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Nov 23 03:07:10 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Nov 23 03:07:10 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Nov 23 03:07:10 localhost systemd[1]: Removed slice User Slice of UID 0. 
Nov 23 03:07:10 localhost python3[62585]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:07:11 localhost systemd[1]: Reloading. Nov 23 03:07:11 localhost systemd-sysv-generator[62613]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:07:11 localhost systemd-rc-local-generator[62609]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:07:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:07:11 localhost systemd[1]: Starting collectd container... Nov 23 03:07:11 localhost systemd[1]: Started collectd container. Nov 23 03:07:11 localhost python3[62651]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:07:13 localhost systemd[1]: Reloading. Nov 23 03:07:13 localhost systemd-sysv-generator[62679]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:07:13 localhost systemd-rc-local-generator[62676]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:07:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:07:13 localhost systemd[1]: Starting iscsid container... 
Nov 23 03:07:13 localhost systemd[1]: Started iscsid container. Nov 23 03:07:13 localhost python3[62719]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:07:13 localhost systemd[1]: Reloading. Nov 23 03:07:14 localhost systemd-rc-local-generator[62745]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:07:14 localhost systemd-sysv-generator[62750]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:07:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:07:14 localhost systemd[1]: Starting nova_virtlogd_wrapper container... Nov 23 03:07:14 localhost systemd[1]: Started nova_virtlogd_wrapper container. Nov 23 03:07:15 localhost python3[62786]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:07:15 localhost systemd[1]: Reloading. Nov 23 03:07:15 localhost systemd-sysv-generator[62814]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:07:15 localhost systemd-rc-local-generator[62810]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:07:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 03:07:15 localhost systemd[1]: Starting nova_virtnodedevd container... Nov 23 03:07:15 localhost tripleo-start-podman-container[62826]: Creating additional drop-in dependency for "nova_virtnodedevd" (aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd) Nov 23 03:07:15 localhost systemd[1]: Reloading. Nov 23 03:07:15 localhost systemd-sysv-generator[62886]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:07:15 localhost systemd-rc-local-generator[62883]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:07:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:07:15 localhost systemd[1]: Started nova_virtnodedevd container. Nov 23 03:07:16 localhost python3[62910]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:07:16 localhost systemd[1]: Reloading. Nov 23 03:07:16 localhost systemd-rc-local-generator[62937]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:07:16 localhost systemd-sysv-generator[62942]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:07:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:07:16 localhost systemd[1]: Starting nova_virtproxyd container... 
Nov 23 03:07:16 localhost tripleo-start-podman-container[62950]: Creating additional drop-in dependency for "nova_virtproxyd" (108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08) Nov 23 03:07:16 localhost systemd[1]: Reloading. Nov 23 03:07:17 localhost systemd-sysv-generator[63009]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:07:17 localhost systemd-rc-local-generator[63004]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:07:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:07:17 localhost systemd[1]: Started nova_virtproxyd container. Nov 23 03:07:17 localhost python3[63033]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:07:17 localhost systemd[1]: Reloading. Nov 23 03:07:17 localhost systemd-rc-local-generator[63057]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:07:17 localhost systemd-sysv-generator[63062]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:07:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:07:18 localhost systemd[1]: Starting nova_virtqemud container... 
Nov 23 03:07:18 localhost tripleo-start-podman-container[63072]: Creating additional drop-in dependency for "nova_virtqemud" (80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8) Nov 23 03:07:18 localhost systemd[1]: Reloading. Nov 23 03:07:18 localhost systemd-sysv-generator[63133]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:07:18 localhost systemd-rc-local-generator[63127]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:07:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:07:18 localhost systemd[1]: Started nova_virtqemud container. Nov 23 03:07:19 localhost python3[63154]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:07:19 localhost systemd[1]: Reloading. Nov 23 03:07:19 localhost systemd-sysv-generator[63185]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:07:19 localhost systemd-rc-local-generator[63180]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:07:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:07:19 localhost systemd[1]: Starting nova_virtsecretd container... 
Nov 23 03:07:20 localhost tripleo-start-podman-container[63193]: Creating additional drop-in dependency for "nova_virtsecretd" (a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03) Nov 23 03:07:20 localhost systemd[1]: Reloading. Nov 23 03:07:20 localhost systemd-sysv-generator[63250]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:07:20 localhost systemd-rc-local-generator[63247]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:07:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:07:20 localhost systemd[1]: Started nova_virtsecretd container. Nov 23 03:07:20 localhost python3[63277]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:07:21 localhost systemd[1]: Reloading. Nov 23 03:07:21 localhost systemd-rc-local-generator[63304]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:07:21 localhost systemd-sysv-generator[63308]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:07:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:07:21 localhost systemd[1]: Starting nova_virtstoraged container... 
Nov 23 03:07:21 localhost tripleo-start-podman-container[63317]: Creating additional drop-in dependency for "nova_virtstoraged" (33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a)
Nov 23 03:07:21 localhost systemd[1]: Reloading.
Nov 23 03:07:21 localhost systemd-rc-local-generator[63373]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:07:21 localhost systemd-sysv-generator[63377]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:07:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:07:21 localhost systemd[1]: Started nova_virtstoraged container.
Nov 23 03:07:22 localhost python3[63402]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 03:07:22 localhost systemd[1]: Reloading.
Nov 23 03:07:22 localhost systemd-rc-local-generator[63428]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:07:22 localhost systemd-sysv-generator[63433]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:07:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:07:22 localhost systemd[1]: Starting rsyslog container...
Nov 23 03:07:23 localhost systemd[1]: Started libcrun container.
Nov 23 03:07:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:23 localhost podman[63442]: 2025-11-23 08:07:23.037486887 +0000 UTC m=+0.125305334 container init 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T22:49:49Z, version=17.1.12, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.4, container_name=rsyslog, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog) Nov 23 03:07:23 localhost podman[63442]: 2025-11-23 08:07:23.049700496 +0000 UTC m=+0.137518913 container start 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64) Nov 23 03:07:23 localhost podman[63442]: rsyslog Nov 23 03:07:23 localhost systemd[1]: Started rsyslog container. Nov 23 03:07:23 localhost systemd[1]: libpod-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope: Deactivated successfully. 
Nov 23 03:07:23 localhost podman[63477]: 2025-11-23 08:07:23.218400078 +0000 UTC m=+0.054188568 container died 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=rsyslog, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog) Nov 23 03:07:23 localhost podman[63477]: 2025-11-23 08:07:23.244864347 +0000 UTC m=+0.080652796 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:49Z, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3) Nov 23 03:07:23 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:07:23 localhost podman[63490]: 2025-11-23 08:07:23.336248026 +0000 UTC m=+0.063147568 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, build-date=2025-11-18T22:49:49Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1) Nov 23 03:07:23 localhost podman[63490]: rsyslog Nov 23 03:07:23 
localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Nov 23 03:07:23 localhost python3[63516]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:23 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1. Nov 23 03:07:23 localhost systemd[1]: Stopped rsyslog container. Nov 23 03:07:23 localhost systemd[1]: Starting rsyslog container... Nov 23 03:07:23 localhost systemd[1]: Started libcrun container. Nov 23 03:07:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:23 localhost podman[63517]: 2025-11-23 08:07:23.810543843 +0000 UTC m=+0.116479367 container init 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-rsyslog, container_name=rsyslog, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:49Z, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true) Nov 23 03:07:23 localhost podman[63517]: 2025-11-23 08:07:23.820588067 +0000 UTC m=+0.126523551 
container start 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, version=17.1.12, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T22:49:49Z, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:07:23 localhost podman[63517]: rsyslog Nov 23 03:07:23 localhost systemd[1]: Started rsyslog container. Nov 23 03:07:23 localhost systemd[1]: libpod-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope: Deactivated successfully. Nov 23 03:07:23 localhost podman[63572]: 2025-11-23 08:07:23.969595504 +0000 UTC m=+0.036232164 container died 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=rsyslog, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.buildah.version=1.41.4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2025-11-18T22:49:49Z) Nov 23 03:07:23 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a-userdata-shm.mount: Deactivated successfully. Nov 23 03:07:23 localhost systemd[1]: var-lib-containers-storage-overlay-284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d-merged.mount: Deactivated successfully. 
Nov 23 03:07:23 localhost podman[63572]: 2025-11-23 08:07:23.994559408 +0000 UTC m=+0.061196068 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-type=git, name=rhosp17/openstack-rsyslog, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:07:23 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:07:24 localhost podman[63599]: 2025-11-23 08:07:24.081272456 +0000 UTC m=+0.059899780 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, version=17.1.12, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Nov 23 03:07:24 localhost podman[63599]: rsyslog Nov 23 03:07:24 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Nov 23 03:07:24 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2. Nov 23 03:07:24 localhost systemd[1]: Stopped rsyslog container. Nov 23 03:07:24 localhost systemd[1]: Starting rsyslog container... Nov 23 03:07:24 localhost systemd[1]: Started libcrun container. 
Nov 23 03:07:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:24 localhost podman[63655]: 2025-11-23 08:07:24.529505646 +0000 UTC m=+0.119716365 container init 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, batch=17.1_20251118.1, 
version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, container_name=rsyslog, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-rsyslog) Nov 23 03:07:24 localhost podman[63655]: 2025-11-23 08:07:24.539172468 +0000 UTC m=+0.129383197 container start 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=rsyslog, name=rhosp17/openstack-rsyslog, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 23 03:07:24 localhost podman[63655]: rsyslog Nov 23 03:07:24 localhost systemd[1]: Started rsyslog container. Nov 23 03:07:24 localhost systemd[1]: libpod-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope: Deactivated successfully. 
Nov 23 03:07:24 localhost podman[63690]: 2025-11-23 08:07:24.705100816 +0000 UTC m=+0.040910635 container died 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-rsyslog, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, architecture=x86_64, release=1761123044) Nov 23 03:07:24 localhost podman[63690]: 2025-11-23 08:07:24.730656949 +0000 UTC m=+0.066466728 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, name=rhosp17/openstack-rsyslog, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=rsyslog, build-date=2025-11-18T22:49:49Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}) Nov 23 03:07:24 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:07:24 localhost podman[63707]: 2025-11-23 08:07:24.824064778 +0000 UTC m=+0.059491247 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step3, build-date=2025-11-18T22:49:49Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:07:24 localhost podman[63707]: rsyslog Nov 23 03:07:24 localhost systemd[1]: 
tripleo_rsyslog.service: Failed with result 'exit-code'. Nov 23 03:07:24 localhost systemd[1]: var-lib-containers-storage-overlay-284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d-merged.mount: Deactivated successfully. Nov 23 03:07:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a-userdata-shm.mount: Deactivated successfully. Nov 23 03:07:24 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3. Nov 23 03:07:24 localhost systemd[1]: Stopped rsyslog container. Nov 23 03:07:24 localhost python3[63729]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005532585 step=3 update_config_hash_only=False Nov 23 03:07:24 localhost systemd[1]: Starting rsyslog container... Nov 23 03:07:25 localhost systemd[1]: Started libcrun container. 
Nov 23 03:07:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:25 localhost podman[63732]: 2025-11-23 08:07:25.12028107 +0000 UTC m=+0.113617561 container init 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=) Nov 23 03:07:25 localhost podman[63732]: 2025-11-23 08:07:25.129769166 +0000 UTC m=+0.123105657 container start 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-rsyslog, container_name=rsyslog, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3) Nov 23 03:07:25 localhost podman[63732]: rsyslog Nov 23 03:07:25 localhost systemd[1]: Started rsyslog container. Nov 23 03:07:25 localhost systemd[1]: libpod-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope: Deactivated successfully. 
Nov 23 03:07:25 localhost podman[63754]: 2025-11-23 08:07:25.280393173 +0000 UTC m=+0.043606658 container died 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, io.openshift.expose-services=, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat 
OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible) Nov 23 03:07:25 localhost podman[63754]: 2025-11-23 08:07:25.2965135 +0000 UTC m=+0.059726965 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, release=1761123044, vendor=Red Hat, Inc., container_name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog) Nov 23 03:07:25 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:07:25 localhost podman[63767]: 2025-11-23 08:07:25.376999159 +0000 UTC m=+0.045068431 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-rsyslog, 
release=1761123044, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, container_name=rsyslog, io.openshift.expose-services=, build-date=2025-11-18T22:49:49Z, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true) Nov 23 03:07:25 localhost podman[63767]: rsyslog Nov 23 03:07:25 localhost systemd[1]: 
tripleo_rsyslog.service: Failed with result 'exit-code'. Nov 23 03:07:25 localhost python3[63793]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:07:25 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4. Nov 23 03:07:25 localhost systemd[1]: Stopped rsyslog container. Nov 23 03:07:25 localhost systemd[1]: Starting rsyslog container... Nov 23 03:07:25 localhost systemd[1]: Started libcrun container. Nov 23 03:07:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 23 03:07:25 localhost podman[63794]: 2025-11-23 08:07:25.820860658 +0000 UTC m=+0.122430467 container init 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, container_name=rsyslog) Nov 23 03:07:25 localhost podman[63794]: 2025-11-23 08:07:25.828019134 +0000 UTC m=+0.129588943 container start 
8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-18T22:49:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, 
maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, name=rhosp17/openstack-rsyslog, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:07:25 localhost podman[63794]: rsyslog Nov 23 03:07:25 localhost systemd[1]: Started rsyslog container. Nov 23 03:07:25 localhost systemd[1]: libpod-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a.scope: Deactivated successfully. Nov 23 03:07:25 localhost podman[63832]: 2025-11-23 08:07:25.982741725 +0000 UTC m=+0.043215476 container died 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, container_name=rsyslog, vcs-type=git, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible) Nov 23 03:07:25 localhost python3[63826]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True Nov 23 03:07:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a-userdata-shm.mount: Deactivated successfully. Nov 23 03:07:26 localhost systemd[1]: var-lib-containers-storage-overlay-284ebc1e1293032db8a829276036406416004830cdfe0e4a65cbc1c6441b129d-merged.mount: Deactivated successfully. 
Nov 23 03:07:26 localhost podman[63832]: 2025-11-23 08:07:26.006465911 +0000 UTC m=+0.066939632 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, container_name=rsyslog, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git) Nov 23 03:07:26 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:07:26 localhost podman[63845]: 2025-11-23 08:07:26.094403885 +0000 UTC m=+0.056166046 container cleanup 8d8f4158ca6c8b07632c1a47bd0aa0e7f6d014396afeaace3fae240f154aad0a (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7238f2997345c97f4c6ab424e622dc1b'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, release=1761123044, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Nov 23 03:07:26 localhost podman[63845]: rsyslog Nov 23 03:07:26 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Nov 23 03:07:26 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5. Nov 23 03:07:26 localhost systemd[1]: Stopped rsyslog container. Nov 23 03:07:26 localhost systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly. Nov 23 03:07:26 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Nov 23 03:07:26 localhost systemd[1]: Failed to start rsyslog container. 
Nov 23 03:07:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:07:28 localhost systemd[1]: tmp-crun.uwOAks.mount: Deactivated successfully. Nov 23 03:07:28 localhost podman[63858]: 2025-11-23 08:07:28.032028276 +0000 UTC m=+0.090182233 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:07:28 localhost podman[63858]: 2025-11-23 08:07:28.070321703 +0000 UTC m=+0.128475660 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, architecture=x86_64, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, container_name=collectd) Nov 23 03:07:28 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated 
successfully. Nov 23 03:07:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:07:30 localhost podman[63880]: 2025-11-23 08:07:30.017339286 +0000 UTC m=+0.074993905 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, container_name=iscsid, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:07:30 localhost podman[63880]: 2025-11-23 08:07:30.029305347 +0000 UTC m=+0.086959936 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, url=https://www.redhat.com, 
distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z) Nov 23 03:07:30 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:07:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:07:35 localhost systemd[1]: tmp-crun.KlA56n.mount: Deactivated successfully. 
Nov 23 03:07:35 localhost podman[63899]: 2025-11-23 08:07:35.026190677 +0000 UTC m=+0.083985146 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1) Nov 23 03:07:35 localhost podman[63899]: 2025-11-23 08:07:35.216953565 +0000 UTC m=+0.274748044 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-qdrouterd, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:07:35 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:07:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. 
Nov 23 03:07:59 localhost podman[64006]: 2025-11-23 08:07:59.030566791 +0000 UTC m=+0.085159941 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, managed_by=tripleo_ansible) Nov 23 03:07:59 localhost podman[64006]: 2025-11-23 08:07:59.042141231 +0000 UTC m=+0.096734311 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, release=1761123044, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, vcs-type=git) Nov 23 03:07:59 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:08:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:08:01 localhost systemd[1]: tmp-crun.DjTdEN.mount: Deactivated successfully. 
Nov 23 03:08:01 localhost podman[64023]: 2025-11-23 08:08:01.024381399 +0000 UTC m=+0.077992196 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, release=1761123044, container_name=iscsid, url=https://www.redhat.com) Nov 23 03:08:01 localhost podman[64023]: 2025-11-23 08:08:01.057125477 +0000 UTC m=+0.110736244 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 23 03:08:01 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:08:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:08:06 localhost podman[64044]: 2025-11-23 08:08:06.018342729 +0000 UTC m=+0.075859490 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, version=17.1.12, container_name=metrics_qdr, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:08:06 localhost podman[64044]: 2025-11-23 08:08:06.204236521 +0000 UTC m=+0.261753362 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:08:06 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:08:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. 
Nov 23 03:08:30 localhost podman[64072]: 2025-11-23 08:08:30.021406727 +0000 UTC m=+0.079639395 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step3, io.buildah.version=1.41.4, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:08:30 localhost podman[64072]: 2025-11-23 08:08:30.057574399 +0000 UTC m=+0.115807057 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, distribution-scope=public, tcib_managed=true, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:08:30 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:08:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:08:32 localhost podman[64092]: 2025-11-23 08:08:32.014489622 +0000 UTC m=+0.077235243 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, distribution-scope=public) Nov 23 03:08:32 localhost podman[64092]: 2025-11-23 08:08:32.026323269 +0000 UTC m=+0.089068890 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, tcib_managed=true, config_id=tripleo_step3, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1) Nov 23 03:08:32 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:08:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:08:37 localhost podman[64111]: 2025-11-23 08:08:37.020755124 +0000 UTC m=+0.077218483 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, 
container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, config_id=tripleo_step1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z) Nov 23 03:08:37 localhost podman[64111]: 2025-11-23 08:08:37.219283848 +0000 UTC m=+0.275747146 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1) Nov 23 03:08:37 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:09:00 localhost sshd[64217]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:09:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. 
Nov 23 03:09:01 localhost podman[64218]: 2025-11-23 08:09:01.042143971 +0000 UTC m=+0.093225325 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step3) Nov 23 03:09:01 localhost podman[64218]: 2025-11-23 08:09:01.049546525 +0000 UTC m=+0.100627869 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, name=rhosp17/openstack-collectd, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, release=1761123044) Nov 23 03:09:01 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:09:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:09:03 localhost systemd[1]: tmp-crun.63uZ70.mount: Deactivated successfully. Nov 23 03:09:03 localhost podman[64240]: 2025-11-23 08:09:03.015588654 +0000 UTC m=+0.073839920 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4) Nov 23 03:09:03 localhost podman[64240]: 2025-11-23 08:09:03.054393285 +0000 UTC m=+0.112644521 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, container_name=iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:09:03 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:09:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:09:08 localhost podman[64260]: 2025-11-23 08:09:08.028352822 +0000 UTC m=+0.085559423 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, version=17.1.12, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:09:08 localhost podman[64260]: 2025-11-23 08:09:08.223566705 +0000 UTC m=+0.280773296 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step1, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, distribution-scope=public, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:09:08 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:09:23 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:72:a3:51 MACPROTO=0800 SRC=100.29.192.117 DST=38.102.83.198 LEN=44 TOS=0x00 PREC=0x00 TTL=243 ID=54321 PROTO=TCP SPT=19431 DPT=9090 SEQ=0 ACK=0 WINDOW=65535 RES=0x00 SYN URGP=0 OPT (020405B4) Nov 23 03:09:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:09:32 localhost systemd[1]: tmp-crun.ikiX8n.mount: Deactivated successfully. 
Nov 23 03:09:32 localhost podman[64289]: 2025-11-23 08:09:32.036961598 +0000 UTC m=+0.092239385 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, architecture=x86_64, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com) Nov 23 03:09:32 localhost podman[64289]: 2025-11-23 08:09:32.046348361 +0000 UTC m=+0.101626158 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, distribution-scope=public, release=1761123044, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 
'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64) Nov 23 03:09:32 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:09:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:09:34 localhost podman[64309]: 2025-11-23 08:09:34.02567698 +0000 UTC m=+0.079828930 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, container_name=iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 23 03:09:34 localhost podman[64309]: 2025-11-23 08:09:34.03459856 +0000 UTC m=+0.088750540 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 23 03:09:34 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:09:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:09:39 localhost podman[64329]: 2025-11-23 08:09:39.032844055 +0000 UTC m=+0.090199360 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, 
url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:09:39 localhost podman[64329]: 2025-11-23 08:09:39.222155484 +0000 UTC m=+0.279510739 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, release=1761123044, config_id=tripleo_step1, batch=17.1_20251118.1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com) Nov 23 03:09:39 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:10:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:10:03 localhost systemd[1]: tmp-crun.U8eznj.mount: Deactivated successfully. 
Nov 23 03:10:03 localhost podman[64433]: 2025-11-23 08:10:03.029421493 +0000 UTC m=+0.090405076 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, container_name=collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, tcib_managed=true) Nov 23 03:10:03 localhost podman[64433]: 2025-11-23 08:10:03.035814828 +0000 UTC m=+0.096798411 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, release=1761123044) Nov 23 03:10:03 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:10:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:10:05 localhost systemd[1]: tmp-crun.NUhv9U.mount: Deactivated successfully. 
Nov 23 03:10:05 localhost podman[64453]: 2025-11-23 08:10:05.012540074 +0000 UTC m=+0.070179879 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-iscsid, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:10:05 localhost podman[64453]: 2025-11-23 08:10:05.021614121 +0000 UTC m=+0.079253996 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:10:05 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:10:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:10:10 localhost podman[64473]: 2025-11-23 08:10:10.019021489 +0000 UTC m=+0.077239825 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:10:10 localhost podman[64473]: 2025-11-23 08:10:10.23104757 +0000 UTC m=+0.289265866 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, config_id=tripleo_step1, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64) Nov 23 03:10:10 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:10:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:10:34 localhost systemd[1]: tmp-crun.lqka5K.mount: Deactivated successfully. 
Nov 23 03:10:34 localhost podman[64501]: 2025-11-23 08:10:34.022636871 +0000 UTC m=+0.084193867 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:10:34 localhost podman[64501]: 2025-11-23 08:10:34.032114101 +0000 UTC m=+0.093671017 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1761123044, container_name=collectd, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, version=17.1.12, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:10:34 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:10:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:10:36 localhost podman[64521]: 2025-11-23 08:10:36.018850851 +0000 UTC m=+0.076905785 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T23:44:13Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team) Nov 23 03:10:36 localhost podman[64521]: 2025-11-23 08:10:36.026562217 +0000 UTC m=+0.084617121 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, release=1761123044, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 23 03:10:36 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:10:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:10:41 localhost systemd[1]: tmp-crun.rglPxe.mount: Deactivated successfully. 
Nov 23 03:10:41 localhost podman[64540]: 2025-11-23 08:10:41.015866705 +0000 UTC m=+0.077319438 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:10:41 localhost podman[64540]: 2025-11-23 08:10:41.236417216 +0000 UTC m=+0.297869969 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step1, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:10:41 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:11:04 localhost sshd[64697]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:11:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. 
Nov 23 03:11:05 localhost podman[64698]: 2025-11-23 08:11:05.032166343 +0000 UTC m=+0.082595848 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-type=git, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:11:05 localhost podman[64698]: 2025-11-23 08:11:05.072381399 +0000 UTC m=+0.122810894 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1761123044, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step3) Nov 23 03:11:05 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:11:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:11:07 localhost podman[64719]: 2025-11-23 08:11:07.025513694 +0000 UTC m=+0.078706390 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible) Nov 23 03:11:07 localhost podman[64719]: 2025-11-23 08:11:07.034241309 +0000 UTC m=+0.087433945 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat 
OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 23 03:11:07 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:11:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:11:12 localhost podman[64738]: 2025-11-23 08:11:12.013774721 +0000 UTC m=+0.074946336 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, url=https://www.redhat.com, 
container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Nov 23 03:11:12 localhost podman[64738]: 2025-11-23 08:11:12.233372783 +0000 UTC m=+0.294544368 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:11:12 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:11:24 localhost python3[64814]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:11:24 localhost python3[64859]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885484.2567315-107579-259519731031558/source _original_basename=tmpkz8lkwxt follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:11:25 localhost python3[64921]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:11:26 localhost python3[64964]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885485.5420556-107660-163651403600910/source _original_basename=tmpejbk836s follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:11:26 localhost python3[65026]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:11:27 localhost python3[65069]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885486.4869795-107749-183953673592417/source _original_basename=tmp_r7_govb follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:11:27 localhost python3[65131]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:11:28 localhost python3[65174]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885487.342244-107815-178809715685081/source _original_basename=tmpbo677tha follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:11:28 localhost python3[65204]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 23 03:11:28 localhost systemd[1]: Reloading.
Nov 23 03:11:28 localhost systemd-rc-local-generator[65229]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:11:28 localhost systemd-sysv-generator[65232]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:11:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:11:28 localhost systemd[1]: Reloading.
Nov 23 03:11:29 localhost systemd-sysv-generator[65272]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:11:29 localhost systemd-rc-local-generator[65268]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:11:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:11:29 localhost python3[65296]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 03:11:29 localhost systemd[1]: Reloading.
Nov 23 03:11:29 localhost systemd-rc-local-generator[65321]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:11:29 localhost systemd-sysv-generator[65324]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:11:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:11:30 localhost systemd[1]: Reloading.
Nov 23 03:11:30 localhost systemd-sysv-generator[65363]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:11:30 localhost systemd-rc-local-generator[65358]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:11:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:11:30 localhost systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m.
Nov 23 03:11:30 localhost python3[65386]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 03:11:30 localhost systemd[1]: Reloading.
Nov 23 03:11:30 localhost systemd-rc-local-generator[65411]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:11:30 localhost systemd-sysv-generator[65414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:11:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:11:31 localhost python3[65470]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:11:31 localhost python3[65513]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885491.1600728-107951-109599284716971/source _original_basename=tmp6m_gek2v follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:11:32 localhost python3[65543]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 03:11:32 localhost systemd[1]: Reloading.
Nov 23 03:11:32 localhost systemd-sysv-generator[65575]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:11:32 localhost systemd-rc-local-generator[65568]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:11:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:11:32 localhost systemd[1]: Reached target tripleo_nova_libvirt.target.
Nov 23 03:11:33 localhost python3[65597]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 03:11:34 localhost ansible-async_wrapper.py[65769]: Invoked with 594696035795 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885494.1805284-108061-154113572429303/AnsiballZ_command.py _
Nov 23 03:11:34 localhost ansible-async_wrapper.py[65772]: Starting module and watcher
Nov 23 03:11:34 localhost ansible-async_wrapper.py[65772]: Start watching 65773 (3600)
Nov 23 03:11:34 localhost ansible-async_wrapper.py[65773]: Start module (65773)
Nov 23 03:11:34 localhost ansible-async_wrapper.py[65769]: Return async_wrapper task started.
Nov 23 03:11:35 localhost python3[65793]: ansible-ansible.legacy.async_status Invoked with jid=594696035795.65769 mode=status _async_dir=/tmp/.ansible_async
Nov 23 03:11:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 03:11:36 localhost podman[65807]: 2025-11-23 08:11:36.025583924 +0000 UTC m=+0.080427643 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step3, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, distribution-scope=public, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=) Nov 23 03:11:36 localhost podman[65807]: 2025-11-23 08:11:36.03924452 +0000 UTC m=+0.094088259 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, container_name=collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:11:36 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:11:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:11:37 localhost podman[65863]: 2025-11-23 08:11:37.41630917 +0000 UTC m=+0.084915140 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, architecture=x86_64, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 23 03:11:37 localhost podman[65863]: 2025-11-23 08:11:37.422118127 +0000 UTC m=+0.090724077 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid) Nov 23 03:11:37 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:11:38 localhost puppet-user[65792]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 23 03:11:38 localhost puppet-user[65792]: (file: /etc/puppet/hiera.yaml) Nov 23 03:11:38 localhost puppet-user[65792]: Warning: Undefined variable '::deploy_config_name'; Nov 23 03:11:38 localhost puppet-user[65792]: (file & line not available) Nov 23 03:11:38 localhost puppet-user[65792]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 23 03:11:38 localhost puppet-user[65792]: (file & line not available) Nov 23 03:11:38 localhost puppet-user[65792]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Nov 23 03:11:38 localhost puppet-user[65792]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 23 03:11:38 localhost puppet-user[65792]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 23 03:11:38 localhost puppet-user[65792]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 23 03:11:38 localhost puppet-user[65792]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 23 03:11:38 localhost puppet-user[65792]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 23 03:11:38 localhost puppet-user[65792]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 23 03:11:38 localhost puppet-user[65792]: with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. 
at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 23 03:11:38 localhost puppet-user[65792]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 23 03:11:38 localhost puppet-user[65792]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 23 03:11:38 localhost puppet-user[65792]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 23 03:11:38 localhost puppet-user[65792]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 23 03:11:38 localhost puppet-user[65792]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 23 03:11:38 localhost puppet-user[65792]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 23 03:11:38 localhost puppet-user[65792]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 23 03:11:38 localhost puppet-user[65792]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 23 03:11:38 localhost puppet-user[65792]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 23 03:11:38 localhost puppet-user[65792]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 23 03:11:38 localhost puppet-user[65792]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Nov 23 03:11:38 localhost puppet-user[65792]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.21 seconds Nov 23 03:11:39 localhost ansible-async_wrapper.py[65772]: 65773 still running (3600) Nov 23 03:11:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:11:43 localhost podman[65952]: 2025-11-23 08:11:43.029630926 +0000 UTC m=+0.082438613 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 23 03:11:43 localhost podman[65952]: 2025-11-23 08:11:43.217723398 +0000 UTC m=+0.270531005 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:11:43 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:11:44 localhost ansible-async_wrapper.py[65772]: 65773 still running (3595)
Nov 23 03:11:45 localhost python3[66056]: ansible-ansible.legacy.async_status Invoked with jid=594696035795.65769 mode=status _async_dir=/tmp/.ansible_async
Nov 23 03:11:47 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 03:11:47 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 23 03:11:47 localhost systemd[1]: Reloading.
Nov 23 03:11:47 localhost systemd-sysv-generator[66150]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:11:47 localhost systemd-rc-local-generator[66147]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:11:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:11:47 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 03:11:48 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 03:11:48 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 23 03:11:48 localhost systemd[1]: man-db-cache-update.service: Consumed 1.127s CPU time.
Nov 23 03:11:48 localhost systemd[1]: run-ra987bbe231014b0d8198b24404565d19.service: Deactivated successfully.
Nov 23 03:11:48 localhost puppet-user[65792]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Nov 23 03:11:48 localhost puppet-user[65792]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}10c7cbde53c4e0c35154236143cc1ebe7f111a329076e2d1cfa3e9aea340c260'
Nov 23 03:11:48 localhost puppet-user[65792]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Nov 23 03:11:48 localhost puppet-user[65792]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Nov 23 03:11:48 localhost puppet-user[65792]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Nov 23 03:11:48 localhost puppet-user[65792]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Nov 23 03:11:49 localhost ansible-async_wrapper.py[65772]: 65773 still running (3590)
Nov 23 03:11:53 localhost puppet-user[65792]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Nov 23 03:11:54 localhost systemd[1]: Reloading.
Nov 23 03:11:54 localhost systemd-rc-local-generator[67446]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:11:54 localhost systemd-sysv-generator[67449]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:11:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:11:54 localhost systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Nov 23 03:11:54 localhost snmpd[67457]: Can't find directory of RPM packages
Nov 23 03:11:54 localhost snmpd[67457]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Nov 23 03:11:54 localhost systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Nov 23 03:11:54 localhost systemd[1]: Reloading.
Nov 23 03:11:54 localhost systemd-rc-local-generator[67481]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:11:54 localhost systemd-sysv-generator[67487]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:11:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:11:54 localhost ansible-async_wrapper.py[65772]: 65773 still running (3585)
Nov 23 03:11:54 localhost systemd[1]: Reloading.
Nov 23 03:11:54 localhost systemd-rc-local-generator[67519]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:11:54 localhost systemd-sysv-generator[67522]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:11:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:11:55 localhost puppet-user[65792]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Nov 23 03:11:55 localhost puppet-user[65792]: Notice: Applied catalog in 16.41 seconds
Nov 23 03:11:55 localhost puppet-user[65792]: Application:
Nov 23 03:11:55 localhost puppet-user[65792]: Initial environment: production
Nov 23 03:11:55 localhost puppet-user[65792]: Converged environment: production
Nov 23 03:11:55 localhost puppet-user[65792]: Run mode: user
Nov 23 03:11:55 localhost puppet-user[65792]: Changes:
Nov 23 03:11:55 localhost puppet-user[65792]: Total: 8
Nov 23 03:11:55 localhost puppet-user[65792]: Events:
Nov 23 03:11:55 localhost puppet-user[65792]: Success: 8
Nov 23 03:11:55 localhost puppet-user[65792]: Total: 8
Nov 23 03:11:55 localhost puppet-user[65792]: Resources:
Nov 23 03:11:55 localhost puppet-user[65792]: Restarted: 1
Nov 23 03:11:55 localhost puppet-user[65792]: Changed: 8
Nov 23 03:11:55 localhost puppet-user[65792]: Out of sync: 8
Nov 23 03:11:55 localhost puppet-user[65792]: Total: 19
Nov 23 03:11:55 localhost puppet-user[65792]: Time:
Nov 23 03:11:55 localhost puppet-user[65792]: Filebucket: 0.00
Nov 23 03:11:55 localhost puppet-user[65792]: Schedule: 0.00
Nov 23 03:11:55 localhost puppet-user[65792]: Augeas: 0.01
Nov 23 03:11:55 localhost puppet-user[65792]: File: 0.10
Nov 23 03:11:55 localhost puppet-user[65792]: Config retrieval: 0.27
Nov 23 03:11:55 localhost puppet-user[65792]: Service: 1.23
Nov 23 03:11:55 localhost puppet-user[65792]: Transaction evaluation: 16.40
Nov 23 03:11:55 localhost puppet-user[65792]: Catalog application: 16.41
Nov 23 03:11:55 localhost puppet-user[65792]: Last run: 1763885515
Nov 23 03:11:55 localhost puppet-user[65792]: Exec: 5.07
Nov 23 03:11:55 localhost puppet-user[65792]: Package: 9.83
Nov 23 03:11:55 localhost puppet-user[65792]: Total: 16.41
Nov 23 03:11:55 localhost puppet-user[65792]: Version:
Nov 23 03:11:55 localhost puppet-user[65792]: Config: 1763885498
Nov 23 03:11:55 localhost puppet-user[65792]: Puppet: 7.10.0
Nov 23 03:11:55 localhost ansible-async_wrapper.py[65773]: Module complete (65773)
Nov 23 03:11:55 localhost python3[67561]: ansible-ansible.legacy.async_status Invoked with jid=594696035795.65769 mode=status _async_dir=/tmp/.ansible_async
Nov 23 03:11:56 localhost python3[67577]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 03:11:56 localhost python3[67593]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 03:11:57 localhost python3[67643]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:11:57 localhost python3[67661]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpjqih1ckc recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 03:11:57 localhost python3[67691]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:11:58 localhost python3[67794]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 23 03:11:59 localhost ansible-async_wrapper.py[65772]: Done in kid B.
Nov 23 03:11:59 localhost python3[67813]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:00 localhost python3[67845]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 03:12:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 03:12:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 4446 writes, 20K keys, 4446 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4446 writes, 451 syncs, 9.86 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 8 writes, 16 keys, 8 commit groups, 1.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 8 writes, 4 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 03:12:01 localhost python3[67895]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:12:01 localhost python3[67913]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:02 localhost python3[67975]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:12:02 localhost python3[67993]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:02 localhost python3[68055]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:12:03 localhost python3[68073]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:03 localhost python3[68135]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:12:03 localhost python3[68153]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:04 localhost python3[68183]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 03:12:04 localhost systemd[1]: Reloading.
Nov 23 03:12:04 localhost systemd-sysv-generator[68209]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:12:04 localhost systemd-rc-local-generator[68205]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:12:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:12:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 03:12:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 5196 writes, 22K keys, 5196 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5196 writes, 612 syncs, 8.49 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 72 writes, 104 keys, 72 commit groups, 1.0 writes per commit group, ingest: 0.03 MB, 0.00 MB/s#012Interval WAL: 73 writes, 37 syncs, 1.97 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 03:12:05 localhost python3[68269]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:12:05 localhost python3[68287]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:06 localhost python3[68349]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 03:12:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 03:12:06 localhost systemd[1]: tmp-crun.69qU5i.mount: Deactivated successfully. Nov 23 03:12:06 localhost podman[68368]: 2025-11-23 08:12:06.335135351 +0000 UTC m=+0.096543354 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, container_name=collectd, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true) Nov 23 03:12:06 localhost podman[68368]: 2025-11-23 08:12:06.375284044 +0000 UTC m=+0.136692057 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z) Nov 23 03:12:06 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:12:06 localhost python3[68367]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:12:06 localhost python3[68416]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:12:06 localhost systemd[1]: Reloading. Nov 23 03:12:07 localhost systemd-rc-local-generator[68442]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:12:07 localhost systemd-sysv-generator[68446]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:12:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:12:07 localhost systemd[1]: Starting Create netns directory... Nov 23 03:12:07 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 23 03:12:07 localhost systemd[1]: Finished Create netns directory. Nov 23 03:12:07 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 23 03:12:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:12:07 localhost podman[68472]: 2025-11-23 08:12:07.719809132 +0000 UTC m=+0.092643425 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, architecture=x86_64, container_name=iscsid) Nov 23 03:12:07 localhost podman[68472]: 2025-11-23 08:12:07.735384756 +0000 UTC m=+0.108219079 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:12:07 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:12:07 localhost python3[68473]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Nov 23 03:12:09 localhost python3[68551]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Nov 23 03:12:09 localhost podman[68702]: 2025-11-23 08:12:09.798396842 +0000 UTC m=+0.070610873 container create d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, 
name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12) Nov 23 03:12:09 localhost podman[68732]: 2025-11-23 08:12:09.827087656 +0000 UTC m=+0.068677614 container create 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, 
build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:12:09 localhost podman[68727]: 2025-11-23 08:12:09.845141547 +0000 UTC m=+0.091069357 container create b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera 
ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2025-11-18T23:34:05Z, container_name=configure_cms_options, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller) Nov 23 03:12:09 localhost systemd[1]: Started libpod-conmon-d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.scope. Nov 23 03:12:09 localhost podman[68702]: 2025-11-23 08:12:09.763272831 +0000 UTC m=+0.035486892 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Nov 23 03:12:09 localhost podman[68731]: 2025-11-23 08:12:09.868176278 +0000 UTC m=+0.110168119 container create c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, container_name=nova_libvirt_init_secret, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', 
'/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 23 03:12:09 localhost systemd[1]: Started libpod-conmon-53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.scope. Nov 23 03:12:09 localhost systemd[1]: Started libpod-conmon-b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1.scope. Nov 23 03:12:09 localhost systemd[1]: Started libcrun container. Nov 23 03:12:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8785be8dea5fa0361315af1fc74fe453e62e737d2ff3d773f6811b45d15cd9a/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:09 localhost systemd[1]: Started libcrun container. 
Nov 23 03:12:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/503e987379308b9e6b9946670c4ac6382bcf032235dd53dd51b9045ac75aedc7/merged/var/log/containers supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:09 localhost systemd[1]: Started libcrun container. Nov 23 03:12:09 localhost podman[68749]: 2025-11-23 08:12:09.889556339 +0000 UTC m=+0.119077979 container create 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:12:09 localhost podman[68731]: 2025-11-23 08:12:09.790601594 +0000 UTC m=+0.032593445 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 23 03:12:09 localhost systemd[1]: Started libpod-conmon-c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b.scope. 
Nov 23 03:12:09 localhost podman[68732]: 2025-11-23 08:12:09.794740281 +0000 UTC m=+0.036330209 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Nov 23 03:12:09 localhost podman[68727]: 2025-11-23 08:12:09.897377178 +0000 UTC m=+0.143304988 container init b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, release=1761123044, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, container_name=configure_cms_options, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc.) 
Nov 23 03:12:09 localhost podman[68727]: 2025-11-23 08:12:09.797874986 +0000 UTC m=+0.043802816 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 23 03:12:09 localhost podman[68727]: 2025-11-23 08:12:09.906212717 +0000 UTC m=+0.152140557 container start b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, container_name=configure_cms_options, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 23 03:12:09 localhost podman[68727]: 2025-11-23 08:12:09.906672981 +0000 UTC m=+0.152600811 container attach b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, 
config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, 
container_name=configure_cms_options, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4) Nov 23 03:12:09 localhost systemd[1]: Started libcrun container. Nov 23 03:12:09 localhost systemd[1]: Started libpod-conmon-6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.scope. Nov 23 03:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:12:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c95810012fd778200d1d5a4bd21660a3c4c1d69fbb62fb4075810a9d5203bef5/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c95810012fd778200d1d5a4bd21660a3c4c1d69fbb62fb4075810a9d5203bef5/merged/etc/nova supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c95810012fd778200d1d5a4bd21660a3c4c1d69fbb62fb4075810a9d5203bef5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:09 localhost podman[68702]: 2025-11-23 08:12:09.920026498 +0000 UTC m=+0.192240519 container init d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:12:09 localhost podman[68731]: 
2025-11-23 08:12:09.924133844 +0000 UTC m=+0.166125675 container init c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 
nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=nova_libvirt_init_secret, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1) Nov 23 03:12:09 localhost systemd[1]: Started libcrun container. Nov 23 03:12:09 localhost podman[68731]: 2025-11-23 08:12:09.932515899 +0000 UTC m=+0.174507760 container start c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, vcs-type=git, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=nova_libvirt_init_secret, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt) Nov 23 03:12:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43834aabac3051c95b0bd48b6a3d7296604e45656eac8be0b6aa4803a8bc68b2/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:09 localhost podman[68731]: 2025-11-23 08:12:09.932812458 +0000 UTC m=+0.174804289 container attach c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, release=1761123044, build-date=2025-11-19T00:35:22Z, container_name=nova_libvirt_init_secret, description=Red Hat OpenStack 
Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, version=17.1.12, config_id=tripleo_step4) Nov 23 03:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:12:09 localhost podman[68702]: 2025-11-23 08:12:09.936322905 +0000 UTC m=+0.208536926 container start d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, build-date=2025-11-19T00:12:45Z, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:12:09 localhost python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1bd1f352f264f24512a1a2440e47a1f5 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Nov 23 03:12:09 localhost podman[68749]: 2025-11-23 08:12:09.849629063 +0000 UTC m=+0.079150713 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Nov 23 03:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:12:09 localhost podman[68732]: 2025-11-23 08:12:09.966282218 +0000 UTC m=+0.207872176 container init 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 23 03:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:12:09 localhost podman[68749]: 2025-11-23 08:12:09.969157366 +0000 UTC m=+0.198679046 container init 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:12:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. 
Nov 23 03:12:10 localhost podman[68749]: 2025-11-23 08:12:10.00704761 +0000 UTC m=+0.236569250 container start 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible) Nov 23 03:12:10 localhost python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1bd1f352f264f24512a1a2440e47a1f5 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Nov 23 03:12:10 localhost ovs-vsctl[68863]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options Nov 23 03:12:10 localhost systemd[1]: libpod-b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1.scope: Deactivated successfully. 
Nov 23 03:12:10 localhost podman[68727]: 2025-11-23 08:12:10.025037949 +0000 UTC m=+0.270965779 container died b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, container_name=configure_cms_options, io.openshift.expose-services=, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:12:10 localhost podman[68732]: 2025-11-23 08:12:10.042564313 +0000 UTC m=+0.284154221 container start 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, architecture=x86_64) Nov 23 03:12:10 localhost python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Nov 23 03:12:10 localhost systemd[1]: libpod-c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b.scope: Deactivated successfully. 
Nov 23 03:12:10 localhost podman[68854]: 2025-11-23 08:12:10.1431923 +0000 UTC m=+0.129488787 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true) Nov 23 03:12:10 localhost podman[68882]: 2025-11-23 08:12:10.166783319 +0000 UTC m=+0.130041174 container cleanup b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=configure_cms_options, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Nov 23 03:12:10 localhost podman[68843]: 2025-11-23 08:12:10.065570214 +0000 UTC m=+0.077165472 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:12:10 localhost systemd[1]: 
libpod-conmon-b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1.scope: Deactivated successfully. Nov 23 03:12:10 localhost podman[68811]: 2025-11-23 08:12:10.025352419 +0000 UTC m=+0.082006330 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, vcs-type=git, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:12:10 localhost podman[68854]: 2025-11-23 08:12:10.182192239 +0000 UTC m=+0.168488726 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, 
konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044) Nov 23 03:12:10 localhost podman[68854]: unhealthy Nov 23 03:12:10 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:12:10 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Failed with result 'exit-code'. 
Nov 23 03:12:10 localhost podman[68843]: 2025-11-23 08:12:10.200227488 +0000 UTC m=+0.211822766 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64) Nov 23 03:12:10 localhost podman[68811]: 2025-11-23 08:12:10.210145551 +0000 UTC m=+0.266799482 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, 
build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:12:10 localhost podman[68811]: unhealthy Nov 23 03:12:10 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:12:10 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:12:10 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Failed with result 'exit-code'. 
Nov 23 03:12:10 localhost python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1763883761 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi Nov 23 03:12:10 localhost podman[68731]: 2025-11-23 08:12:10.278856614 +0000 UTC m=+0.520848465 container died c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, config_id=tripleo_step4, container_name=nova_libvirt_init_secret, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, tcib_managed=true, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 
'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:12:10 localhost podman[68918]: 2025-11-23 08:12:10.415180579 +0000 UTC m=+0.329676979 container cleanup c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, container_name=nova_libvirt_init_secret, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:35:22Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container) Nov 23 03:12:10 localhost systemd[1]: libpod-conmon-c0a5d02b387b6765c514c636a15092bee1bd1dfd1469ed56120fe099d9455e6b.scope: 
Deactivated successfully. Nov 23 03:12:10 localhost python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack Nov 23 03:12:10 localhost podman[69077]: 2025-11-23 08:12:10.514133625 +0000 UTC m=+0.070955023 container create e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., release=1761123044, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=) Nov 23 03:12:10 localhost podman[69089]: 2025-11-23 08:12:10.540755257 +0000 UTC m=+0.079281237 container create 046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, 
maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=setup_ovs_manager, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:12:10 localhost systemd[1]: Started libpod-conmon-e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.scope. Nov 23 03:12:10 localhost systemd[1]: Started libpod-conmon-046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2.scope. Nov 23 03:12:10 localhost systemd[1]: Started libcrun container. Nov 23 03:12:10 localhost systemd[1]: Started libcrun container. Nov 23 03:12:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5ec9c7891f4ca72bdb5effe2aebfd354dedf1d89abf9e4437e67303ad6ef96e5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:10 localhost podman[69077]: 2025-11-23 08:12:10.474198488 +0000 UTC m=+0.031019916 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 23 03:12:10 localhost podman[69089]: 2025-11-23 08:12:10.578769905 +0000 UTC m=+0.117295885 container init 046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, container_name=setup_ovs_manager, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Nov 23 03:12:10 localhost podman[69089]: 2025-11-23 08:12:10.586748438 +0000 UTC m=+0.125274428 container start 046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, 
name=setup_ovs_manager, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=setup_ovs_manager, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:12:10 localhost podman[69089]: 2025-11-23 08:12:10.586965035 +0000 UTC m=+0.125491045 container attach 046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=setup_ovs_manager, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, distribution-scope=public, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:12:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:12:10 localhost podman[69077]: 2025-11-23 08:12:10.589286445 +0000 UTC m=+0.146107863 container init e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Nov 23 03:12:10 localhost podman[69089]: 2025-11-23 08:12:10.492136914 +0000 UTC m=+0.030662944 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Nov 23 03:12:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:12:10 localhost podman[69077]: 2025-11-23 08:12:10.614006999 +0000 UTC m=+0.170828407 container start e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:12:10 localhost python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label 
managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro 
--volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 23 03:12:10 localhost podman[69145]: 2025-11-23 08:12:10.750042916 +0000 UTC m=+0.129424717 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, container_name=nova_migration_target, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:12:10 localhost systemd[1]: var-lib-containers-storage-overlay-63ab7b1688305e5f88e4974e557ea0bb87f0e73ce1236c8e61a48437546ff0ac-merged.mount: Deactivated successfully. Nov 23 03:12:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b4e69aa47f38c02c76a4d9a5fe6904664766b38b13604b993df21cf4a8a91bd1-userdata-shm.mount: Deactivated successfully. 
Nov 23 03:12:11 localhost podman[69145]: 2025-11-23 08:12:11.094660048 +0000 UTC m=+0.474041819 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:12:11 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:12:11 localhost kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure Nov 23 03:12:13 localhost ovs-vsctl[69322]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager Nov 23 03:12:13 localhost systemd[1]: libpod-046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2.scope: Deactivated successfully. Nov 23 03:12:13 localhost systemd[1]: libpod-046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2.scope: Consumed 2.918s CPU time. 
Nov 23 03:12:13 localhost podman[69089]: 2025-11-23 08:12:13.586389569 +0000 UTC m=+3.124915609 container died 046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=setup_ovs_manager, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, 
distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 23 03:12:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:12:13 localhost systemd[1]: tmp-crun.AyrjvV.mount: Deactivated successfully. Nov 23 03:12:13 localhost podman[69324]: 2025-11-23 08:12:13.680477437 +0000 UTC m=+0.074225124 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, container_name=metrics_qdr, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, 
release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container) Nov 23 03:12:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2-userdata-shm.mount: Deactivated successfully. 
Nov 23 03:12:13 localhost podman[69323]: 2025-11-23 08:12:13.727249081 +0000 UTC m=+0.127883187 container cleanup 046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, architecture=x86_64, distribution-scope=public, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=setup_ovs_manager, url=https://www.redhat.com) Nov 23 03:12:13 localhost systemd[1]: libpod-conmon-046f5ea250943b9712fc76326f2e99a326a12b1ae8ac9a850d4b62da6b0c0cd2.scope: Deactivated successfully. Nov 23 03:12:13 localhost python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1763883761 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1763883761'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata Nov 23 03:12:13 localhost podman[69324]: 2025-11-23 08:12:13.839346648 +0000 UTC m=+0.233094315 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:12:13 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:12:14 localhost podman[69460]: 2025-11-23 08:12:14.154910546 +0000 UTC m=+0.063679772 container create 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Nov 23 03:12:14 localhost systemd[1]: Started libpod-conmon-99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.scope. Nov 23 03:12:14 localhost systemd[1]: Started libcrun container. Nov 23 03:12:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0462c731844bb59d1ec529f77837ece08511a3108ad760cbddff4f0512d4199/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0462c731844bb59d1ec529f77837ece08511a3108ad760cbddff4f0512d4199/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0462c731844bb59d1ec529f77837ece08511a3108ad760cbddff4f0512d4199/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:14 localhost podman[69460]: 2025-11-23 08:12:14.124640643 +0000 UTC m=+0.033409909 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 23 03:12:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:12:14 localhost podman[69460]: 2025-11-23 08:12:14.25182957 +0000 UTC m=+0.160598826 container init 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.12, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Nov 23 03:12:14 localhost podman[69476]: 2025-11-23 08:12:14.279371858 +0000 UTC m=+0.147894467 container create 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, container_name=ovn_metadata_agent) Nov 23 03:12:14 localhost podman[69476]: 2025-11-23 08:12:14.1839429 +0000 UTC m=+0.052465569 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Nov 23 03:12:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:12:14 localhost systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring. 
Nov 23 03:12:14 localhost podman[69460]: 2025-11-23 08:12:14.289089605 +0000 UTC m=+0.197858861 container start 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container) Nov 23 03:12:14 localhost python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 23 03:12:14 localhost systemd[1]: Created slice User Slice of UID 0. Nov 23 03:12:14 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Nov 23 03:12:14 localhost systemd[1]: Started libpod-conmon-5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.scope. 
Nov 23 03:12:14 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Nov 23 03:12:14 localhost systemd[1]: Starting User Manager for UID 0... Nov 23 03:12:14 localhost systemd[1]: Started libcrun container. Nov 23 03:12:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5531cd55c1eaaae58642503ef766c6bc4c165d2df9c8d9a0b2f16cdd36d9c0e9/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5531cd55c1eaaae58642503ef766c6bc4c165d2df9c8d9a0b2f16cdd36d9c0e9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5531cd55c1eaaae58642503ef766c6bc4c165d2df9c8d9a0b2f16cdd36d9c0e9/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:14 localhost podman[69499]: 2025-11-23 08:12:14.386205095 +0000 UTC m=+0.091167840 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vcs-type=git, container_name=ovn_controller, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:12:14 localhost podman[69499]: 2025-11-23 08:12:14.406202415 +0000 UTC m=+0.111165190 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=ovn_controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 23 03:12:14 localhost podman[69499]: unhealthy Nov 23 03:12:14 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:12:14 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:12:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. 
Nov 23 03:12:14 localhost podman[69476]: 2025-11-23 08:12:14.479986723 +0000 UTC m=+0.348509402 container init 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.expose-services=) Nov 23 03:12:14 localhost systemd[69521]: Queued start job for default target Main User Target. Nov 23 03:12:14 localhost systemd[69521]: Created slice User Application Slice. Nov 23 03:12:14 localhost systemd[69521]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Nov 23 03:12:14 localhost systemd[69521]: Started Daily Cleanup of User's Temporary Directories. Nov 23 03:12:14 localhost systemd[69521]: Reached target Paths. Nov 23 03:12:14 localhost systemd[69521]: Reached target Timers. Nov 23 03:12:14 localhost systemd[69521]: Starting D-Bus User Message Bus Socket... Nov 23 03:12:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:12:14 localhost systemd[69521]: Starting Create User's Volatile Files and Directories... 
Nov 23 03:12:14 localhost systemd[69521]: Finished Create User's Volatile Files and Directories. Nov 23 03:12:14 localhost systemd[69521]: Listening on D-Bus User Message Bus Socket. Nov 23 03:12:14 localhost systemd[69521]: Reached target Sockets. Nov 23 03:12:14 localhost systemd[69521]: Reached target Basic System. Nov 23 03:12:14 localhost systemd[69521]: Reached target Main User Target. Nov 23 03:12:14 localhost systemd[69521]: Startup finished in 155ms. Nov 23 03:12:14 localhost systemd[1]: Started User Manager for UID 0. Nov 23 03:12:14 localhost systemd[1]: Started Session c9 of User root. Nov 23 03:12:14 localhost systemd[1]: session-c9.scope: Deactivated successfully. Nov 23 03:12:14 localhost kernel: device br-int entered promiscuous mode Nov 23 03:12:14 localhost NetworkManager[5975]: [1763885534.6067] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11) Nov 23 03:12:14 localhost systemd-udevd[69589]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 03:12:14 localhost podman[69476]: 2025-11-23 08:12:14.62362208 +0000 UTC m=+0.492144729 container start 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044)
Nov 23 03:12:14 localhost python3[68551]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=a43bf0e2ecc9c9d02be7a27eac338b4c --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 23 03:12:14 localhost systemd[1]: var-lib-containers-storage-overlay-9cd49e7d070c0c00c09eb4d4067ba3757eea8902a6a55fed079db1124e7c8aad-merged.mount: Deactivated successfully.
Nov 23 03:12:14 localhost podman[69566]: 2025-11-23 08:12:14.664253999 +0000 UTC m=+0.140524083 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-19T00:14:25Z, distribution-scope=public, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc.)
Nov 23 03:12:14 localhost podman[69566]: 2025-11-23 08:12:14.711648934 +0000 UTC m=+0.187919078 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 23 03:12:14 localhost podman[69566]: unhealthy
Nov 23 03:12:14 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 03:12:14 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'.
Nov 23 03:12:15 localhost python3[69643]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:15 localhost kernel: device genev_sys_6081 entered promiscuous mode
Nov 23 03:12:15 localhost NetworkManager[5975]: [1763885535.2858] device (genev_sys_6081): carrier: link connected
Nov 23 03:12:15 localhost NetworkManager[5975]: [1763885535.2864] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12)
Nov 23 03:12:15 localhost python3[69661]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:15 localhost python3[69677]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:16 localhost python3[69693]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:16 localhost python3[69709]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:16 localhost python3[69728]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:16 localhost python3[69745]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 03:12:17 localhost python3[69761]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 03:12:17 localhost python3[69779]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 03:12:17 localhost python3[69797]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 03:12:17 localhost python3[69813]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 03:12:18 localhost python3[69829]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 03:12:18 localhost python3[69890]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885538.088235-109596-260958905306558/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:19 localhost python3[69919]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885538.088235-109596-260958905306558/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:19 localhost python3[69948]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885538.088235-109596-260958905306558/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:20 localhost python3[69977]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885538.088235-109596-260958905306558/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:20 localhost python3[70006]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885538.088235-109596-260958905306558/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:20 localhost python3[70035]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885538.088235-109596-260958905306558/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:21 localhost python3[70051]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 03:12:21 localhost systemd[1]: Reloading.
Nov 23 03:12:21 localhost systemd-rc-local-generator[70076]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:12:21 localhost systemd-sysv-generator[70080]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:12:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:12:22 localhost python3[70104]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 03:12:22 localhost systemd[1]: Reloading.
Nov 23 03:12:22 localhost systemd-sysv-generator[70134]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:12:22 localhost systemd-rc-local-generator[70130]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:12:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:12:22 localhost systemd[1]: Starting ceilometer_agent_compute container...
Nov 23 03:12:22 localhost tripleo-start-podman-container[70144]: Creating additional drop-in dependency for "ceilometer_agent_compute" (6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9)
Nov 23 03:12:22 localhost systemd[1]: Reloading.
Nov 23 03:12:22 localhost systemd-sysv-generator[70207]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:12:22 localhost systemd-rc-local-generator[70203]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:12:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:12:23 localhost systemd[1]: Started ceilometer_agent_compute container.
Nov 23 03:12:23 localhost python3[70229]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 03:12:23 localhost systemd[1]: Reloading.
Nov 23 03:12:24 localhost systemd-rc-local-generator[70254]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:12:24 localhost systemd-sysv-generator[70261]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:12:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:12:24 localhost systemd[1]: Starting ceilometer_agent_ipmi container...
Nov 23 03:12:24 localhost systemd[1]: Started ceilometer_agent_ipmi container.
Nov 23 03:12:24 localhost systemd[1]: Stopping User Manager for UID 0...
Nov 23 03:12:24 localhost systemd[69521]: Activating special unit Exit the Session...
Nov 23 03:12:24 localhost systemd[69521]: Stopped target Main User Target.
Nov 23 03:12:24 localhost systemd[69521]: Stopped target Basic System.
Nov 23 03:12:24 localhost systemd[69521]: Stopped target Paths.
Nov 23 03:12:24 localhost systemd[69521]: Stopped target Sockets.
Nov 23 03:12:24 localhost systemd[69521]: Stopped target Timers.
Nov 23 03:12:24 localhost systemd[69521]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 23 03:12:24 localhost systemd[69521]: Closed D-Bus User Message Bus Socket.
Nov 23 03:12:24 localhost systemd[69521]: Stopped Create User's Volatile Files and Directories.
Nov 23 03:12:24 localhost systemd[69521]: Removed slice User Application Slice.
Nov 23 03:12:24 localhost systemd[69521]: Reached target Shutdown.
Nov 23 03:12:24 localhost systemd[69521]: Finished Exit the Session.
Nov 23 03:12:24 localhost systemd[69521]: Reached target Exit the Session.
Nov 23 03:12:24 localhost systemd[1]: user@0.service: Deactivated successfully.
Nov 23 03:12:24 localhost systemd[1]: Stopped User Manager for UID 0.
Nov 23 03:12:24 localhost systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 23 03:12:24 localhost systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 23 03:12:24 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 23 03:12:24 localhost systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 23 03:12:24 localhost systemd[1]: Removed slice User Slice of UID 0.
Nov 23 03:12:24 localhost python3[70297]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 03:12:26 localhost systemd[1]: Reloading.
Nov 23 03:12:26 localhost systemd-rc-local-generator[70327]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:12:26 localhost systemd-sysv-generator[70332]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:12:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:12:26 localhost systemd[1]: Starting logrotate_crond container...
Nov 23 03:12:26 localhost systemd[1]: Started logrotate_crond container.
Nov 23 03:12:27 localhost python3[70366]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 03:12:27 localhost systemd[1]: Reloading.
Nov 23 03:12:27 localhost systemd-rc-local-generator[70391]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:12:27 localhost systemd-sysv-generator[70396]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:12:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:12:27 localhost systemd[1]: Starting nova_migration_target container...
Nov 23 03:12:27 localhost systemd[1]: Started nova_migration_target container.
Nov 23 03:12:28 localhost python3[70432]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 03:12:28 localhost systemd[1]: Reloading.
Nov 23 03:12:28 localhost systemd-rc-local-generator[70455]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:12:28 localhost systemd-sysv-generator[70459]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:12:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:12:28 localhost systemd[1]: Starting ovn_controller container...
Nov 23 03:12:28 localhost tripleo-start-podman-container[70471]: Creating additional drop-in dependency for "ovn_controller" (99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23)
Nov 23 03:12:28 localhost systemd[1]: Reloading.
Nov 23 03:12:28 localhost systemd-sysv-generator[70535]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:12:28 localhost systemd-rc-local-generator[70531]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:12:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:12:29 localhost systemd[1]: Started ovn_controller container.
Nov 23 03:12:29 localhost python3[70556]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 03:12:29 localhost systemd[1]: Reloading.
Nov 23 03:12:29 localhost systemd-sysv-generator[70586]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 03:12:29 localhost systemd-rc-local-generator[70582]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 03:12:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 03:12:30 localhost systemd[1]: Starting ovn_metadata_agent container...
Nov 23 03:12:30 localhost systemd[1]: Started ovn_metadata_agent container.
Nov 23 03:12:30 localhost python3[70637]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:31 localhost python3[70758]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005532585 step=4 update_config_hash_only=False
Nov 23 03:12:32 localhost python3[70774]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 03:12:32 localhost python3[70791]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 23 03:12:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.
Nov 23 03:12:37 localhost podman[70792]: 2025-11-23 08:12:37.024823496 +0000 UTC m=+0.077413121 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd)
Nov 23 03:12:37 localhost podman[70792]: 2025-11-23 08:12:37.037394359 +0000 UTC m=+0.089983994 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, container_name=collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Nov 23 03:12:37 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 03:12:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.
Nov 23 03:12:38 localhost podman[70812]: 2025-11-23 08:12:38.003016719 +0000 UTC m=+0.067515319 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, container_name=iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3) Nov 23 03:12:38 localhost podman[70812]: 2025-11-23 08:12:38.039389927 +0000 UTC m=+0.103888467 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, distribution-scope=public, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, container_name=iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:12:38 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:12:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:12:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:12:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:12:41 localhost systemd[1]: tmp-crun.PKd3dw.mount: Deactivated successfully. 
Nov 23 03:12:41 localhost podman[70835]: 2025-11-23 08:12:41.014922313 +0000 UTC m=+0.067978703 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team) Nov 23 03:12:41 localhost systemd[1]: tmp-crun.lm4kOZ.mount: Deactivated successfully. Nov 23 03:12:41 localhost podman[70834]: 2025-11-23 08:12:41.061654187 +0000 UTC m=+0.113706127 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible) Nov 23 03:12:41 localhost podman[70833]: 2025-11-23 08:12:41.061819322 +0000 UTC m=+0.118055939 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-cron, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, container_name=logrotate_crond, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public) Nov 23 03:12:41 localhost podman[70834]: 2025-11-23 08:12:41.121521101 +0000 UTC m=+0.173573091 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute) Nov 23 03:12:41 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:12:41 localhost podman[70833]: 2025-11-23 08:12:41.146356318 +0000 UTC m=+0.202592905 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64) Nov 23 03:12:41 localhost podman[70835]: 2025-11-23 08:12:41.144579874 +0000 UTC m=+0.197636294 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, 
architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12) Nov 23 03:12:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:12:41 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:12:41 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:12:41 localhost podman[70902]: 2025-11-23 08:12:41.241817198 +0000 UTC m=+0.075117511 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, container_name=nova_migration_target, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:12:41 localhost podman[70902]: 2025-11-23 08:12:41.616515058 +0000 UTC m=+0.449815391 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4) Nov 23 03:12:41 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. 
Nov 23 03:12:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:12:44 localhost systemd[1]: tmp-crun.zzxHOY.mount: Deactivated successfully. Nov 23 03:12:44 localhost podman[70924]: 2025-11-23 08:12:44.04629369 +0000 UTC m=+0.099628737 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:46Z, distribution-scope=public, container_name=metrics_qdr, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red 
Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, release=1761123044) Nov 23 03:12:44 localhost podman[70924]: 2025-11-23 08:12:44.256355242 +0000 UTC m=+0.309690199 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible) Nov 23 03:12:44 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:12:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:12:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:12:45 localhost systemd[1]: tmp-crun.v8Rvml.mount: Deactivated successfully. 
Nov 23 03:12:45 localhost podman[70952]: 2025-11-23 08:12:45.030547127 +0000 UTC m=+0.088313892 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 23 03:12:45 localhost systemd[1]: tmp-crun.Lz0prH.mount: Deactivated successfully. 
Nov 23 03:12:45 localhost podman[70953]: 2025-11-23 08:12:45.08314326 +0000 UTC m=+0.138788090 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 
17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git) Nov 23 03:12:45 localhost podman[70952]: 2025-11-23 08:12:45.094381313 +0000 UTC m=+0.152148048 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:12:45 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:12:45 localhost podman[70953]: 2025-11-23 08:12:45.109561326 +0000 UTC m=+0.165206186 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, container_name=ovn_controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 23 03:12:45 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:12:54 localhost snmpd[67457]: empty variable list in _query Nov 23 03:12:54 localhost snmpd[67457]: empty variable list in _query Nov 23 03:12:57 localhost podman[71187]: Nov 23 03:12:57 localhost podman[71187]: 2025-11-23 08:12:57.439059184 +0000 UTC m=+0.075650636 container create f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bhaskara, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, vendor=Red Hat, Inc., release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, ceph=True, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, RELEASE=main, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.openshift.expose-services=, io.openshift.tags=rhceph ceph) Nov 23 03:12:57 localhost systemd[1]: Started libpod-conmon-f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75.scope. Nov 23 03:12:57 localhost systemd[1]: Started libcrun container. 
Nov 23 03:12:57 localhost podman[71187]: 2025-11-23 08:12:57.407608836 +0000 UTC m=+0.044200328 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 03:12:57 localhost podman[71187]: 2025-11-23 08:12:57.548314193 +0000 UTC m=+0.184905645 container init f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bhaskara, io.openshift.expose-services=, ceph=True, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.33.12, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, name=rhceph, vendor=Red Hat, Inc.) 
Nov 23 03:12:57 localhost podman[71187]: 2025-11-23 08:12:57.55869897 +0000 UTC m=+0.195290432 container start f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bhaskara, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, release=553, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55) Nov 23 03:12:57 localhost podman[71187]: 2025-11-23 08:12:57.558981928 +0000 UTC m=+0.195573440 container attach f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bhaskara, io.buildah.version=1.33.12, release=553, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, ceph=True, build-date=2025-09-24T08:57:55, RELEASE=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., name=rhceph, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7) Nov 23 03:12:57 localhost elastic_bhaskara[71202]: 167 167 Nov 23 03:12:57 localhost systemd[1]: libpod-f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75.scope: Deactivated successfully. Nov 23 03:12:57 localhost podman[71187]: 2025-11-23 08:12:57.564476446 +0000 UTC m=+0.201067948 container died f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bhaskara, maintainer=Guillaume Abrioux , ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, build-date=2025-09-24T08:57:55, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph 
ceph, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, version=7) Nov 23 03:12:57 localhost podman[71207]: 2025-11-23 08:12:57.658655067 +0000 UTC m=+0.084313621 container remove f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bhaskara, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_CLEAN=True, RELEASE=main) Nov 23 03:12:57 localhost systemd[1]: libpod-conmon-f8f09b7934b3e6f44d11d973edd696203e4191454a35423acfcf43a59a2cff75.scope: Deactivated successfully. 
Nov 23 03:12:57 localhost podman[71229]: Nov 23 03:12:57 localhost podman[71229]: 2025-11-23 08:12:57.864945264 +0000 UTC m=+0.081566177 container create c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mcnulty, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_BRANCH=main) Nov 23 03:12:57 localhost systemd[1]: Started libpod-conmon-c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c.scope. Nov 23 03:12:57 localhost systemd[1]: Started libcrun container. 
Nov 23 03:12:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c085a9a9c0aff6988c158f605142398529d4fed48e4e5ddfd851d5d755c306dd/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c085a9a9c0aff6988c158f605142398529d4fed48e4e5ddfd851d5d755c306dd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c085a9a9c0aff6988c158f605142398529d4fed48e4e5ddfd851d5d755c306dd/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 23 03:12:57 localhost podman[71229]: 2025-11-23 08:12:57.831473394 +0000 UTC m=+0.048094307 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 03:12:57 localhost podman[71229]: 2025-11-23 08:12:57.931197924 +0000 UTC m=+0.147818847 container init c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mcnulty, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.openshift.expose-services=, release=553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vendor=Red Hat, Inc., GIT_CLEAN=True) Nov 23 03:12:57 localhost podman[71229]: 2025-11-23 08:12:57.938053162 +0000 UTC m=+0.154674075 container start c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mcnulty, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=553, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 03:12:57 localhost podman[71229]: 2025-11-23 08:12:57.938252708 +0000 UTC m=+0.154873631 container attach c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mcnulty, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_BRANCH=main, architecture=x86_64, build-date=2025-09-24T08:57:55, version=7, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=553, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, ceph=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vcs-type=git) Nov 23 03:12:58 localhost systemd[1]: var-lib-containers-storage-overlay-f5c66cdc19ec9d720046bbff2a6ffc9a66c1f02b6abc0c7fcab935f67be64f1a-merged.mount: Deactivated successfully. Nov 23 03:12:58 localhost boring_mcnulty[71245]: [ Nov 23 03:12:58 localhost boring_mcnulty[71245]: { Nov 23 03:12:58 localhost boring_mcnulty[71245]: "available": false, Nov 23 03:12:58 localhost boring_mcnulty[71245]: "ceph_device": false, Nov 23 03:12:58 localhost boring_mcnulty[71245]: "device_id": "QEMU_DVD-ROM_QM00001", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "lsm_data": {}, Nov 23 03:12:58 localhost boring_mcnulty[71245]: "lvs": [], Nov 23 03:12:58 localhost boring_mcnulty[71245]: "path": "/dev/sr0", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "rejected_reasons": [ Nov 23 03:12:58 localhost boring_mcnulty[71245]: "Has a FileSystem", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "Insufficient space (<5GB)" Nov 23 03:12:58 localhost boring_mcnulty[71245]: ], Nov 23 03:12:58 localhost boring_mcnulty[71245]: "sys_api": { Nov 23 03:12:58 localhost boring_mcnulty[71245]: "actuators": null, Nov 23 03:12:58 localhost boring_mcnulty[71245]: "device_nodes": "sr0", Nov 23 03:12:58 localhost boring_mcnulty[71245]: 
"human_readable_size": "482.00 KB", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "id_bus": "ata", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "model": "QEMU DVD-ROM", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "nr_requests": "2", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "partitions": {}, Nov 23 03:12:58 localhost boring_mcnulty[71245]: "path": "/dev/sr0", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "removable": "1", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "rev": "2.5+", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "ro": "0", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "rotational": "1", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "sas_address": "", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "sas_device_handle": "", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "scheduler_mode": "mq-deadline", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "sectors": 0, Nov 23 03:12:58 localhost boring_mcnulty[71245]: "sectorsize": "2048", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "size": 493568.0, Nov 23 03:12:58 localhost boring_mcnulty[71245]: "support_discard": "0", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "type": "disk", Nov 23 03:12:58 localhost boring_mcnulty[71245]: "vendor": "QEMU" Nov 23 03:12:58 localhost boring_mcnulty[71245]: } Nov 23 03:12:58 localhost boring_mcnulty[71245]: } Nov 23 03:12:58 localhost boring_mcnulty[71245]: ] Nov 23 03:12:58 localhost systemd[1]: libpod-c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c.scope: Deactivated successfully. 
Nov 23 03:12:58 localhost podman[71229]: 2025-11-23 08:12:58.86980309 +0000 UTC m=+1.086424053 container died c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mcnulty, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, release=553, vendor=Red Hat, Inc., distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.component=rhceph-container) Nov 23 03:12:58 localhost systemd[1]: var-lib-containers-storage-overlay-c085a9a9c0aff6988c158f605142398529d4fed48e4e5ddfd851d5d755c306dd-merged.mount: Deactivated successfully. 
Nov 23 03:12:58 localhost podman[73262]: 2025-11-23 08:12:58.944146425 +0000 UTC m=+0.068972503 container remove c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mcnulty, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, GIT_CLEAN=True, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=Guillaume Abrioux , release=553, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, ceph=True, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc.) Nov 23 03:12:58 localhost systemd[1]: libpod-conmon-c8e3cd10dd1ce476b3a9b862d3110dd3082591720868961c6d7fc78466afef9c.scope: Deactivated successfully. Nov 23 03:13:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. 
Nov 23 03:13:08 localhost podman[73292]: 2025-11-23 08:13:08.037149235 +0000 UTC m=+0.088617542 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, architecture=x86_64, container_name=collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:13:08 localhost podman[73292]: 2025-11-23 08:13:08.072825172 +0000 UTC m=+0.124293479 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12) Nov 23 03:13:08 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:13:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:13:08 localhost systemd[1]: tmp-crun.EI3ST2.mount: Deactivated successfully. 
Nov 23 03:13:08 localhost podman[73310]: 2025-11-23 08:13:08.202064821 +0000 UTC m=+0.088044575 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, container_name=iscsid, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, version=17.1.12, architecture=x86_64, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:13:08 localhost podman[73310]: 2025-11-23 08:13:08.23716244 +0000 UTC m=+0.123142164 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, container_name=iscsid, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1) Nov 23 03:13:08 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:13:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:13:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:13:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:13:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:13:12 localhost podman[73330]: 2025-11-23 08:13:12.038243945 +0000 UTC m=+0.089796708 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, version=17.1.12, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible) Nov 23 03:13:12 localhost podman[73329]: 2025-11-23 08:13:12.092796998 +0000 UTC m=+0.144947479 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4) Nov 23 03:13:12 localhost podman[73329]: 2025-11-23 08:13:12.105253657 +0000 UTC m=+0.157404078 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, vcs-type=git, version=17.1.12, release=1761123044, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=logrotate_crond, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:13:12 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:13:12 localhost podman[73331]: 2025-11-23 08:13:12.143209474 +0000 UTC m=+0.189790075 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, distribution-scope=public, vendor=Red Hat, Inc., 
batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4) Nov 23 03:13:12 localhost podman[73335]: 2025-11-23 08:13:12.200999925 +0000 UTC m=+0.245185503 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.buildah.version=1.41.4) Nov 23 03:13:12 localhost podman[73330]: 2025-11-23 08:13:12.217799817 +0000 UTC m=+0.269352580 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, url=https://www.redhat.com) Nov 23 
03:13:12 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:13:12 localhost podman[73331]: 2025-11-23 08:13:12.272537996 +0000 UTC m=+0.319118527 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 23 03:13:12 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:13:12 localhost podman[73335]: 2025-11-23 08:13:12.58735221 +0000 UTC m=+0.631537758 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute) Nov 23 03:13:12 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:13:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:13:15 localhost podman[73422]: 2025-11-23 08:13:15.031584374 +0000 UTC m=+0.083546387 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, container_name=metrics_qdr) Nov 23 03:13:15 localhost podman[73422]: 2025-11-23 08:13:15.235273132 +0000 UTC m=+0.287235085 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, batch=17.1_20251118.1, container_name=metrics_qdr, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 23 03:13:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:13:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:13:15 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:13:15 localhost podman[73452]: 2025-11-23 08:13:15.345962415 +0000 UTC m=+0.077638917 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=tripleo_ansible) Nov 23 03:13:15 localhost podman[73453]: 2025-11-23 08:13:15.32511949 +0000 UTC m=+0.056880275 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, managed_by=tripleo_ansible, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:13:15 localhost podman[73453]: 2025-11-23 08:13:15.409491262 +0000 UTC m=+0.141252107 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
container_name=ovn_controller, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, architecture=x86_64) Nov 23 03:13:15 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:13:15 localhost podman[73452]: 2025-11-23 08:13:15.423413466 +0000 UTC m=+0.155090018 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, container_name=ovn_metadata_agent, release=1761123044, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public) Nov 23 03:13:15 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:13:16 localhost systemd[1]: tmp-crun.mE9koz.mount: Deactivated successfully. Nov 23 03:13:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:13:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:13:39 localhost systemd[1]: tmp-crun.jPlrRA.mount: Deactivated successfully. 
Nov 23 03:13:39 localhost podman[73499]: 2025-11-23 08:13:39.012789413 +0000 UTC m=+0.075388909 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, release=1761123044, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.4) Nov 23 03:13:39 localhost systemd[1]: tmp-crun.5u5sJN.mount: Deactivated successfully. Nov 23 03:13:39 localhost podman[73500]: 2025-11-23 08:13:39.061270451 +0000 UTC m=+0.121581837 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, release=1761123044, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Nov 23 03:13:39 localhost podman[73500]: 2025-11-23 08:13:39.067440819 +0000 UTC m=+0.127752205 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:13:39 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:13:39 localhost podman[73499]: 2025-11-23 08:13:39.079795865 +0000 UTC m=+0.142395381 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:13:39 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:13:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:13:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:13:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:13:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:13:43 localhost podman[73538]: 2025-11-23 08:13:43.008643906 +0000 UTC m=+0.066386125 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, maintainer=OpenStack TripleO Team, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, io.openshift.expose-services=) Nov 23 03:13:43 localhost podman[73538]: 2025-11-23 08:13:43.044446686 +0000 UTC m=+0.102188975 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, release=1761123044) Nov 23 03:13:43 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:13:43 localhost podman[73539]: 2025-11-23 08:13:43.063239059 +0000 UTC m=+0.117382328 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, architecture=x86_64, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:13:43 localhost systemd[1]: tmp-crun.9SS82T.mount: Deactivated successfully. Nov 23 03:13:43 localhost podman[73540]: 2025-11-23 08:13:43.123666121 +0000 UTC m=+0.174537131 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:13:43 localhost podman[73540]: 2025-11-23 08:13:43.152288093 +0000 UTC m=+0.203159073 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:13:43 localhost podman[73546]: 2025-11-23 08:13:43.170533169 +0000 UTC m=+0.217427717 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044) Nov 23 03:13:43 localhost podman[73539]: 2025-11-23 08:13:43.193710956 +0000 UTC m=+0.247854285 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:13:43 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:13:43 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:13:43 localhost podman[73546]: 2025-11-23 08:13:43.514157292 +0000 UTC m=+0.561051810 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, distribution-scope=public, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, 
description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:13:43 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:13:46 localhost systemd[1]: tmp-crun.VRNBhO.mount: Deactivated successfully. 
Nov 23 03:13:46 localhost podman[73630]: 2025-11-23 08:13:46.028146351 +0000 UTC m=+0.084623700 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc.) Nov 23 03:13:46 localhost systemd[1]: tmp-crun.9leR8K.mount: Deactivated successfully. 
Nov 23 03:13:46 localhost podman[73629]: 2025-11-23 08:13:46.081276381 +0000 UTC m=+0.140481973 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, container_name=metrics_qdr, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 23 03:13:46 localhost podman[73631]: 2025-11-23 08:13:46.130153821 +0000 UTC m=+0.183303988 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, version=17.1.12, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:13:46 localhost podman[73630]: 2025-11-23 08:13:46.158647829 +0000 UTC m=+0.215125167 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, release=1761123044, container_name=ovn_metadata_agent, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:13:46 localhost systemd[1]: 
5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:13:46 localhost podman[73631]: 2025-11-23 08:13:46.213546373 +0000 UTC m=+0.266696600 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
build-date=2025-11-18T23:34:05Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=) Nov 23 03:13:46 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:13:46 localhost podman[73629]: 2025-11-23 08:13:46.273288953 +0000 UTC m=+0.332494605 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public) Nov 23 03:13:46 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:14:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:14:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:14:10 localhost podman[73780]: 2025-11-23 08:14:10.029662266 +0000 UTC m=+0.087982513 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, vcs-type=git) Nov 23 03:14:10 localhost podman[73780]: 2025-11-23 08:14:10.039104073 +0000 UTC m=+0.097424280 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step3, version=17.1.12, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd) Nov 23 03:14:10 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:14:10 localhost systemd[1]: tmp-crun.pZxztP.mount: Deactivated successfully. 
Nov 23 03:14:10 localhost podman[73781]: 2025-11-23 08:14:10.095035862 +0000 UTC m=+0.149918645 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-iscsid, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, container_name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 23 03:14:10 localhost podman[73781]: 2025-11-23 08:14:10.106142309 +0000 UTC m=+0.161025032 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044) Nov 23 03:14:10 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:14:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:14:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:14:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:14:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:14:14 localhost systemd[1]: tmp-crun.imXMqm.mount: Deactivated successfully. Nov 23 03:14:14 localhost podman[73824]: 2025-11-23 08:14:14.05308901 +0000 UTC m=+0.103158484 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:14:14 localhost podman[73820]: 2025-11-23 08:14:14.012770696 +0000 UTC m=+0.069841773 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, vcs-type=git) Nov 23 03:14:14 localhost podman[73820]: 2025-11-23 08:14:14.095135547 +0000 UTC m=+0.152206584 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible) Nov 23 03:14:14 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:14:14 localhost podman[73821]: 2025-11-23 08:14:14.142322821 +0000 UTC m=+0.194308433 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z) Nov 23 03:14:14 localhost podman[73822]: 2025-11-23 08:14:14.18607091 +0000 UTC m=+0.236624319 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, release=1761123044, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc.) 
Nov 23 03:14:14 localhost podman[73821]: 2025-11-23 08:14:14.201751355 +0000 UTC m=+0.253736957 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=) Nov 23 03:14:14 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:14:14 localhost podman[73822]: 2025-11-23 08:14:14.241016058 +0000 UTC m=+0.291569447 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 23 03:14:14 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:14:14 localhost podman[73824]: 2025-11-23 08:14:14.407678021 +0000 UTC m=+0.457747495 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible) Nov 23 03:14:14 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:14:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:14:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:14:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:14:17 localhost systemd[1]: tmp-crun.w1lMZb.mount: Deactivated successfully. 
Nov 23 03:14:17 localhost podman[73913]: 2025-11-23 08:14:17.025971005 +0000 UTC m=+0.085108877 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1) Nov 23 03:14:17 localhost podman[73915]: 2025-11-23 08:14:17.070737194 +0000 UTC m=+0.124284576 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, vcs-type=git, container_name=ovn_controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': 
'/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 23 03:14:17 localhost podman[73915]: 2025-11-23 08:14:17.091130224 +0000 UTC m=+0.144677616 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, release=1761123044, tcib_managed=true, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64, container_name=ovn_controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team) Nov 23 03:14:17 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:14:17 localhost podman[73914]: 2025-11-23 08:14:17.175193977 +0000 UTC m=+0.230984728 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, version=17.1.12) Nov 23 03:14:17 localhost podman[73914]: 2025-11-23 08:14:17.218336817 +0000 UTC m=+0.274127568 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:14:17 localhost podman[73913]: 2025-11-23 08:14:17.229231918 +0000 UTC m=+0.288369770 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, config_id=tripleo_step1, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.openshift.expose-services=, io.buildah.version=1.41.4) Nov 23 03:14:17 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:14:17 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:14:18 localhost systemd[1]: tmp-crun.HvtjGL.mount: Deactivated successfully. Nov 23 03:14:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:14:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:14:41 localhost podman[73987]: 2025-11-23 08:14:41.025924192 +0000 UTC m=+0.077760013 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid) Nov 23 03:14:41 localhost podman[73987]: 2025-11-23 08:14:41.03605793 +0000 UTC m=+0.087893741 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 23 03:14:41 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:14:41 localhost systemd[1]: tmp-crun.vnfKNZ.mount: Deactivated successfully. 
Nov 23 03:14:41 localhost podman[73986]: 2025-11-23 08:14:41.084004717 +0000 UTC m=+0.139033515 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z) Nov 23 03:14:41 localhost podman[73986]: 2025-11-23 08:14:41.09924836 +0000 UTC m=+0.154277158 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team)
Nov 23 03:14:41 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully.
Nov 23 03:14:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:14:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 03:14:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 03:14:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 03:14:45 localhost systemd[1]: tmp-crun.H5U0qI.mount: Deactivated successfully.
Nov 23 03:14:45 localhost podman[74028]: 2025-11-23 08:14:45.039539498 +0000 UTC m=+0.090936122 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 23 03:14:45 localhost systemd[1]: tmp-crun.qjPP1y.mount: Deactivated successfully. 
Nov 23 03:14:45 localhost podman[74026]: 2025-11-23 08:14:45.077702548 +0000 UTC m=+0.136165637 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-18T22:49:32Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vcs-type=git, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 23 03:14:45 localhost podman[74028]: 2025-11-23 08:14:45.096280293 +0000 UTC m=+0.147676947 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Nov 23 03:14:45 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:14:45 localhost podman[74026]: 2025-11-23 08:14:45.114415823 +0000 UTC m=+0.172878902 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-cron, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Nov 23 03:14:45 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully.
Nov 23 03:14:45 localhost podman[74027]: 2025-11-23 08:14:45.183178612 +0000 UTC m=+0.239460225 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Nov 23 03:14:45 localhost podman[74030]: 2025-11-23 08:14:45.229354024 +0000 UTC m=+0.278257623 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, 
maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 23 03:14:45 localhost podman[74027]: 2025-11-23 08:14:45.237198553 +0000 UTC m=+0.293480226 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com)
Nov 23 03:14:45 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully.
Nov 23 03:14:45 localhost podman[74030]: 2025-11-23 08:14:45.617885777 +0000 UTC m=+0.666789376 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-nova-compute-container)
Nov 23 03:14:45 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully.
Nov 23 03:14:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:14:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.
Nov 23 03:14:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.
Nov 23 03:14:48 localhost systemd[1]: tmp-crun.tCQO8K.mount: Deactivated successfully.
Nov 23 03:14:48 localhost podman[74118]: 2025-11-23 08:14:48.053673684 +0000 UTC m=+0.110383753 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public)
Nov 23 03:14:48 localhost podman[74119]: 2025-11-23 08:14:48.003864482 +0000 UTC m=+0.057886539 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64)
Nov 23 03:14:48 localhost podman[74117]: 2025-11-23 08:14:48.032413359 +0000 UTC m=+0.088360145 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=metrics_qdr, distribution-scope=public, 
konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-type=git, version=17.1.12) Nov 23 03:14:48 localhost podman[74118]: 2025-11-23 08:14:48.089591215 +0000 UTC m=+0.146301284 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 23 03:14:48 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:14:48 localhost podman[74119]: 2025-11-23 08:14:48.138218043 +0000 UTC m=+0.192240100 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64) Nov 23 03:14:48 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:14:48 localhost podman[74117]: 2025-11-23 08:14:48.215027276 +0000 UTC m=+0.270974152 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, container_name=metrics_qdr, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git) Nov 23 03:14:48 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:15:02 localhost podman[74287]: 2025-11-23 08:15:02.733780393 +0000 UTC m=+0.086467227 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 03:15:02 localhost podman[74287]: 2025-11-23 08:15:02.845318902 +0000 UTC m=+0.198005716 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, 
io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, release=553, architecture=x86_64, vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=) Nov 23 03:15:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:15:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:15:12 localhost podman[74433]: 2025-11-23 08:15:12.040236714 +0000 UTC m=+0.092667836 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step3, distribution-scope=public, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:15:12 localhost podman[74433]: 2025-11-23 08:15:12.075739532 +0000 UTC m=+0.128170644 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, 
url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:15:12 localhost podman[74432]: 2025-11-23 08:15:12.085175679 +0000 UTC m=+0.137120786 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step3, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, version=17.1.12, container_name=collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:15:12 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:15:12 localhost podman[74432]: 2025-11-23 08:15:12.096255326 +0000 UTC m=+0.148200423 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:15:12 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:15:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:15:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:15:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:15:15 localhost systemd[1]: tmp-crun.PvSLM6.mount: Deactivated successfully. 
Nov 23 03:15:15 localhost podman[74517]: 2025-11-23 08:15:15.480239527 +0000 UTC m=+0.096176223 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:15:15 localhost podman[74517]: 2025-11-23 08:15:15.518136118 +0000 UTC m=+0.134072834 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:15:15 localhost podman[74518]: 2025-11-23 08:15:15.532463403 +0000 UTC m=+0.144077307 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, 
managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:15:15 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:15:15 localhost python3[74515]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:15:15 localhost podman[74516]: 2025-11-23 08:15:15.580620166 +0000 UTC m=+0.196561772 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.openshift.expose-services=) Nov 23 03:15:15 localhost podman[74516]: 2025-11-23 08:15:15.587875736 +0000 UTC m=+0.203817302 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, 
io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc.) Nov 23 03:15:15 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:15:15 localhost podman[74518]: 2025-11-23 08:15:15.642616869 +0000 UTC m=+0.254230783 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, version=17.1.12, architecture=x86_64) Nov 23 03:15:15 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:15:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:15:15 localhost podman[74611]: 2025-11-23 08:15:15.745752832 +0000 UTC m=+0.066425859 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:15:15 localhost python3[74649]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885715.2035592-113720-120016200137482/source _original_basename=tmpdq5n753y follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None 
directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:15:16 localhost podman[74611]: 2025-11-23 08:15:16.143443583 +0000 UTC m=+0.464116640 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, 
tcib_managed=true, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:15:16 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:15:16 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:15:16 localhost recover_tripleo_nova_virtqemud[74668]: 61756 Nov 23 03:15:16 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:15:16 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:15:16 localhost python3[74684]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:15:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:15:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:15:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:15:18 localhost podman[74858]: 2025-11-23 08:15:18.274964778 +0000 UTC m=+0.078296839 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, 
config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git) Nov 23 03:15:18 localhost podman[74857]: 2025-11-23 08:15:18.256708914 +0000 UTC m=+0.066538362 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:15:18 localhost podman[74893]: 2025-11-23 08:15:18.339394295 +0000 UTC m=+0.058915830 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:15:18 localhost podman[74857]: 2025-11-23 08:15:18.340225231 +0000 UTC m=+0.150054679 container exec_died 
5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12) Nov 23 03:15:18 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:15:18 localhost podman[74858]: 2025-11-23 08:15:18.367992815 +0000 UTC m=+0.171324866 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:15:18 localhost ansible-async_wrapper.py[74856]: Invoked with 721452772396 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885717.8290527-113848-201341338781547/AnsiballZ_command.py _ Nov 23 03:15:18 localhost ansible-async_wrapper.py[74935]: Starting module and watcher Nov 23 03:15:18 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:15:18 localhost ansible-async_wrapper.py[74935]: Start watching 74936 (3600) Nov 23 03:15:18 localhost ansible-async_wrapper.py[74936]: Start module (74936) Nov 23 03:15:18 localhost ansible-async_wrapper.py[74856]: Return async_wrapper task started. Nov 23 03:15:18 localhost podman[74893]: 2025-11-23 08:15:18.543189466 +0000 UTC m=+0.262710991 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git) Nov 23 03:15:18 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:15:18 localhost python3[74956]: ansible-ansible.legacy.async_status Invoked with jid=721452772396.74856 mode=status _async_dir=/tmp/.ansible_async Nov 23 03:15:21 localhost puppet-user[74948]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Nov 23 03:15:21 localhost puppet-user[74948]: (file: /etc/puppet/hiera.yaml) Nov 23 03:15:21 localhost puppet-user[74948]: Warning: Undefined variable '::deploy_config_name'; Nov 23 03:15:21 localhost puppet-user[74948]: (file & line not available) Nov 23 03:15:21 localhost puppet-user[74948]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 23 03:15:21 localhost puppet-user[74948]: (file & line not available) Nov 23 03:15:21 localhost puppet-user[74948]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Nov 23 03:15:22 localhost puppet-user[74948]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 23 03:15:22 localhost puppet-user[74948]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 23 03:15:22 localhost puppet-user[74948]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 23 03:15:22 localhost puppet-user[74948]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 23 03:15:22 localhost puppet-user[74948]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 23 03:15:22 localhost puppet-user[74948]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 23 03:15:22 localhost puppet-user[74948]: with Stdlib::Compat::Array. 
There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 23 03:15:22 localhost puppet-user[74948]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 23 03:15:22 localhost puppet-user[74948]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 23 03:15:22 localhost puppet-user[74948]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 23 03:15:22 localhost puppet-user[74948]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 23 03:15:22 localhost puppet-user[74948]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 23 03:15:22 localhost puppet-user[74948]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 23 03:15:22 localhost puppet-user[74948]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 23 03:15:22 localhost puppet-user[74948]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 23 03:15:22 localhost puppet-user[74948]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 23 03:15:22 localhost puppet-user[74948]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 23 03:15:22 localhost puppet-user[74948]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Nov 23 03:15:22 localhost puppet-user[74948]: Notice: Compiled catalog for np0005532585.localdomain in environment production in 0.22 seconds Nov 23 03:15:22 localhost puppet-user[74948]: Notice: Applied catalog in 0.22 seconds Nov 23 03:15:22 localhost puppet-user[74948]: Application: Nov 23 03:15:22 localhost puppet-user[74948]: Initial environment: production Nov 23 03:15:22 localhost puppet-user[74948]: Converged environment: production Nov 23 03:15:22 localhost puppet-user[74948]: Run mode: user Nov 23 03:15:22 localhost puppet-user[74948]: Changes: Nov 23 03:15:22 localhost puppet-user[74948]: Events: Nov 23 03:15:22 localhost puppet-user[74948]: Resources: Nov 23 03:15:22 localhost puppet-user[74948]: Total: 19 Nov 23 03:15:22 localhost puppet-user[74948]: Time: Nov 23 03:15:22 localhost puppet-user[74948]: Schedule: 0.00 Nov 23 03:15:22 localhost puppet-user[74948]: Package: 0.00 Nov 23 03:15:22 localhost puppet-user[74948]: Exec: 0.00 Nov 23 03:15:22 localhost puppet-user[74948]: Augeas: 0.01 Nov 23 03:15:22 localhost puppet-user[74948]: File: 0.02 Nov 23 03:15:22 localhost puppet-user[74948]: Service: 0.04 Nov 23 03:15:22 localhost puppet-user[74948]: Transaction evaluation: 0.21 Nov 23 03:15:22 localhost puppet-user[74948]: Catalog application: 0.22 Nov 23 03:15:22 localhost puppet-user[74948]: Config retrieval: 0.29 Nov 23 03:15:22 localhost puppet-user[74948]: Last run: 1763885722 Nov 23 03:15:22 localhost puppet-user[74948]: Filebucket: 0.00 Nov 23 03:15:22 localhost puppet-user[74948]: Total: 0.22 Nov 23 03:15:22 localhost puppet-user[74948]: Version: Nov 23 03:15:22 localhost puppet-user[74948]: Config: 1763885721 Nov 23 03:15:22 localhost puppet-user[74948]: Puppet: 7.10.0 Nov 23 03:15:22 localhost ansible-async_wrapper.py[74936]: Module complete (74936) Nov 23 03:15:23 localhost ansible-async_wrapper.py[74935]: Done in kid B. 
Nov 23 03:15:29 localhost python3[75094]: ansible-ansible.legacy.async_status Invoked with jid=721452772396.74856 mode=status _async_dir=/tmp/.ansible_async Nov 23 03:15:29 localhost python3[75110]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 23 03:15:30 localhost python3[75126]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:15:30 localhost python3[75176]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:15:30 localhost python3[75194]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp98se98qh recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 23 03:15:31 localhost python3[75224]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Nov 23 03:15:32 localhost python3[75329]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Nov 23 03:15:33 localhost python3[75348]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:15:33 localhost python3[75380]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 23 03:15:34 localhost python3[75430]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:15:34 localhost python3[75448]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None 
seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:15:35 localhost python3[75510]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:15:35 localhost python3[75528]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:15:36 localhost python3[75590]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:15:36 localhost python3[75608]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:15:37 localhost python3[75670]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:15:37 localhost python3[75688]: ansible-ansible.legacy.file Invoked with 
mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:15:37 localhost python3[75718]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:15:37 localhost systemd[1]: Reloading. Nov 23 03:15:37 localhost systemd-rc-local-generator[75738]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:15:37 localhost systemd-sysv-generator[75742]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:15:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 03:15:39 localhost python3[75804]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:15:39 localhost python3[75822]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:15:40 localhost python3[75884]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 23 03:15:40 localhost python3[75902]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:15:41 localhost python3[75932]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:15:41 localhost systemd[1]: Reloading. Nov 23 03:15:41 localhost systemd-sysv-generator[75959]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:15:41 localhost systemd-rc-local-generator[75954]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:15:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:15:41 localhost systemd[1]: Starting Create netns directory... Nov 23 03:15:41 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 23 03:15:41 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 23 03:15:41 localhost systemd[1]: Finished Create netns directory. Nov 23 03:15:41 localhost python3[75989]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Nov 23 03:15:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:15:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:15:42 localhost podman[76006]: 2025-11-23 08:15:42.49005658 +0000 UTC m=+0.077626638 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12) Nov 23 03:15:42 localhost podman[76006]: 2025-11-23 08:15:42.506213991 +0000 UTC m=+0.093784069 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, release=1761123044, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible) Nov 23 03:15:42 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:15:42 localhost systemd[1]: tmp-crun.62CGE8.mount: Deactivated successfully. 
Nov 23 03:15:42 localhost podman[76007]: 2025-11-23 08:15:42.552944691 +0000 UTC m=+0.138594911 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Nov 23 03:15:42 localhost podman[76007]: 2025-11-23 08:15:42.563208822 +0000 UTC m=+0.148859042 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, architecture=x86_64, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12) Nov 23 03:15:42 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:15:43 localhost python3[76083]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Nov 23 03:15:44 localhost podman[76122]: 2025-11-23 08:15:44.105774019 +0000 UTC m=+0.089074167 container create e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044) Nov 23 03:15:44 localhost systemd[1]: Started libpod-conmon-e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.scope. Nov 23 03:15:44 localhost podman[76122]: 2025-11-23 08:15:44.058983227 +0000 UTC m=+0.042283405 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 23 03:15:44 localhost systemd[1]: Started libcrun container. 
Nov 23 03:15:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8ecc5a7d0819463bef271e161d4bdd609525e6c41d5f8ce9ddcd57558c51829/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 03:15:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8ecc5a7d0819463bef271e161d4bdd609525e6c41d5f8ce9ddcd57558c51829/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Nov 23 03:15:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8ecc5a7d0819463bef271e161d4bdd609525e6c41d5f8ce9ddcd57558c51829/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 23 03:15:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8ecc5a7d0819463bef271e161d4bdd609525e6c41d5f8ce9ddcd57558c51829/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 23 03:15:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8ecc5a7d0819463bef271e161d4bdd609525e6c41d5f8ce9ddcd57558c51829/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 23 03:15:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:15:44 localhost podman[76122]: 2025-11-23 08:15:44.204123656 +0000 UTC m=+0.187423834 container init e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git) Nov 23 03:15:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:15:44 localhost podman[76122]: 2025-11-23 08:15:44.237859991 +0000 UTC m=+0.221160149 container start e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, vcs-type=git, name=rhosp17/openstack-nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step5, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Nov 23 03:15:44 localhost systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring. 
Nov 23 03:15:44 localhost python3[76083]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43 --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 23 03:15:44 localhost systemd[1]: Created slice User Slice of UID 0. Nov 23 03:15:44 localhost systemd[1]: Starting User Runtime Directory /run/user/0... 
Nov 23 03:15:44 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Nov 23 03:15:44 localhost systemd[1]: Starting User Manager for UID 0... Nov 23 03:15:44 localhost podman[76144]: 2025-11-23 08:15:44.334229049 +0000 UTC m=+0.086181259 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, container_name=nova_compute, vendor=Red Hat, Inc.) 
Nov 23 03:15:44 localhost podman[76144]: 2025-11-23 08:15:44.374215093 +0000 UTC m=+0.126167293 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, config_id=tripleo_step5, container_name=nova_compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git) Nov 23 03:15:44 localhost podman[76144]: unhealthy Nov 23 03:15:44 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:15:44 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'. Nov 23 03:15:44 localhost systemd[76168]: Queued start job for default target Main User Target. Nov 23 03:15:44 localhost systemd[76168]: Created slice User Application Slice. 
Nov 23 03:15:44 localhost systemd[76168]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Nov 23 03:15:44 localhost systemd[76168]: Started Daily Cleanup of User's Temporary Directories. Nov 23 03:15:44 localhost systemd[76168]: Reached target Paths. Nov 23 03:15:44 localhost systemd[76168]: Reached target Timers. Nov 23 03:15:44 localhost systemd[76168]: Starting D-Bus User Message Bus Socket... Nov 23 03:15:44 localhost systemd[76168]: Starting Create User's Volatile Files and Directories... Nov 23 03:15:44 localhost systemd[76168]: Finished Create User's Volatile Files and Directories. Nov 23 03:15:44 localhost systemd[76168]: Listening on D-Bus User Message Bus Socket. Nov 23 03:15:44 localhost systemd[76168]: Reached target Sockets. Nov 23 03:15:44 localhost systemd[76168]: Reached target Basic System. Nov 23 03:15:44 localhost systemd[76168]: Reached target Main User Target. Nov 23 03:15:44 localhost systemd[76168]: Startup finished in 133ms. Nov 23 03:15:44 localhost systemd[1]: Started User Manager for UID 0. Nov 23 03:15:44 localhost systemd[1]: Started Session c10 of User root. Nov 23 03:15:44 localhost systemd[1]: session-c10.scope: Deactivated successfully. 
Nov 23 03:15:44 localhost podman[76245]: 2025-11-23 08:15:44.67172578 +0000 UTC m=+0.070697608 container create 0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_wait_for_compute_service) Nov 23 03:15:44 localhost systemd[1]: Started libpod-conmon-0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8.scope. Nov 23 03:15:44 localhost podman[76245]: 2025-11-23 08:15:44.629786146 +0000 UTC m=+0.028757954 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 23 03:15:44 localhost systemd[1]: Started libcrun container. Nov 23 03:15:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82705214f78b7282f0a9b76165b6c261dc19fa9792e62f606d74575b20153483/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Nov 23 03:15:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82705214f78b7282f0a9b76165b6c261dc19fa9792e62f606d74575b20153483/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Nov 23 03:15:44 localhost podman[76245]: 2025-11-23 08:15:44.748773471 +0000 UTC m=+0.147745309 container init 0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, tcib_managed=true, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, managed_by=tripleo_ansible, container_name=nova_wait_for_compute_service, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 
17.1_20251118.1, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team) Nov 23 03:15:44 localhost podman[76245]: 2025-11-23 08:15:44.759536327 +0000 UTC m=+0.158508155 container start 0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_id=tripleo_step5, container_name=nova_wait_for_compute_service, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', 
'/var/lib/container-config-scripts:/container-config-scripts']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, maintainer=OpenStack TripleO Team) Nov 23 03:15:44 localhost podman[76245]: 2025-11-23 08:15:44.759823287 +0000 UTC m=+0.158795155 container attach 0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=nova_wait_for_compute_service, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true) Nov 23 03:15:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:15:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:15:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:15:46 localhost systemd[1]: tmp-crun.b5wln3.mount: Deactivated successfully. 
Nov 23 03:15:46 localhost podman[76269]: 2025-11-23 08:15:46.034230367 +0000 UTC m=+0.088550610 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:15:46 localhost podman[76268]: 2025-11-23 08:15:46.088561608 +0000 UTC m=+0.142921833 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T22:49:32Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64) Nov 23 03:15:46 localhost podman[76268]: 2025-11-23 08:15:46.095063525 +0000 UTC m=+0.149423690 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team) Nov 23 03:15:46 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:15:46 localhost systemd[1]: tmp-crun.14FYvA.mount: Deactivated successfully. Nov 23 03:15:46 localhost podman[76270]: 2025-11-23 08:15:46.125300514 +0000 UTC m=+0.177006428 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:15:46 localhost podman[76269]: 2025-11-23 08:15:46.140745782 +0000 UTC m=+0.195066015 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, 
version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute) Nov 23 03:15:46 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:15:46 localhost podman[76270]: 2025-11-23 08:15:46.175280512 +0000 UTC m=+0.226986446 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:15:46 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:15:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:15:46 localhost podman[76337]: 2025-11-23 08:15:46.254432206 +0000 UTC m=+0.060203879 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step4, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64) Nov 23 03:15:46 localhost podman[76337]: 2025-11-23 08:15:46.64631157 +0000 UTC m=+0.452083273 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:15:46 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:15:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:15:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:15:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:15:49 localhost systemd[1]: tmp-crun.YPPUfr.mount: Deactivated successfully. Nov 23 03:15:49 localhost podman[76360]: 2025-11-23 08:15:49.024084677 +0000 UTC m=+0.084360094 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=metrics_qdr) Nov 23 03:15:49 localhost systemd[1]: tmp-crun.wlHVKd.mount: Deactivated successfully. 
Nov 23 03:15:49 localhost podman[76361]: 2025-11-23 08:15:49.083432829 +0000 UTC m=+0.139139668 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:15:49 localhost podman[76362]: 2025-11-23 08:15:49.125032983 +0000 UTC m=+0.176758170 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, 
distribution-scope=public, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true) Nov 23 03:15:49 localhost podman[76361]: 2025-11-23 08:15:49.140187963 +0000 UTC m=+0.195894812 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:14:25Z, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4) Nov 23 03:15:49 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:15:49 localhost podman[76362]: 2025-11-23 08:15:49.180659333 +0000 UTC m=+0.232384540 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Nov 23 03:15:49 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:15:49 localhost podman[76360]: 2025-11-23 08:15:49.232638741 +0000 UTC m=+0.292914118 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044) Nov 23 03:15:49 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:15:54 localhost systemd[1]: Stopping User Manager for UID 0... Nov 23 03:15:54 localhost systemd[76168]: Activating special unit Exit the Session... Nov 23 03:15:54 localhost systemd[76168]: Stopped target Main User Target. Nov 23 03:15:54 localhost systemd[76168]: Stopped target Basic System. Nov 23 03:15:54 localhost systemd[76168]: Stopped target Paths. Nov 23 03:15:54 localhost systemd[76168]: Stopped target Sockets. 
Nov 23 03:15:54 localhost systemd[76168]: Stopped target Timers. Nov 23 03:15:54 localhost systemd[76168]: Stopped Daily Cleanup of User's Temporary Directories. Nov 23 03:15:54 localhost systemd[76168]: Closed D-Bus User Message Bus Socket. Nov 23 03:15:54 localhost systemd[76168]: Stopped Create User's Volatile Files and Directories. Nov 23 03:15:54 localhost systemd[76168]: Removed slice User Application Slice. Nov 23 03:15:54 localhost systemd[76168]: Reached target Shutdown. Nov 23 03:15:54 localhost systemd[76168]: Finished Exit the Session. Nov 23 03:15:54 localhost systemd[76168]: Reached target Exit the Session. Nov 23 03:15:54 localhost systemd[1]: user@0.service: Deactivated successfully. Nov 23 03:15:54 localhost systemd[1]: Stopped User Manager for UID 0. Nov 23 03:15:54 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Nov 23 03:15:54 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Nov 23 03:15:54 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Nov 23 03:15:54 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Nov 23 03:15:54 localhost systemd[1]: Removed slice User Slice of UID 0. Nov 23 03:16:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:16:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:16:13 localhost podman[76510]: 2025-11-23 08:16:13.011345777 +0000 UTC m=+0.069569893 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:16:13 localhost podman[76510]: 2025-11-23 08:16:13.017813575 +0000 UTC m=+0.076037701 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:16:13 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:16:13 localhost podman[76509]: 2025-11-23 08:16:13.064076429 +0000 UTC m=+0.121836781 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:16:13 localhost podman[76509]: 2025-11-23 08:16:13.10228908 +0000 UTC m=+0.160049382 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, container_name=collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12) Nov 23 03:16:13 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:16:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:16:15 localhost podman[76548]: 2025-11-23 08:16:15.022046654 +0000 UTC m=+0.075738562 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, container_name=nova_compute, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Nov 23 03:16:15 localhost systemd[1]: tmp-crun.sbFxrJ.mount: Deactivated successfully. 
Nov 23 03:16:15 localhost podman[76548]: 2025-11-23 08:16:15.054712487 +0000 UTC m=+0.108404355 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-type=git, version=17.1.12, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:16:15 localhost podman[76548]: unhealthy Nov 23 03:16:15 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:16:15 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'. Nov 23 03:16:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:16:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. 
Nov 23 03:16:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:16:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:16:16 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:16:16 localhost recover_tripleo_nova_virtqemud[76594]: 61756 Nov 23 03:16:16 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:16:16 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:16:17 localhost systemd[1]: tmp-crun.QzuSQG.mount: Deactivated successfully. Nov 23 03:16:17 localhost podman[76570]: 2025-11-23 08:16:17.026694717 +0000 UTC m=+0.088541261 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': 
{'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 23 03:16:17 localhost podman[76570]: 2025-11-23 08:16:17.032840744 +0000 UTC m=+0.094687328 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, release=1761123044, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, container_name=logrotate_crond, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible) Nov 23 03:16:17 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:16:17 localhost podman[76576]: 2025-11-23 08:16:17.085607197 +0000 UTC m=+0.135908960 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Nov 23 03:16:17 localhost podman[76571]: 2025-11-23 08:16:17.050731377 +0000 UTC m=+0.105651760 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044) Nov 23 03:16:17 localhost podman[76577]: 2025-11-23 08:16:17.132730348 +0000 UTC m=+0.180618508 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:16:17 localhost podman[76576]: 2025-11-23 08:16:17.164970667 +0000 UTC m=+0.215272430 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:16:17 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:16:17 localhost podman[76571]: 2025-11-23 08:16:17.180997734 +0000 UTC m=+0.235918147 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4) Nov 23 03:16:17 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:16:17 localhost podman[76577]: 2025-11-23 08:16:17.459308288 +0000 UTC m=+0.507196438 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, url=https://www.redhat.com) Nov 23 03:16:17 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:16:18 localhost systemd[1]: tmp-crun.1mZ5ze.mount: Deactivated successfully. Nov 23 03:16:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:16:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:16:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:16:20 localhost podman[76671]: 2025-11-23 08:16:20.015110891 +0000 UTC m=+0.077374510 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1) Nov 23 03:16:20 localhost podman[76672]: 2025-11-23 08:16:20.030699715 +0000 UTC m=+0.084190128 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044) Nov 23 03:16:20 localhost podman[76673]: 2025-11-23 08:16:20.075230768 +0000 UTC m=+0.128286248 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) 
Nov 23 03:16:20 localhost podman[76672]: 2025-11-23 08:16:20.110071636 +0000 UTC m=+0.163562059 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:16:20 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:16:20 localhost podman[76673]: 2025-11-23 08:16:20.130247309 +0000 UTC m=+0.183302799 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:16:20 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:16:20 localhost podman[76671]: 2025-11-23 08:16:20.235465485 +0000 UTC m=+0.297729064 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, tcib_managed=true) Nov 23 03:16:20 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:16:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:16:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:16:44 localhost systemd[1]: tmp-crun.a8Td3U.mount: Deactivated successfully. 
Nov 23 03:16:44 localhost podman[76749]: 2025-11-23 08:16:44.030193995 +0000 UTC m=+0.082242379 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
managed_by=tripleo_ansible, version=17.1.12, release=1761123044, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20251118.1, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T23:44:13Z, vcs-type=git) Nov 23 03:16:44 localhost podman[76749]: 2025-11-23 08:16:44.040248411 +0000 UTC m=+0.092296795 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:16:44 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:16:44 localhost podman[76748]: 2025-11-23 08:16:44.127370017 +0000 UTC m=+0.179837834 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:16:44 localhost podman[76748]: 2025-11-23 08:16:44.141643121 +0000 UTC m=+0.194110928 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3) Nov 23 03:16:44 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:16:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:16:46 localhost podman[76786]: 2025-11-23 08:16:46.019621445 +0000 UTC m=+0.081689102 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, name=rhosp17/openstack-nova-compute, vcs-type=git, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, version=17.1.12) Nov 23 03:16:46 localhost podman[76786]: 2025-11-23 08:16:46.105203635 +0000 UTC m=+0.167271352 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=nova_compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 23 03:16:46 localhost podman[76786]: unhealthy
Nov 23 03:16:46 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 03:16:46 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'.
Nov 23 03:16:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.
Nov 23 03:16:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.
Nov 23 03:16:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.
Nov 23 03:16:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.
Nov 23 03:16:48 localhost podman[76812]: 2025-11-23 08:16:48.011302864 +0000 UTC m=+0.058099416 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:16:48 localhost podman[76811]: 2025-11-23 08:16:48.07374727 +0000 UTC m=+0.119438248 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public) Nov 23 03:16:48 localhost podman[76810]: 2025-11-23 08:16:48.047294927 +0000 UTC m=+0.093879503 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
container_name=ceilometer_agent_compute) Nov 23 03:16:48 localhost podman[76811]: 2025-11-23 08:16:48.102238627 +0000 UTC m=+0.147929565 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi) Nov 23 03:16:48 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:16:48 localhost podman[76810]: 2025-11-23 08:16:48.130173355 +0000 UTC m=+0.176757931 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12) Nov 23 03:16:48 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:16:48 localhost podman[76809]: 2025-11-23 08:16:48.143719676 +0000 UTC m=+0.193853809 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, container_name=logrotate_crond, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:16:48 localhost podman[76809]: 2025-11-23 08:16:48.173995586 +0000 UTC m=+0.224129699 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-cron-container, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team) Nov 23 03:16:48 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:16:48 localhost podman[76812]: 2025-11-23 08:16:48.371468475 +0000 UTC m=+0.418265017 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., 
config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=nova_migration_target, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container) Nov 23 03:16:48 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:16:49 localhost systemd[1]: tmp-crun.v0QURz.mount: Deactivated successfully. Nov 23 03:16:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:16:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:16:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:16:51 localhost systemd[1]: tmp-crun.6Un2ST.mount: Deactivated successfully. 
Nov 23 03:16:51 localhost podman[76897]: 2025-11-23 08:16:51.032099282 +0000 UTC m=+0.090618883 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, tcib_managed=true) Nov 23 03:16:51 localhost systemd[1]: tmp-crun.wdfeKE.mount: Deactivated successfully. Nov 23 03:16:51 localhost podman[76899]: 2025-11-23 08:16:51.083637668 +0000 UTC m=+0.138618412 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, batch=17.1_20251118.1) Nov 23 03:16:51 localhost podman[76898]: 2025-11-23 08:16:51.131467981 +0000 UTC m=+0.187162386 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:14:25Z, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent) Nov 23 03:16:51 localhost podman[76899]: 2025-11-23 08:16:51.161296818 +0000 UTC m=+0.216277582 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.openshift.expose-services=, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com) Nov 23 03:16:51 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:16:51 localhost podman[76898]: 2025-11-23 08:16:51.177979744 +0000 UTC m=+0.233674219 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044) Nov 23 03:16:51 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:16:51 localhost podman[76897]: 2025-11-23 08:16:51.256682725 +0000 UTC m=+0.315202366 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
build-date=2025-11-18T22:49:46Z, tcib_managed=true, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044) Nov 23 03:16:51 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:16:59 localhost sshd[76971]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:17:00 localhost systemd[1]: session-27.scope: Deactivated successfully. Nov 23 03:17:00 localhost systemd[1]: session-27.scope: Consumed 2.993s CPU time. Nov 23 03:17:00 localhost systemd-logind[761]: Session 27 logged out. Waiting for processes to exit. Nov 23 03:17:00 localhost systemd-logind[761]: Removed session 27. Nov 23 03:17:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:17:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:17:15 localhost podman[77049]: 2025-11-23 08:17:15.026492213 +0000 UTC m=+0.081377483 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:17:15 localhost podman[77049]: 2025-11-23 08:17:15.044436067 +0000 UTC m=+0.099321317 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, vcs-type=git, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, tcib_managed=true, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 23 03:17:15 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:17:15 localhost podman[77050]: 2025-11-23 08:17:15.122167749 +0000 UTC m=+0.177164752 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:17:15 localhost podman[77050]: 2025-11-23 08:17:15.161266136 +0000 UTC m=+0.216263159 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, container_name=iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-iscsid, version=17.1.12) Nov 23 03:17:15 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:17:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:17:17 localhost podman[77087]: 2025-11-23 08:17:17.023492012 +0000 UTC m=+0.081503226 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step5, container_name=nova_compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Nov 23 03:17:17 localhost podman[77087]: 2025-11-23 08:17:17.081281988 +0000 UTC m=+0.139293152 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Nov 23 03:17:17 localhost podman[77087]: unhealthy Nov 23 03:17:17 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:17:17 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'. Nov 23 03:17:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:17:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:17:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:17:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:17:19 localhost podman[77111]: 2025-11-23 08:17:19.040072327 +0000 UTC m=+0.092084358 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container) Nov 23 03:17:19 localhost podman[77111]: 2025-11-23 08:17:19.074491343 +0000 UTC m=+0.126503364 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute) Nov 23 03:17:19 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:17:19 localhost systemd[1]: tmp-crun.gW1i3v.mount: Deactivated successfully. 
Nov 23 03:17:19 localhost podman[77112]: 2025-11-23 08:17:19.118032886 +0000 UTC m=+0.166889900 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, 
maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Nov 23 03:17:19 localhost podman[77112]: 2025-11-23 08:17:19.178429681 +0000 UTC m=+0.227286715 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, tcib_managed=true, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64) Nov 23 03:17:19 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:17:19 localhost podman[77118]: 2025-11-23 08:17:19.190553049 +0000 UTC m=+0.236266928 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible) Nov 23 03:17:19 localhost podman[77110]: 2025-11-23 08:17:19.239928919 +0000 UTC m=+0.295474757 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:17:19 localhost podman[77110]: 2025-11-23 08:17:19.246332673 +0000 UTC m=+0.301878511 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, release=1761123044, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 23 03:17:19 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:17:19 localhost podman[77118]: 2025-11-23 08:17:19.560375432 +0000 UTC m=+0.606089331 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, container_name=nova_migration_target, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, release=1761123044, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, tcib_managed=true) Nov 23 03:17:19 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:17:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:17:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:17:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:17:22 localhost systemd[1]: tmp-crun.jsookL.mount: Deactivated successfully. Nov 23 03:17:22 localhost systemd[1]: tmp-crun.FERFqT.mount: Deactivated successfully. 
Nov 23 03:17:22 localhost podman[77205]: 2025-11-23 08:17:22.036028503 +0000 UTC m=+0.095212153 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vcs-type=git, 
container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public) Nov 23 03:17:22 localhost podman[77206]: 2025-11-23 08:17:22.004101273 +0000 UTC m=+0.061281173 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true) Nov 23 03:17:22 localhost podman[77206]: 2025-11-23 08:17:22.086355822 +0000 UTC m=+0.143535772 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, 
name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, version=17.1.12, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 23 03:17:22 localhost podman[77207]: 2025-11-23 08:17:22.095983554 +0000 UTC m=+0.148821782 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team) Nov 23 03:17:22 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:17:22 localhost podman[77207]: 2025-11-23 08:17:22.116491076 +0000 UTC m=+0.169329294 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:17:22 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:17:22 localhost podman[77205]: 2025-11-23 08:17:22.253499028 +0000 UTC m=+0.312682728 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, release=1761123044, tcib_managed=true, url=https://www.redhat.com, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:17:22 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:17:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:17:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:17:46 localhost podman[77287]: 2025-11-23 08:17:46.027304268 +0000 UTC m=+0.088644794 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd) Nov 23 03:17:46 localhost podman[77287]: 2025-11-23 08:17:46.040487489 +0000 UTC m=+0.101828005 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z) Nov 23 03:17:46 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:17:46 localhost podman[77288]: 2025-11-23 08:17:46.004061022 +0000 UTC m=+0.068440460 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:17:46 localhost podman[77288]: 2025-11-23 08:17:46.084187665 +0000 UTC m=+0.148567093 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4) Nov 23 03:17:46 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:17:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:17:48 localhost podman[77327]: 2025-11-23 08:17:48.026099253 +0000 UTC m=+0.083769856 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute) Nov 23 03:17:48 localhost podman[77327]: 2025-11-23 08:17:48.085837588 +0000 UTC m=+0.143508191 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, architecture=x86_64, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, distribution-scope=public) Nov 23 03:17:48 localhost podman[77327]: unhealthy Nov 23 03:17:48 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:17:48 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'. Nov 23 03:17:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:17:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:17:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:17:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:17:50 localhost podman[77348]: 2025-11-23 08:17:50.03714348 +0000 UTC m=+0.090973014 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com) Nov 23 03:17:50 localhost podman[77348]: 2025-11-23 08:17:50.047297869 +0000 UTC m=+0.101127483 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, version=17.1.12, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, release=1761123044, vcs-type=git, architecture=x86_64) Nov 23 03:17:50 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:17:50 localhost systemd[1]: tmp-crun.CoLnVf.mount: Deactivated successfully. 
Nov 23 03:17:50 localhost podman[77350]: 2025-11-23 08:17:50.088247822 +0000 UTC m=+0.136876118 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible) Nov 23 03:17:50 localhost podman[77350]: 2025-11-23 08:17:50.109194079 +0000 UTC m=+0.157822365 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:17:50 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:17:50 localhost podman[77349]: 2025-11-23 08:17:50.193043376 +0000 UTC m=+0.243854408 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
config_id=tripleo_step4, vcs-type=git, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, container_name=ceilometer_agent_compute) Nov 23 03:17:50 localhost podman[77349]: 2025-11-23 08:17:50.243047235 +0000 UTC m=+0.293858217 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, name=rhosp17/openstack-ceilometer-compute) Nov 23 03:17:50 localhost podman[77351]: 2025-11-23 08:17:50.251046888 +0000 UTC m=+0.291491456 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 
nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:17:50 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:17:50 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:17:50 localhost podman[77351]: 2025-11-23 08:17:50.650716048 +0000 UTC m=+0.691160626 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:17:50 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:17:50 localhost recover_tripleo_nova_virtqemud[77441]: 61756 Nov 23 03:17:50 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:17:50 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:17:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:17:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:17:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:17:53 localhost podman[77444]: 2025-11-23 08:17:53.021127491 +0000 UTC m=+0.073143862 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Nov 23 03:17:53 localhost podman[77444]: 2025-11-23 08:17:53.071248324 +0000 UTC m=+0.123264685 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, 
vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, url=https://www.redhat.com) Nov 23 03:17:53 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:17:53 localhost podman[77443]: 2025-11-23 08:17:53.071876323 +0000 UTC m=+0.125494963 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 23 03:17:53 localhost podman[77442]: 2025-11-23 08:17:53.136684412 +0000 UTC m=+0.190959822 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team) Nov 23 03:17:53 localhost podman[77443]: 2025-11-23 08:17:53.157266487 +0000 UTC m=+0.210885077 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Nov 23 03:17:53 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:17:53 localhost podman[77442]: 2025-11-23 08:17:53.331176629 +0000 UTC m=+0.385452019 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, container_name=metrics_qdr, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public) Nov 23 03:17:53 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:17:54 localhost systemd[1]: tmp-crun.UnHzMz.mount: Deactivated successfully. Nov 23 03:18:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:18:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:18:17 localhost podman[77597]: 2025-11-23 08:18:17.081530963 +0000 UTC m=+0.098942848 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044) Nov 23 03:18:17 localhost podman[77598]: 2025-11-23 08:18:17.143394788 +0000 UTC m=+0.158991792 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 23 03:18:17 localhost podman[77597]: 2025-11-23 08:18:17.168430098 +0000 UTC m=+0.185841993 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, url=https://www.redhat.com, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, version=17.1.12) Nov 23 03:18:17 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:18:17 localhost podman[77598]: 2025-11-23 08:18:17.183360281 +0000 UTC m=+0.198957285 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, config_id=tripleo_step3, container_name=iscsid, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., url=https://www.redhat.com) Nov 23 03:18:17 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:18:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:18:19 localhost podman[77633]: 2025-11-23 08:18:19.028920862 +0000 UTC m=+0.082710357 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-compute, container_name=nova_compute, config_id=tripleo_step5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:18:19 localhost podman[77633]: 2025-11-23 08:18:19.095298906 +0000 UTC m=+0.149088381 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 
'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z) Nov 23 03:18:19 localhost podman[77633]: unhealthy Nov 23 03:18:19 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:18:19 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'. Nov 23 03:18:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:18:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:18:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:18:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:18:21 localhost systemd[1]: tmp-crun.hEBUdc.mount: Deactivated successfully. 
Nov 23 03:18:21 localhost podman[77658]: 2025-11-23 08:18:21.032084493 +0000 UTC m=+0.084342064 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=nova_migration_target, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:18:21 localhost podman[77656]: 2025-11-23 08:18:21.009452311 +0000 UTC m=+0.067197348 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z) Nov 23 03:18:21 localhost podman[77655]: 2025-11-23 08:18:21.111559568 +0000 UTC m=+0.172343551 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron) Nov 23 
03:18:21 localhost podman[77655]: 2025-11-23 08:18:21.118411762 +0000 UTC m=+0.179195805 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:32Z, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-cron, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, maintainer=OpenStack TripleO Team) Nov 23 03:18:21 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:18:21 localhost podman[77657]: 2025-11-23 08:18:21.175749099 +0000 UTC m=+0.230128751 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1) Nov 23 03:18:21 localhost podman[77656]: 2025-11-23 08:18:21.193990036 +0000 UTC m=+0.251735143 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, version=17.1.12, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, name=rhosp17/openstack-ceilometer-compute, 
distribution-scope=public) Nov 23 03:18:21 localhost podman[77657]: 2025-11-23 08:18:21.203435434 +0000 UTC m=+0.257815096 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 23 03:18:21 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:18:21 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:18:21 localhost podman[77658]: 2025-11-23 08:18:21.371868643 +0000 UTC m=+0.424126264 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:18:21 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:18:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:18:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. 
Nov 23 03:18:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:18:24 localhost systemd[1]: tmp-crun.D0HOYu.mount: Deactivated successfully. Nov 23 03:18:24 localhost podman[77753]: 2025-11-23 08:18:24.023464462 +0000 UTC m=+0.080031341 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:18:24 localhost systemd[1]: tmp-crun.JP09Qb.mount: Deactivated successfully. Nov 23 03:18:24 localhost podman[77755]: 2025-11-23 08:18:24.074966973 +0000 UTC m=+0.124963696 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-ovn-controller-container) Nov 23 03:18:24 localhost podman[77755]: 2025-11-23 08:18:24.119600949 +0000 UTC m=+0.169597652 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller) Nov 23 03:18:24 localhost podman[77754]: 2025-11-23 08:18:24.132048092 +0000 UTC m=+0.183972410 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, 
summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4) Nov 23 03:18:24 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:18:24 localhost podman[77754]: 2025-11-23 08:18:24.176538745 +0000 UTC m=+0.228462973 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, vcs-type=git, container_name=ovn_metadata_agent, 
managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4) Nov 23 03:18:24 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:18:24 localhost podman[77753]: 2025-11-23 08:18:24.21830219 +0000 UTC m=+0.274869089 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-type=git, url=https://www.redhat.com, container_name=metrics_qdr) Nov 23 03:18:24 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:18:25 localhost systemd[1]: tmp-crun.z9PkSm.mount: Deactivated successfully. Nov 23 03:18:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:18:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:18:47 localhost podman[77858]: 2025-11-23 08:18:47.326408124 +0000 UTC m=+0.077097057 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 23 03:18:47 localhost systemd[1]: tmp-crun.42wpaN.mount: Deactivated successfully. Nov 23 03:18:47 localhost podman[77858]: 2025-11-23 08:18:47.369730604 +0000 UTC m=+0.120419517 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:18:47 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:18:47 localhost podman[77859]: 2025-11-23 08:18:47.3709918 +0000 UTC m=+0.118531664 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, vcs-type=git, release=1761123044, tcib_managed=true, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:18:47 localhost podman[77859]: 2025-11-23 08:18:47.45736934 +0000 UTC m=+0.204909234 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, container_name=iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=) Nov 23 03:18:47 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:18:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:18:50 localhost systemd[1]: tmp-crun.A5TCAd.mount: Deactivated successfully. 
Nov 23 03:18:50 localhost podman[77961]: 2025-11-23 08:18:50.025600384 +0000 UTC m=+0.083582013 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5) Nov 23 03:18:50 localhost podman[77961]: 2025-11-23 08:18:50.059438804 +0000 UTC m=+0.117420503 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Nov 23 03:18:50 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:18:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:18:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:18:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:18:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:18:52 localhost podman[77989]: 2025-11-23 08:18:52.035821946 +0000 UTC m=+0.087716211 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, 
release=1761123044, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4) Nov 23 03:18:52 localhost podman[77987]: 2025-11-23 08:18:52.081433599 +0000 UTC m=+0.140919078 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, container_name=logrotate_crond, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:18:52 localhost podman[77989]: 2025-11-23 08:18:52.093331987 +0000 UTC m=+0.145226282 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:18:52 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:18:52 localhost podman[77988]: 2025-11-23 08:18:52.156979843 +0000 UTC m=+0.211944385 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, distribution-scope=public, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container) Nov 23 03:18:52 localhost podman[77987]: 2025-11-23 08:18:52.168340035 +0000 UTC m=+0.227825464 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:18:52 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:18:52 localhost podman[77988]: 2025-11-23 08:18:52.191230804 +0000 UTC m=+0.246195366 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z) Nov 23 03:18:52 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:18:52 localhost podman[77995]: 2025-11-23 08:18:52.247545602 +0000 UTC m=+0.297008728 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, container_name=nova_migration_target, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, 
io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, version=17.1.12, distribution-scope=public, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) 
Nov 23 03:18:52 localhost podman[77995]: 2025-11-23 08:18:52.610671194 +0000 UTC m=+0.660134330 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=nova_migration_target, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:18:52 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:18:53 localhost systemd[1]: tmp-crun.98bcJT.mount: Deactivated successfully. Nov 23 03:18:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:18:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:18:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:18:55 localhost systemd[1]: tmp-crun.UsObRt.mount: Deactivated successfully. 
Nov 23 03:18:55 localhost podman[78085]: 2025-11-23 08:18:55.038054321 +0000 UTC m=+0.094957285 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:18:55 localhost podman[78087]: 2025-11-23 08:18:55.084108518 +0000 UTC m=+0.136361910 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:18:55 localhost podman[78087]: 2025-11-23 08:18:55.112155374 +0000 UTC m=+0.164408766 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044) Nov 23 03:18:55 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:18:55 localhost podman[78086]: 2025-11-23 08:18:55.191428022 +0000 UTC m=+0.245739413 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, managed_by=tripleo_ansible) Nov 23 03:18:55 localhost podman[78086]: 2025-11-23 08:18:55.230368037 +0000 UTC m=+0.284679478 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4) Nov 23 03:18:55 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:18:55 localhost podman[78085]: 2025-11-23 08:18:55.265306938 +0000 UTC m=+0.322209822 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044) Nov 23 03:18:55 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:18:56 localhost systemd[1]: tmp-crun.KA7vkC.mount: Deactivated successfully. Nov 23 03:18:57 localhost systemd[1]: libpod-0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8.scope: Deactivated successfully. 
Nov 23 03:18:57 localhost podman[78157]: 2025-11-23 08:18:57.49050206 +0000 UTC m=+0.064244424 container died 0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_wait_for_compute_service, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:36:58Z, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:18:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8-userdata-shm.mount: Deactivated successfully. Nov 23 03:18:57 localhost systemd[1]: var-lib-containers-storage-overlay-82705214f78b7282f0a9b76165b6c261dc19fa9792e62f606d74575b20153483-merged.mount: Deactivated successfully. 
Nov 23 03:18:57 localhost podman[78157]: 2025-11-23 08:18:57.524085273 +0000 UTC m=+0.097827587 container cleanup 0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_wait_for_compute_service, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 23 03:18:57 localhost systemd[1]: libpod-conmon-0edc90bbc4296e62480ca95161a663093e20fec9952cc4aef1f817f63fa308d8.scope: Deactivated successfully. Nov 23 03:18:57 localhost python3[76083]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=39370c45b6a27bfda1ebe1fb9d328c43 --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 23 03:18:58 localhost python3[78210]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:18:58 localhost python3[78226]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True 
get_attributes=True checksum_algorithm=sha1 Nov 23 03:18:59 localhost python3[78287]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1763885938.4695463-118615-85218611173393/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:18:59 localhost python3[78303]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 03:18:59 localhost systemd[1]: Reloading. Nov 23 03:18:59 localhost systemd-rc-local-generator[78328]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:18:59 localhost systemd-sysv-generator[78334]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:18:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:18:59 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:18:59 localhost recover_tripleo_nova_virtqemud[78341]: 61756 Nov 23 03:18:59 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:18:59 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 23 03:19:00 localhost python3[78357]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:19:00 localhost systemd[1]: Reloading. Nov 23 03:19:00 localhost systemd-sysv-generator[78387]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:19:00 localhost systemd-rc-local-generator[78382]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:19:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:19:00 localhost systemd[1]: Starting nova_compute container... Nov 23 03:19:01 localhost tripleo-start-podman-container[78397]: Creating additional drop-in dependency for "nova_compute" (e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce) Nov 23 03:19:01 localhost systemd[1]: Reloading. Nov 23 03:19:01 localhost systemd-rc-local-generator[78451]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:19:01 localhost systemd-sysv-generator[78458]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:19:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:19:01 localhost systemd[1]: Started nova_compute container. Nov 23 03:19:01 localhost systemd[1]: Starting dnf makecache... 
Nov 23 03:19:01 localhost dnf[78466]: Updating Subscription Management repositories. Nov 23 03:19:01 localhost python3[78496]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:19:03 localhost dnf[78466]: Metadata cache refreshed recently. Nov 23 03:19:03 localhost python3[78618]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005532585 step=5 update_config_hash_only=False Nov 23 03:19:03 localhost systemd[1]: dnf-makecache.service: Deactivated successfully. Nov 23 03:19:03 localhost systemd[1]: Finished dnf makecache. Nov 23 03:19:03 localhost systemd[1]: dnf-makecache.service: Consumed 1.953s CPU time. Nov 23 03:19:03 localhost python3[78634]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 03:19:04 localhost python3[78650]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True Nov 23 03:19:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. 
Nov 23 03:19:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:19:18 localhost podman[78729]: 2025-11-23 08:19:18.034654444 +0000 UTC m=+0.086593238 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, container_name=iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 23 03:19:18 localhost podman[78729]: 2025-11-23 08:19:18.071222841 +0000 UTC m=+0.123161625 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:19:18 localhost podman[78728]: 2025-11-23 08:19:18.082885282 +0000 UTC m=+0.135160555 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, tcib_managed=true, vendor=Red Hat, Inc., container_name=collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible) Nov 23 03:19:18 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:19:18 localhost podman[78728]: 2025-11-23 08:19:18.098245488 +0000 UTC m=+0.150520751 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, release=1761123044, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:19:18 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:19:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:19:21 localhost podman[78767]: 2025-11-23 08:19:21.024113967 +0000 UTC m=+0.080122243 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_compute, tcib_managed=true, config_id=tripleo_step5, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1761123044) Nov 23 03:19:21 localhost podman[78767]: 2025-11-23 08:19:21.056333742 +0000 UTC m=+0.112342018 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 
'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, container_name=nova_compute, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public) Nov 23 03:19:21 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:19:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:19:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:19:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:19:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:19:23 localhost podman[78793]: 2025-11-23 08:19:23.030580023 +0000 UTC m=+0.085990821 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat 
OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, container_name=logrotate_crond, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=) Nov 23 03:19:23 localhost podman[78793]: 2025-11-23 08:19:23.039397953 +0000 UTC m=+0.094808721 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12) Nov 23 03:19:23 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:19:23 localhost systemd[1]: tmp-crun.nnnJNR.mount: Deactivated successfully. 
Nov 23 03:19:23 localhost podman[78794]: 2025-11-23 08:19:23.099192639 +0000 UTC m=+0.152982700 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:19:23 localhost podman[78794]: 2025-11-23 08:19:23.127846352 +0000 UTC m=+0.181636473 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12) Nov 23 03:19:23 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:19:23 localhost podman[78795]: 2025-11-23 08:19:23.148863739 +0000 UTC m=+0.198490893 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Nov 23 03:19:23 localhost podman[78799]: 2025-11-23 08:19:23.198748564 +0000 UTC m=+0.244526319 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-nova-compute, distribution-scope=public, tcib_managed=true, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, 
release=1761123044, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:19:23 localhost podman[78795]: 2025-11-23 08:19:23.207290546 +0000 UTC m=+0.256917750 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 23 03:19:23 localhost systemd[1]: 
d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:19:23 localhost podman[78799]: 2025-11-23 08:19:23.581078031 +0000 UTC m=+0.626855776 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, 
distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Nov 23 03:19:23 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:19:24 localhost systemd[1]: tmp-crun.qZPxba.mount: Deactivated successfully. Nov 23 03:19:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:19:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:19:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:19:26 localhost systemd[1]: tmp-crun.lcSStK.mount: Deactivated successfully. 
Nov 23 03:19:26 localhost podman[78885]: 2025-11-23 08:19:26.037299336 +0000 UTC m=+0.096929701 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, container_name=metrics_qdr, distribution-scope=public) Nov 23 03:19:26 localhost podman[78886]: 2025-11-23 08:19:26.002503619 +0000 UTC m=+0.061671670 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, container_name=ovn_metadata_agent, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z) Nov 23 03:19:26 localhost podman[78886]: 2025-11-23 08:19:26.088345254 +0000 UTC m=+0.147513275 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, config_data={'cgroupns': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:19:26 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:19:26 localhost podman[78887]: 2025-11-23 08:19:26.14354711 +0000 UTC m=+0.195027234 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:19:26 localhost podman[78887]: 2025-11-23 08:19:26.165158394 +0000 UTC m=+0.216638528 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Nov 23 03:19:26 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:19:26 localhost podman[78885]: 2025-11-23 08:19:26.221733119 +0000 UTC m=+0.281363544 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, 
io.buildah.version=1.41.4, config_id=tripleo_step1, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:19:26 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:19:32 localhost sshd[78959]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:19:32 localhost systemd-logind[761]: New session 33 of user zuul. Nov 23 03:19:32 localhost systemd[1]: Started Session 33 of User zuul. Nov 23 03:19:33 localhost python3[79068]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 03:19:40 localhost python3[79331]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None Nov 23 03:19:47 localhost python3[79449]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] 
destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None Nov 23 03:19:47 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled Nov 23 03:19:47 localhost systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation. Nov 23 03:19:47 localhost systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 23 03:19:47 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 03:19:47 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 03:19:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:19:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:19:49 localhost podman[79494]: 2025-11-23 08:19:49.039995312 +0000 UTC m=+0.092951058 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, tcib_managed=true, container_name=iscsid, distribution-scope=public, version=17.1.12, release=1761123044, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vcs-type=git) Nov 23 03:19:49 localhost podman[79493]: 2025-11-23 08:19:49.080353298 +0000 UTC m=+0.132971004 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3) Nov 23 03:19:49 localhost podman[79494]: 2025-11-23 08:19:49.108681451 +0000 UTC m=+0.161637197 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, architecture=x86_64, vcs-type=git, version=17.1.12, tcib_managed=true, 
distribution-scope=public) Nov 23 03:19:49 localhost podman[79493]: 2025-11-23 08:19:49.1181729 +0000 UTC m=+0.170790596 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, 
distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z) Nov 23 03:19:49 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:19:49 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:19:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:19:52 localhost podman[79535]: 2025-11-23 08:19:52.020106551 +0000 UTC m=+0.077658424 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible) Nov 23 03:19:52 localhost podman[79535]: 2025-11-23 08:19:52.050261197 +0000 UTC m=+0.107813060 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, distribution-scope=public, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, tcib_managed=true, 
konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 
nova-compute, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.12) Nov 23 03:19:52 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:19:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:19:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:19:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:19:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:19:54 localhost systemd[1]: tmp-crun.wrUks9.mount: Deactivated successfully. 
Nov 23 03:19:54 localhost podman[79562]: 2025-11-23 08:19:54.02426192 +0000 UTC m=+0.081627687 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, container_name=logrotate_crond, architecture=x86_64, name=rhosp17/openstack-cron) Nov 23 03:19:54 localhost podman[79564]: 2025-11-23 08:19:54.055873627 +0000 UTC m=+0.104614649 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044) Nov 23 03:19:54 localhost podman[79563]: 2025-11-23 08:19:54.076863712 +0000 UTC m=+0.130038880 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
build-date=2025-11-19T00:11:48Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, distribution-scope=public) Nov 23 03:19:54 localhost podman[79570]: 2025-11-23 
08:19:54.133283253 +0000 UTC m=+0.179251527 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, 
name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z) Nov 23 03:19:54 localhost podman[79564]: 2025-11-23 08:19:54.139397776 +0000 UTC m=+0.188138818 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 23 03:19:54 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:19:54 localhost podman[79562]: 2025-11-23 08:19:54.157386217 +0000 UTC m=+0.214751984 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, tcib_managed=true, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:19:54 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:19:54 localhost podman[79563]: 2025-11-23 08:19:54.209001681 +0000 UTC m=+0.262176849 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 23 03:19:54 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:19:54 localhost podman[79570]: 2025-11-23 08:19:54.487354199 +0000 UTC m=+0.533322483 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:19:54 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:19:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:19:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:19:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:19:57 localhost podman[79657]: 2025-11-23 08:19:57.034775081 +0000 UTC m=+0.090049755 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1) Nov 23 03:19:57 localhost systemd[1]: tmp-crun.047zH9.mount: Deactivated successfully. Nov 23 03:19:57 localhost podman[79659]: 2025-11-23 08:19:57.101462053 +0000 UTC m=+0.151052916 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, release=1761123044) Nov 23 03:19:57 localhost podman[79659]: 2025-11-23 08:19:57.129214171 +0000 UTC m=+0.178805024 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.12, release=1761123044, build-date=2025-11-18T23:34:05Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': 
'/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container) Nov 23 03:19:57 localhost podman[79658]: 2025-11-23 08:19:57.141343394 +0000 UTC m=+0.193864471 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 23 03:19:57 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:19:57 localhost podman[79658]: 2025-11-23 08:19:57.174451794 +0000 UTC m=+0.226972910 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:19:57 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:19:57 localhost podman[79657]: 2025-11-23 08:19:57.273446282 +0000 UTC m=+0.328720906 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:19:57 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:20:16 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:20:16 localhost recover_tripleo_nova_virtqemud[79811]: 61756 Nov 23 03:20:16 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:20:16 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:20:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:20:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:20:20 localhost podman[79812]: 2025-11-23 08:20:20.054186471 +0000 UTC m=+0.103513788 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, container_name=collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:20:20 localhost podman[79812]: 2025-11-23 08:20:20.064153374 +0000 UTC m=+0.113480741 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, container_name=collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc.) Nov 23 03:20:20 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:20:20 localhost podman[79813]: 2025-11-23 08:20:20.151280796 +0000 UTC m=+0.200593562 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 23 03:20:20 localhost podman[79813]: 2025-11-23 08:20:20.188288956 +0000 UTC m=+0.237601692 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, managed_by=tripleo_ansible, container_name=iscsid, build-date=2025-11-18T23:44:13Z) Nov 23 03:20:20 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:20:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:20:23 localhost podman[79851]: 2025-11-23 08:20:23.049729967 +0000 UTC m=+0.102418116 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:20:23 localhost podman[79851]: 2025-11-23 08:20:23.107544587 +0000 UTC m=+0.160232746 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, 
container_name=nova_compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:20:23 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:20:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:20:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:20:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:20:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:20:25 localhost systemd[1]: tmp-crun.5hsHES.mount: Deactivated successfully. 
Nov 23 03:20:25 localhost podman[79877]: 2025-11-23 08:20:25.011743402 +0000 UTC m=+0.070770158 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, name=rhosp17/openstack-cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:20:25 localhost podman[79877]: 2025-11-23 08:20:25.023183487 +0000 UTC m=+0.082210273 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, build-date=2025-11-18T22:49:32Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond) Nov 23 03:20:25 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:20:25 localhost systemd[1]: tmp-crun.kGNdKt.mount: Deactivated successfully. 
Nov 23 03:20:25 localhost podman[79878]: 2025-11-23 08:20:25.064808018 +0000 UTC m=+0.122184108 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1) Nov 23 03:20:25 localhost podman[79878]: 2025-11-23 08:20:25.092127412 +0000 UTC m=+0.149503432 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:20:25 localhost podman[79880]: 2025-11-23 08:20:25.098954486 +0000 UTC m=+0.154247527 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, 
batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044) Nov 23 03:20:25 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:20:25 localhost podman[79879]: 2025-11-23 08:20:25.161102639 +0000 UTC m=+0.216613396 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 23 03:20:25 localhost podman[79879]: 2025-11-23 08:20:25.205071537 +0000 UTC m=+0.260582304 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:20:25 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:20:25 localhost podman[79880]: 2025-11-23 08:20:25.431666256 +0000 UTC m=+0.486959357 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, container_name=nova_migration_target) Nov 23 03:20:25 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:20:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:20:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:20:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:20:28 localhost systemd[1]: tmp-crun.rbhSeb.mount: Deactivated successfully. 
Nov 23 03:20:28 localhost podman[79969]: 2025-11-23 08:20:28.023648983 +0000 UTC m=+0.079001012 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, tcib_managed=true, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=) Nov 23 03:20:28 localhost podman[79969]: 2025-11-23 08:20:28.118081652 +0000 UTC m=+0.173433641 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, 
build-date=2025-11-19T00:14:25Z, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 23 03:20:28 localhost podman[79968]: 2025-11-23 08:20:28.124122743 +0000 UTC m=+0.182844788 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, distribution-scope=public, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, config_id=tripleo_step1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, 
maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:20:28 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:20:28 localhost podman[79970]: 2025-11-23 08:20:28.169806819 +0000 UTC m=+0.222952385 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:20:28 localhost podman[79970]: 2025-11-23 08:20:28.186096622 +0000 UTC m=+0.239242118 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, container_name=ovn_controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:20:28 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:20:28 localhost podman[79968]: 2025-11-23 08:20:28.315496443 +0000 UTC m=+0.374218498 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, version=17.1.12, container_name=metrics_qdr, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, 
io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, vcs-type=git) Nov 23 03:20:28 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:20:29 localhost systemd[1]: tmp-crun.CjKvx6.mount: Deactivated successfully. Nov 23 03:20:47 localhost systemd[1]: session-33.scope: Deactivated successfully. Nov 23 03:20:47 localhost systemd[1]: session-33.scope: Consumed 5.718s CPU time. Nov 23 03:20:47 localhost systemd-logind[761]: Session 33 logged out. Waiting for processes to exit. Nov 23 03:20:47 localhost systemd-logind[761]: Removed session 33. Nov 23 03:20:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:20:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:20:51 localhost systemd[1]: tmp-crun.ACNw0k.mount: Deactivated successfully. 
Nov 23 03:20:51 localhost podman[80089]: 2025-11-23 08:20:51.0081017 +0000 UTC m=+0.071777618 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, build-date=2025-11-18T22:51:28Z, container_name=collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com) Nov 23 03:20:51 localhost podman[80089]: 2025-11-23 08:20:51.014568923 +0000 UTC m=+0.078244861 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd) Nov 23 03:20:51 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:20:51 localhost podman[80090]: 2025-11-23 08:20:51.041198278 +0000 UTC m=+0.100961225 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-iscsid, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., container_name=iscsid, 
com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, managed_by=tripleo_ansible) Nov 23 03:20:51 localhost podman[80090]: 2025-11-23 08:20:51.046935821 +0000 UTC m=+0.106698788 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 23 03:20:51 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:20:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:20:54 localhost podman[80128]: 2025-11-23 08:20:54.024525698 +0000 UTC m=+0.081961656 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step5, release=1761123044, vcs-type=git) Nov 23 03:20:54 localhost podman[80128]: 2025-11-23 08:20:54.048713145 +0000 UTC m=+0.106149063 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:20:54 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:20:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:20:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:20:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:20:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:20:56 localhost systemd[1]: tmp-crun.JMHFHX.mount: Deactivated successfully. 
Nov 23 03:20:56 localhost podman[80156]: 2025-11-23 08:20:56.040195145 +0000 UTC m=+0.093152034 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, tcib_managed=true, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Nov 23 03:20:56 localhost systemd[1]: tmp-crun.Z36nxq.mount: Deactivated successfully. Nov 23 03:20:56 localhost podman[80155]: 2025-11-23 08:20:56.092508979 +0000 UTC m=+0.145732985 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:20:56 localhost podman[80154]: 2025-11-23 08:20:56.144177215 +0000 UTC m=+0.199498941 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 
17.1 cron, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=logrotate_crond, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4) 
Nov 23 03:20:56 localhost podman[80155]: 2025-11-23 08:20:56.150405121 +0000 UTC m=+0.203629157 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, 
config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:20:56 localhost podman[80154]: 2025-11-23 08:20:56.150911935 +0000 UTC m=+0.206233641 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:20:56 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:20:56 localhost podman[80157]: 2025-11-23 08:20:56.190074436 +0000 UTC m=+0.235674477 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, version=17.1.12) Nov 23 03:20:56 localhost podman[80156]: 2025-11-23 08:20:56.201008427 +0000 UTC m=+0.253965396 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true) Nov 23 03:20:56 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:20:56 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:20:56 localhost podman[80157]: 2025-11-23 08:20:56.550387499 +0000 UTC m=+0.595987520 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, io.buildah.version=1.41.4) Nov 23 03:20:56 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:20:58 localhost sshd[80246]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:20:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:20:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:20:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:20:58 localhost systemd-logind[761]: New session 34 of user zuul. Nov 23 03:20:58 localhost systemd[1]: Started Session 34 of User zuul. Nov 23 03:20:58 localhost systemd[1]: tmp-crun.WCEdOU.mount: Deactivated successfully. 
Nov 23 03:20:58 localhost podman[80255]: 2025-11-23 08:20:58.477577195 +0000 UTC m=+0.132548071 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc.) Nov 23 03:20:58 localhost podman[80248]: 2025-11-23 08:20:58.532776142 +0000 UTC m=+0.196793485 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 23 03:20:58 localhost podman[80249]: 2025-11-23 08:20:58.449488138 +0000 UTC m=+0.108153909 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., 
io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, container_name=ovn_controller, version=17.1.12, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4) Nov 23 03:20:58 localhost podman[80248]: 2025-11-23 08:20:58.566255831 +0000 UTC m=+0.230273154 container exec_died 
5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:20:58 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:20:58 localhost podman[80249]: 2025-11-23 08:20:58.58418488 +0000 UTC m=+0.242850661 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, managed_by=tripleo_ansible) Nov 23 03:20:58 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:20:58 localhost podman[80255]: 2025-11-23 08:20:58.671155567 +0000 UTC m=+0.326126383 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, architecture=x86_64, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:20:58 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:20:58 localhost python3[80335]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 23 03:21:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. 
Nov 23 03:21:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:21:22 localhost podman[80420]: 2025-11-23 08:21:22.025463429 +0000 UTC m=+0.082460141 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 23 03:21:22 localhost podman[80420]: 2025-11-23 08:21:22.061165251 +0000 UTC m=+0.118161973 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64) Nov 23 03:21:22 localhost systemd[1]: tmp-crun.33AQVw.mount: Deactivated successfully. Nov 23 03:21:22 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:21:22 localhost podman[80419]: 2025-11-23 08:21:22.080961644 +0000 UTC m=+0.139262983 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc.) Nov 23 03:21:22 localhost podman[80419]: 2025-11-23 08:21:22.090249977 +0000 UTC m=+0.148551306 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp17/openstack-collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:21:22 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:21:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:21:25 localhost podman[80459]: 2025-11-23 08:21:25.031026872 +0000 UTC m=+0.086284930 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Nov 23 03:21:25 localhost podman[80459]: 2025-11-23 08:21:25.060116696 +0000 UTC m=+0.115374754 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step5, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:21:25 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:21:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:21:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:21:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:21:26 localhost systemd[1]: tmp-crun.wEZHos.mount: Deactivated successfully. 
Nov 23 03:21:26 localhost podman[80499]: 2025-11-23 08:21:26.494522161 +0000 UTC m=+0.061179287 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-cron-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2025-11-18T22:49:32Z) Nov 23 03:21:26 localhost podman[80501]: 2025-11-23 08:21:26.567544773 +0000 UTC m=+0.130769742 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:21:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:21:26 localhost podman[80499]: 2025-11-23 08:21:26.57628099 +0000 UTC m=+0.142938096 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com) Nov 23 03:21:26 localhost podman[80502]: 2025-11-23 08:21:26.573361018 +0000 UTC m=+0.131880183 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64, release=1761123044, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com) Nov 23 03:21:26 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:21:26 localhost podman[80501]: 2025-11-23 08:21:26.619267661 +0000 UTC m=+0.182492680 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:21:26 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:21:26 localhost python3[80500]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 23 03:21:26 localhost podman[80502]: 2025-11-23 08:21:26.660225992 +0000 UTC m=+0.218745227 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 23 03:21:26 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:21:26 localhost podman[80557]: 2025-11-23 08:21:26.624384375 +0000 UTC m=+0.046380256 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, release=1761123044, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git) Nov 23 03:21:27 localhost podman[80557]: 2025-11-23 08:21:27.010743476 +0000 UTC m=+0.432739347 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, container_name=nova_migration_target, distribution-scope=public, tcib_managed=true, release=1761123044, vcs-type=git, 
description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container) Nov 23 03:21:27 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:21:27 localhost systemd[1]: tmp-crun.uhFfKp.mount: Deactivated successfully. Nov 23 03:21:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:21:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:21:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:21:29 localhost podman[80599]: 2025-11-23 08:21:29.006700183 +0000 UTC m=+0.067171647 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12) Nov 23 03:21:29 localhost systemd[1]: tmp-crun.TVDqaD.mount: Deactivated successfully. 
Nov 23 03:21:29 localhost podman[80600]: 2025-11-23 08:21:29.0559754 +0000 UTC m=+0.112732099 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, release=1761123044, container_name=ovn_controller, vcs-type=git) Nov 23 03:21:29 localhost podman[80599]: 2025-11-23 08:21:29.068361122 +0000 UTC m=+0.128832566 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:21:29 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:21:29 localhost systemd[1]: tmp-crun.w8ux2N.mount: Deactivated successfully. 
Nov 23 03:21:29 localhost podman[80598]: 2025-11-23 08:21:29.119268636 +0000 UTC m=+0.179670978 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vendor=Red Hat, Inc., config_id=tripleo_step1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com) Nov 23 03:21:29 localhost podman[80600]: 2025-11-23 08:21:29.171724165 +0000 UTC m=+0.228480894 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, container_name=ovn_controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller) Nov 23 03:21:29 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:21:29 localhost podman[80598]: 2025-11-23 08:21:29.345302739 +0000 UTC m=+0.405705091 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container) Nov 23 03:21:29 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:21:30 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 23 03:21:30 localhost systemd[1]: Starting man-db-cache-update.service... Nov 23 03:21:30 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. 
Nov 23 03:21:30 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 23 03:21:30 localhost systemd[1]: Finished man-db-cache-update.service. Nov 23 03:21:30 localhost systemd[1]: run-ra5997fd30680444abb12eee4ba2ee058.service: Deactivated successfully. Nov 23 03:21:30 localhost systemd[1]: run-r343d4ce72bbd4bcb8d220b30f52876b7.service: Deactivated successfully. Nov 23 03:21:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:21:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:21:53 localhost podman[80870]: 2025-11-23 08:21:53.022099409 +0000 UTC m=+0.079825615 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=) Nov 23 03:21:53 localhost podman[80870]: 2025-11-23 08:21:53.055675482 +0000 UTC m=+0.113401668 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vcs-type=git, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:21:53 localhost podman[80869]: 2025-11-23 08:21:53.078523721 +0000 UTC m=+0.135424594 container health_status 
6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, container_name=collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, 
managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:21:53 localhost podman[80869]: 2025-11-23 08:21:53.112187096 +0000 UTC m=+0.169087979 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:21:53 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:21:53 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:21:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:21:56 localhost systemd[1]: tmp-crun.iyf1wz.mount: Deactivated successfully. 
Nov 23 03:21:56 localhost podman[80910]: 2025-11-23 08:21:56.012001376 +0000 UTC m=+0.069629566 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=nova_compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:21:56 localhost podman[80910]: 2025-11-23 08:21:56.040660359 +0000 UTC m=+0.098288559 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible) Nov 23 03:21:56 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:21:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:21:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:21:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:21:57 localhost systemd[1]: tmp-crun.DhoIIM.mount: Deactivated successfully. 
Nov 23 03:21:57 localhost podman[80937]: 2025-11-23 08:21:57.021053823 +0000 UTC m=+0.078603590 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1761123044, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:21:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:21:57 localhost podman[80937]: 2025-11-23 08:21:57.057873718 +0000 UTC m=+0.115423475 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:21:57 localhost systemd[1]: tmp-crun.cVek3J.mount: Deactivated successfully. Nov 23 03:21:57 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:21:57 localhost podman[80938]: 2025-11-23 08:21:57.070123255 +0000 UTC m=+0.124760000 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4) Nov 23 03:21:57 localhost podman[80939]: 2025-11-23 08:21:57.115407171 +0000 UTC m=+0.167141984 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=) Nov 23 03:21:57 localhost podman[80938]: 2025-11-23 08:21:57.127285177 +0000 UTC m=+0.181921912 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, 
com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 23 03:21:57 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:21:57 localhost podman[80939]: 2025-11-23 08:21:57.139989527 +0000 UTC m=+0.191724340 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true) Nov 23 03:21:57 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:21:57 localhost podman[80976]: 2025-11-23 08:21:57.18625929 +0000 UTC m=+0.141714811 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044) Nov 23 03:21:57 localhost podman[80976]: 2025-11-23 08:21:57.547709505 +0000 UTC m=+0.503164986 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, release=1761123044) Nov 23 03:21:57 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:21:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:21:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:21:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:21:59 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Nov 23 03:21:59 localhost recover_tripleo_nova_virtqemud[81048]: 61756 Nov 23 03:21:59 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:21:59 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:22:00 localhost systemd[1]: tmp-crun.24w6qu.mount: Deactivated successfully. Nov 23 03:22:00 localhost podman[81030]: 2025-11-23 08:22:00.039220061 +0000 UTC m=+0.094934634 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com) Nov 23 03:22:00 localhost systemd[1]: tmp-crun.bue58g.mount: Deactivated successfully. 
Nov 23 03:22:00 localhost podman[81032]: 2025-11-23 08:22:00.14035189 +0000 UTC m=+0.192448250 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git) Nov 23 03:22:00 localhost podman[81032]: 2025-11-23 08:22:00.172337408 +0000 UTC m=+0.224433778 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=ovn_controller, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container) Nov 23 03:22:00 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:22:00 localhost podman[81031]: 2025-11-23 08:22:00.229108009 +0000 UTC m=+0.283572006 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, release=1761123044) Nov 23 03:22:00 localhost podman[81031]: 2025-11-23 08:22:00.296574712 +0000 UTC m=+0.351038679 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 23 03:22:00 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:22:00 localhost podman[81030]: 2025-11-23 08:22:00.347310692 +0000 UTC m=+0.403025275 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, container_name=metrics_qdr, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.openshift.expose-services=) Nov 23 03:22:00 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:22:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 03:22:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.1 total, 600.0 interval
Cumulative writes: 4446 writes, 20K keys, 4446 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 4446 writes, 451 syncs, 9.86 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 03:22:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 03:22:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.1 total, 600.0 interval
Cumulative writes: 5196 writes, 22K keys, 5196 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 5196 writes, 612 syncs, 8.49 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 03:22:09 localhost sshd[81105]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:22:10 localhost python3[81122]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms 
_uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 03:22:13 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 23 03:22:13 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 23 03:22:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:22:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:22:24 localhost podman[81388]: 2025-11-23 08:22:24.00209739 +0000 UTC m=+0.058101649 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, version=17.1.12, container_name=collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 23 03:22:24 localhost podman[81388]: 2025-11-23 08:22:24.013231585 +0000 UTC m=+0.069235864 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., 
name=rhosp17/openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git) Nov 23 03:22:24 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:22:24 localhost podman[81389]: 2025-11-23 08:22:24.04827564 +0000 UTC m=+0.098923128 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:22:24 localhost podman[81389]: 2025-11-23 08:22:24.08634876 +0000 UTC m=+0.136996298 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044) Nov 23 03:22:24 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:22:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:22:27 localhost podman[81428]: 2025-11-23 08:22:27.016821499 +0000 UTC m=+0.073104905 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1) Nov 23 03:22:27 localhost podman[81428]: 2025-11-23 08:22:27.045212185 +0000 UTC m=+0.101495541 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:22:27 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:22:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:22:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:22:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:22:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:22:28 localhost systemd[1]: tmp-crun.BMTKyZ.mount: Deactivated successfully. 
Nov 23 03:22:28 localhost podman[81455]: 2025-11-23 08:22:28.041732177 +0000 UTC m=+0.093658147 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1) Nov 23 03:22:28 localhost systemd[1]: tmp-crun.nYjnMs.mount: Deactivated successfully. Nov 23 03:22:28 localhost podman[81454]: 2025-11-23 08:22:28.090944124 +0000 UTC m=+0.146639611 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4) Nov 23 03:22:28 localhost podman[81454]: 2025-11-23 08:22:28.098609872 +0000 UTC m=+0.154305369 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 23 03:22:28 localhost systemd[1]: 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:22:28 localhost podman[81456]: 2025-11-23 08:22:28.147088077 +0000 UTC m=+0.195075986 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:22:28 localhost podman[81456]: 2025-11-23 08:22:28.170156691 +0000 UTC m=+0.218144650 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, release=1761123044, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc.) Nov 23 03:22:28 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:22:28 localhost podman[81455]: 2025-11-23 08:22:28.221511017 +0000 UTC m=+0.273437007 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:22:28 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:22:28 localhost podman[81462]: 2025-11-23 08:22:28.07316591 +0000 UTC m=+0.114664574 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team) Nov 23 03:22:28 localhost podman[81462]: 2025-11-23 08:22:28.42071628 +0000 UTC m=+0.462214974 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, container_name=nova_migration_target, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:22:28 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:22:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:22:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:22:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:22:31 localhost podman[81548]: 2025-11-23 08:22:31.020787826 +0000 UTC m=+0.079481367 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, release=1761123044, config_id=tripleo_step1, vcs-type=git, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:22:31 localhost systemd[1]: tmp-crun.SZarGf.mount: Deactivated successfully. 
Nov 23 03:22:31 localhost podman[81550]: 2025-11-23 08:22:31.038131607 +0000 UTC m=+0.088967465 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, 
managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, container_name=ovn_controller) Nov 23 03:22:31 localhost podman[81549]: 2025-11-23 08:22:31.081108756 +0000 UTC m=+0.133296012 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, release=1761123044) Nov 23 03:22:31 localhost podman[81550]: 2025-11-23 08:22:31.133118862 +0000 UTC m=+0.183954750 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, container_name=ovn_controller, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Nov 23 03:22:31 localhost podman[81549]: 2025-11-23 08:22:31.140354847 +0000 UTC m=+0.192542143 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 23 03:22:31 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:22:31 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:22:31 localhost podman[81548]: 2025-11-23 08:22:31.198206859 +0000 UTC m=+0.256900430 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, vcs-type=git, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 23 03:22:31 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:22:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:22:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:22:55 localhost systemd[1]: tmp-crun.6eTDD8.mount: Deactivated successfully. 
Nov 23 03:22:55 localhost podman[81668]: 2025-11-23 08:22:55.036420334 +0000 UTC m=+0.087476004 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:22:55 localhost podman[81668]: 2025-11-23 08:22:55.048203896 +0000 UTC m=+0.099259546 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 23 03:22:55 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:22:55 localhost podman[81669]: 2025-11-23 08:22:55.134684219 +0000 UTC m=+0.181605765 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z) Nov 23 03:22:55 localhost podman[81669]: 2025-11-23 08:22:55.174300773 +0000 UTC m=+0.221222329 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:22:55 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:22:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:22:58 localhost systemd[1]: tmp-crun.mLA1dl.mount: Deactivated successfully. 
Nov 23 03:22:58 localhost podman[81709]: 2025-11-23 08:22:58.016424907 +0000 UTC m=+0.070982431 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.openshift.expose-services=) Nov 23 03:22:58 localhost podman[81709]: 2025-11-23 08:22:58.046176536 +0000 UTC m=+0.100734050 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step5, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc.) Nov 23 03:22:58 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:22:58 localhost sshd[81735]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:22:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:22:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:22:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:22:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:22:59 localhost systemd[1]: tmp-crun.027OmC.mount: Deactivated successfully. 
Nov 23 03:22:59 localhost podman[81738]: 2025-11-23 08:22:59.06517186 +0000 UTC m=+0.113863912 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible) Nov 23 03:22:59 localhost podman[81739]: 2025-11-23 08:22:59.032100952 +0000 UTC m=+0.078162745 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:22:59 localhost podman[81736]: 2025-11-23 08:22:59.083082605 +0000 UTC m=+0.138000092 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z) Nov 23 03:22:59 localhost podman[81738]: 2025-11-23 08:22:59.116263426 +0000 UTC m=+0.164955468 container exec_died 
d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, 
release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team) Nov 23 03:22:59 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:22:59 localhost podman[81737]: 2025-11-23 08:22:59.128236243 +0000 UTC m=+0.178848082 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:22:59 localhost podman[81736]: 2025-11-23 08:22:59.167218438 +0000 UTC m=+0.222135855 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, distribution-scope=public, vendor=Red Hat, 
Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:22:59 localhost systemd[1]: 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:22:59 localhost podman[81737]: 2025-11-23 08:22:59.178309179 +0000 UTC m=+0.228920998 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 23 03:22:59 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:22:59 localhost podman[81739]: 2025-11-23 08:22:59.402395582 +0000 UTC m=+0.448457405 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, container_name=nova_migration_target, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Nov 23 03:22:59 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:23:00 localhost systemd[1]: tmp-crun.zM9UEk.mount: Deactivated successfully. Nov 23 03:23:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:23:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:23:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:23:02 localhost podman[81831]: 2025-11-23 08:23:02.028431463 +0000 UTC m=+0.083624258 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Nov 23 03:23:02 localhost podman[81830]: 2025-11-23 08:23:02.007474697 +0000 UTC m=+0.066708143 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 
03:23:02 localhost podman[81832]: 2025-11-23 08:23:02.073477348 +0000 UTC m=+0.126923111 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:23:02 localhost podman[81831]: 2025-11-23 08:23:02.08522867 +0000 UTC m=+0.140421485 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044) Nov 23 03:23:02 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:23:02 localhost podman[81832]: 2025-11-23 08:23:02.098243208 +0000 UTC m=+0.151688971 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container) Nov 23 03:23:02 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:23:02 localhost podman[81830]: 2025-11-23 08:23:02.189156304 +0000 UTC m=+0.248389730 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:23:02 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:23:05 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:23:05 localhost recover_tripleo_nova_virtqemud[81920]: 61756 Nov 23 03:23:05 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:23:05 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:23:05 localhost python3[81918]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 03:23:08 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. 
Nov 23 03:23:09 localhost rhsm-service[6595]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 23 03:23:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:23:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:23:26 localhost podman[82236]: 2025-11-23 08:23:26.011836279 +0000 UTC m=+0.065981542 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1761123044, build-date=2025-11-18T22:51:28Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:23:26 localhost podman[82236]: 2025-11-23 08:23:26.022189417 +0000 UTC m=+0.076334660 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., architecture=x86_64, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:23:26 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:23:26 localhost podman[82237]: 2025-11-23 08:23:26.025049913 +0000 UTC m=+0.076735563 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:23:26 localhost podman[82237]: 2025-11-23 08:23:26.104734523 +0000 UTC m=+0.156420193 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, container_name=iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 23 03:23:26 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:23:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:23:29 localhost podman[82274]: 2025-11-23 08:23:29.032051603 +0000 UTC m=+0.087689951 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:23:29 localhost podman[82274]: 2025-11-23 08:23:29.063327907 +0000 UTC m=+0.118966265 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, 
architecture=x86_64, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, config_id=tripleo_step5, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container) Nov 23 03:23:29 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:23:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:23:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:23:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:23:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:23:29 localhost podman[82302]: 2025-11-23 08:23:29.99492641 +0000 UTC m=+0.055213310 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:23:30 localhost podman[82302]: 2025-11-23 08:23:30.016867895 +0000 UTC m=+0.077154875 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:12:45Z, distribution-scope=public, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, release=1761123044, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:23:30 localhost systemd[1]: tmp-crun.cJzA1c.mount: Deactivated successfully. 
Nov 23 03:23:30 localhost podman[82303]: 2025-11-23 08:23:30.030243675 +0000 UTC m=+0.084564877 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, container_name=nova_migration_target, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat 
OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Nov 23 03:23:30 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:23:30 localhost podman[82301]: 2025-11-23 08:23:30.072073854 +0000 UTC m=+0.128860089 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc.) 
Nov 23 03:23:30 localhost podman[82301]: 2025-11-23 08:23:30.121204352 +0000 UTC m=+0.177990567 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, version=17.1.12) Nov 23 03:23:30 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:23:30 localhost podman[82300]: 2025-11-23 08:23:30.139024574 +0000 UTC m=+0.198065687 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=logrotate_crond, 
maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:23:30 localhost podman[82300]: 2025-11-23 08:23:30.148202238 +0000 UTC m=+0.207243391 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-cron, distribution-scope=public, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Nov 23 03:23:30 localhost systemd[1]: 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:23:30 localhost podman[82303]: 2025-11-23 08:23:30.39336177 +0000 UTC m=+0.447682962 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, architecture=x86_64, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, container_name=nova_migration_target, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, distribution-scope=public) Nov 23 03:23:30 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:23:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:23:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:23:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:23:32 localhost systemd[1]: tmp-crun.LmGFK0.mount: Deactivated successfully. Nov 23 03:23:33 localhost systemd[1]: tmp-crun.PxXjXV.mount: Deactivated successfully. 
Nov 23 03:23:33 localhost podman[82391]: 2025-11-23 08:23:33.054734517 +0000 UTC m=+0.110649607 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:23:33 localhost podman[82393]: 2025-11-23 08:23:33.00564985 +0000 UTC m=+0.063819237 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:23:33 localhost podman[82392]: 2025-11-23 08:23:33.139352374 +0000 UTC m=+0.193265154 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 23 03:23:33 localhost podman[82392]: 2025-11-23 08:23:33.169576417 
+0000 UTC m=+0.223489177 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team) Nov 23 03:23:33 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:23:33 localhost podman[82393]: 2025-11-23 08:23:33.186975776 +0000 UTC m=+0.245145163 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., 
com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, url=https://www.redhat.com) Nov 23 03:23:33 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:23:33 localhost podman[82391]: 2025-11-23 08:23:33.252497553 +0000 UTC m=+0.308412653 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1761123044, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:23:33 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:23:34 localhost python3[82480]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname Nov 23 03:23:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:23:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:23:57 localhost podman[82527]: 2025-11-23 08:23:57.031099102 +0000 UTC m=+0.081997620 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:23:57 localhost podman[82527]: 2025-11-23 08:23:57.042434651 +0000 UTC m=+0.093333109 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container) Nov 23 03:23:57 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:23:57 localhost podman[82526]: 2025-11-23 08:23:57.13045796 +0000 UTC m=+0.185727138 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com) Nov 23 03:23:57 localhost podman[82526]: 2025-11-23 08:23:57.139973514 +0000 UTC m=+0.195242702 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, container_name=collectd, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:23:57 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:23:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:24:00 localhost podman[82564]: 2025-11-23 08:24:00.015920519 +0000 UTC m=+0.073045613 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:24:00 localhost podman[82564]: 2025-11-23 08:24:00.041728359 +0000 UTC m=+0.098853403 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 
'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, version=17.1.12) Nov 23 03:24:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:24:00 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:24:00 localhost podman[82590]: 2025-11-23 08:24:00.151961592 +0000 UTC m=+0.077011491 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Nov 23 03:24:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:24:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:24:00 localhost podman[82590]: 2025-11-23 08:24:00.210258603 +0000 UTC m=+0.135308472 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 23 03:24:00 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:24:00 localhost podman[82624]: 2025-11-23 08:24:00.276944555 +0000 UTC m=+0.081310960 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=logrotate_crond, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12) Nov 23 03:24:00 localhost podman[82624]: 2025-11-23 08:24:00.314525507 +0000 UTC m=+0.118891942 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true) Nov 23 03:24:00 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:24:00 localhost podman[82610]: 2025-11-23 08:24:00.357696207 +0000 UTC m=+0.188202952 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, 
release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:24:00 localhost podman[82610]: 2025-11-23 08:24:00.390185977 +0000 UTC m=+0.220692722 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:24:00 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:24:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:24:01 localhost podman[82662]: 2025-11-23 08:24:01.019242625 +0000 UTC m=+0.075906958 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, tcib_managed=true) Nov 23 03:24:01 localhost podman[82662]: 2025-11-23 08:24:01.403280905 +0000 UTC m=+0.459945248 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) Nov 23 03:24:01 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:24:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:24:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:24:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:24:04 localhost podman[82685]: 2025-11-23 08:24:04.014427919 +0000 UTC m=+0.076927028 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step1, tcib_managed=true, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:24:04 localhost systemd[1]: tmp-crun.2x1J0B.mount: Deactivated successfully. Nov 23 03:24:04 localhost podman[82686]: 2025-11-23 08:24:04.071958208 +0000 UTC m=+0.130107688 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Nov 23 03:24:04 localhost podman[82687]: 2025-11-23 08:24:04.09781961 +0000 UTC m=+0.150535437 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller) Nov 23 03:24:04 localhost podman[82686]: 2025-11-23 08:24:04.104075557 +0000 UTC m=+0.162225037 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 23 03:24:04 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:24:04 localhost podman[82687]: 2025-11-23 08:24:04.152284016 +0000 UTC m=+0.204999853 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, version=17.1.12, container_name=ovn_controller, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=) Nov 23 03:24:04 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:24:04 localhost podman[82685]: 2025-11-23 08:24:04.192273681 +0000 UTC m=+0.254772840 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true) Nov 23 03:24:04 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:24:05 localhost systemd[1]: tmp-crun.mNEBVf.mount: Deactivated successfully. Nov 23 03:24:21 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:24:21 localhost recover_tripleo_nova_virtqemud[82773]: 61756 Nov 23 03:24:21 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:24:21 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:24:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:24:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:24:28 localhost podman[82838]: 2025-11-23 08:24:28.034694673 +0000 UTC m=+0.087470634 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat 
OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 23 03:24:28 localhost podman[82838]: 2025-11-23 08:24:28.045287509 +0000 UTC m=+0.098063450 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, release=1761123044, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:24:28 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:24:28 localhost systemd[1]: tmp-crun.hIH4GE.mount: Deactivated successfully. 
Nov 23 03:24:28 localhost podman[82837]: 2025-11-23 08:24:28.128181465 +0000 UTC m=+0.181171172 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-type=git, release=1761123044) Nov 23 03:24:28 localhost podman[82837]: 2025-11-23 08:24:28.136174614 +0000 UTC m=+0.189164251 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container) Nov 23 03:24:28 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:24:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:24:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:24:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:24:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:24:31 localhost systemd[1]: tmp-crun.b6iQna.mount: Deactivated successfully. Nov 23 03:24:31 localhost podman[82887]: 2025-11-23 08:24:31.070550163 +0000 UTC m=+0.115950444 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12) Nov 23 03:24:31 localhost podman[82878]: 2025-11-23 08:24:31.113472745 +0000 UTC m=+0.171608717 container health_status 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git) Nov 23 03:24:31 localhost podman[82879]: 2025-11-23 08:24:31.04199116 +0000 UTC m=+0.093365349 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z) Nov 23 03:24:31 localhost podman[82880]: 2025-11-23 08:24:31.100553669 +0000 UTC m=+0.150442605 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:24:31 localhost podman[82879]: 2025-11-23 08:24:31.176119036 +0000 UTC m=+0.227493235 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Nov 23 03:24:31 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:24:31 localhost podman[82878]: 2025-11-23 08:24:31.197169395 +0000 UTC m=+0.255305367 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, managed_by=tripleo_ansible) Nov 23 03:24:31 localhost podman[82887]: 2025-11-23 08:24:31.2043927 +0000 UTC m=+0.249792961 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red 
Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, version=17.1.12, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 23 03:24:31 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:24:31 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:24:31 localhost podman[82880]: 2025-11-23 08:24:31.23083261 +0000 UTC m=+0.280721576 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:24:31 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:24:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:24:31 localhost podman[82978]: 2025-11-23 08:24:31.990435467 +0000 UTC m=+0.051978814 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1) Nov 23 03:24:32 localhost podman[82978]: 2025-11-23 08:24:32.365376075 +0000 UTC m=+0.426919382 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_migration_target, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute) Nov 23 03:24:32 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:24:34 localhost systemd[1]: session-34.scope: Deactivated successfully. Nov 23 03:24:34 localhost systemd[1]: session-34.scope: Consumed 19.353s CPU time. Nov 23 03:24:34 localhost systemd-logind[761]: Session 34 logged out. Waiting for processes to exit. Nov 23 03:24:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:24:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. 
Nov 23 03:24:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:24:34 localhost systemd-logind[761]: Removed session 34. Nov 23 03:24:34 localhost systemd[1]: tmp-crun.WT0TN7.mount: Deactivated successfully. Nov 23 03:24:34 localhost podman[83002]: 2025-11-23 08:24:34.70796627 +0000 UTC m=+0.085917337 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 23 03:24:34 localhost systemd[1]: tmp-crun.FFAgr5.mount: Deactivated successfully. 
Nov 23 03:24:34 localhost podman[83001]: 2025-11-23 08:24:34.767003744 +0000 UTC m=+0.147356763 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, architecture=x86_64, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vendor=Red Hat, Inc.) Nov 23 03:24:34 localhost podman[83003]: 2025-11-23 08:24:34.809114451 +0000 UTC m=+0.183498132 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12) Nov 23 03:24:34 localhost podman[83002]: 2025-11-23 08:24:34.837919421 +0000 UTC m=+0.215870448 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 23 03:24:34 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:24:34 localhost podman[83003]: 2025-11-23 08:24:34.859222578 +0000 UTC m=+0.233606179 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, container_name=ovn_controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, 
name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 23 03:24:34 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:24:34 localhost podman[83001]: 2025-11-23 08:24:34.950233596 +0000 UTC m=+0.330586565 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team) Nov 23 03:24:34 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:24:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:24:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:24:59 localhost podman[83125]: 2025-11-23 08:24:59.02831506 +0000 UTC m=+0.080693371 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, com.redhat.component=openstack-collectd-container, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git) Nov 23 03:24:59 localhost podman[83125]: 2025-11-23 08:24:59.041272508 +0000 UTC m=+0.093650809 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-collectd-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com) Nov 23 03:24:59 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:24:59 localhost podman[83126]: 2025-11-23 08:24:59.132176522 +0000 UTC m=+0.184202292 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.12, release=1761123044, vcs-type=git, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:24:59 localhost podman[83126]: 2025-11-23 08:24:59.169358693 +0000 UTC m=+0.221384443 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:24:59 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:25:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:25:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:25:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:25:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:25:02 localhost podman[83168]: 2025-11-23 08:25:02.04968954 +0000 UTC m=+0.095003299 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=nova_compute, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.buildah.version=1.41.4) Nov 23 03:25:02 localhost podman[83165]: 2025-11-23 08:25:02.086904801 +0000 UTC m=+0.141731335 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:25:02 localhost podman[83165]: 2025-11-23 08:25:02.094493178 +0000 UTC m=+0.149319762 container exec_died 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:25:02 localhost podman[83168]: 2025-11-23 08:25:02.105020952 +0000 UTC m=+0.150334671 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, config_id=tripleo_step5, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64) Nov 23 03:25:02 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:25:02 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:25:02 localhost podman[83167]: 2025-11-23 08:25:02.139402129 +0000 UTC m=+0.187866322 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, build-date=2025-11-19T00:12:45Z, distribution-scope=public, architecture=x86_64, container_name=ceilometer_agent_ipmi) Nov 23 03:25:02 localhost podman[83166]: 2025-11-23 08:25:02.19871303 +0000 UTC m=+0.249887144 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com) Nov 23 03:25:02 localhost podman[83167]: 2025-11-23 08:25:02.21679856 +0000 UTC m=+0.265262783 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Nov 23 03:25:02 localhost podman[83166]: 2025-11-23 
08:25:02.228730277 +0000 UTC m=+0.279904361 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, 
architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:25:02 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:25:02 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:25:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:25:03 localhost podman[83269]: 2025-11-23 08:25:03.021258457 +0000 UTC m=+0.079889607 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute) Nov 23 03:25:03 localhost podman[83269]: 2025-11-23 08:25:03.413963186 +0000 UTC m=+0.472594376 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, tcib_managed=true, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:25:03 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:25:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:25:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:25:05 localhost podman[83293]: 2025-11-23 08:25:05.006294612 +0000 UTC m=+0.068043353 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:25:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:25:05 localhost systemd[1]: tmp-crun.jFzhTM.mount: Deactivated successfully. 
Nov 23 03:25:05 localhost podman[83323]: 2025-11-23 08:25:05.094530558 +0000 UTC m=+0.065054115 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step1, container_name=metrics_qdr, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com) Nov 23 03:25:05 localhost podman[83294]: 2025-11-23 08:25:05.062415238 +0000 UTC m=+0.119320184 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git) Nov 23 03:25:05 localhost podman[83293]: 2025-11-23 08:25:05.119515054 +0000 UTC m=+0.181263735 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:25:05 localhost 
systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:25:05 localhost podman[83294]: 2025-11-23 08:25:05.147265423 +0000 UTC m=+0.204170409 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:25:05 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:25:05 localhost podman[83323]: 2025-11-23 08:25:05.290188712 +0000 UTC m=+0.260712229 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true) Nov 23 03:25:05 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:25:22 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:25:22 localhost recover_tripleo_nova_virtqemud[83383]: 61756 Nov 23 03:25:22 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:25:22 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 23 03:25:23 localhost podman[83471]: 2025-11-23 08:25:23.410498733 +0000 UTC m=+0.080060312 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, version=7, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, release=553, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 03:25:23 localhost podman[83471]: 2025-11-23 08:25:23.515631053 +0000 UTC m=+0.185192642 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , architecture=x86_64, io.openshift.expose-services=, name=rhceph, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, ceph=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, distribution-scope=public, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., version=7) Nov 23 03:25:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:25:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:25:30 localhost podman[83616]: 2025-11-23 08:25:30.018923086 +0000 UTC m=+0.075128055 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, container_name=collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, url=https://www.redhat.com) Nov 23 03:25:30 localhost podman[83616]: 2025-11-23 08:25:30.059294771 +0000 UTC m=+0.115499740 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, tcib_managed=true, name=rhosp17/openstack-collectd, 
build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, distribution-scope=public, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:25:30 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:25:30 localhost podman[83617]: 2025-11-23 08:25:30.075969769 +0000 UTC m=+0.132132447 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 23 03:25:30 localhost podman[83617]: 2025-11-23 08:25:30.084231726 +0000 UTC m=+0.140394404 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, 
com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, config_id=tripleo_step3, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-type=git, container_name=iscsid, io.buildah.version=1.41.4) Nov 23 03:25:30 localhost 
systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:25:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:25:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:25:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:25:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:25:33 localhost systemd[1]: tmp-crun.QX6aIr.mount: Deactivated successfully. Nov 23 03:25:33 localhost podman[83656]: 2025-11-23 08:25:33.044930622 +0000 UTC m=+0.095884584 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=) Nov 23 03:25:33 localhost podman[83656]: 2025-11-23 08:25:33.055052295 +0000 UTC m=+0.106006257 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-cron, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:25:33 localhost podman[83662]: 2025-11-23 08:25:33.065085874 +0000 UTC m=+0.106923144 container health_status 
e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, version=17.1.12, config_id=tripleo_step5, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, release=1761123044) Nov 23 03:25:33 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:25:33 localhost podman[83662]: 2025-11-23 08:25:33.096199703 +0000 UTC m=+0.138037003 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, version=17.1.12) Nov 23 03:25:33 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:25:33 localhost podman[83658]: 2025-11-23 08:25:33.111172221 +0000 UTC m=+0.153932248 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Nov 23 03:25:33 localhost podman[83657]: 2025-11-23 08:25:33.161088511 +0000 UTC m=+0.207643492 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, distribution-scope=public) Nov 23 03:25:33 localhost podman[83657]: 2025-11-23 08:25:33.214663092 +0000 UTC m=+0.261218073 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, vcs-type=git, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:25:33 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:25:33 localhost podman[83658]: 2025-11-23 08:25:33.26615729 +0000 UTC m=+0.308917357 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., architecture=x86_64) Nov 23 03:25:33 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:25:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:25:34 localhost podman[83753]: 2025-11-23 08:25:34.004324336 +0000 UTC m=+0.067548279 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, release=1761123044, vcs-type=git, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Nov 23 03:25:34 localhost podman[83753]: 2025-11-23 08:25:34.380239094 +0000 UTC m=+0.443462987 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.12) Nov 23 03:25:34 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:25:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:25:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:25:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:25:36 localhost podman[83854]: 2025-11-23 08:25:36.010916307 +0000 UTC m=+0.066776666 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12) Nov 23 03:25:36 localhost systemd[1]: tmp-crun.zrOZ8K.mount: Deactivated successfully. Nov 23 03:25:36 localhost podman[83855]: 2025-11-23 08:25:36.092472342 +0000 UTC m=+0.144499247 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible) Nov 23 03:25:36 localhost podman[83856]: 2025-11-23 08:25:36.170359929 +0000 UTC m=+0.218861548 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:25:36 localhost podman[83855]: 2025-11-23 08:25:36.202716125 +0000 UTC m=+0.254743010 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, io.openshift.expose-services=) Nov 23 03:25:36 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:25:36 localhost podman[83856]: 2025-11-23 08:25:36.214684893 +0000 UTC m=+0.263186542 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=) Nov 23 03:25:36 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:25:36 localhost podman[83854]: 2025-11-23 08:25:36.270812889 +0000 UTC m=+0.326673308 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible) Nov 23 03:25:36 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:25:41 localhost systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring. Nov 23 03:25:41 localhost systemd[1]: Created slice User Slice of UID 0. Nov 23 03:25:41 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Nov 23 03:25:41 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Nov 23 03:25:41 localhost systemd[1]: Starting User Manager for UID 0... Nov 23 03:25:41 localhost systemd[84232]: Queued start job for default target Main User Target. Nov 23 03:25:41 localhost systemd[84232]: Created slice User Application Slice. Nov 23 03:25:41 localhost systemd[84232]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Nov 23 03:25:41 localhost systemd[84232]: Started Daily Cleanup of User's Temporary Directories. 
Nov 23 03:25:41 localhost systemd[84232]: Reached target Paths. Nov 23 03:25:41 localhost systemd[84232]: Reached target Timers. Nov 23 03:25:41 localhost systemd[84232]: Starting D-Bus User Message Bus Socket... Nov 23 03:25:41 localhost systemd[84232]: Starting Create User's Volatile Files and Directories... Nov 23 03:25:41 localhost systemd[84232]: Finished Create User's Volatile Files and Directories. Nov 23 03:25:41 localhost systemd[84232]: Listening on D-Bus User Message Bus Socket. Nov 23 03:25:41 localhost systemd[84232]: Reached target Sockets. Nov 23 03:25:41 localhost systemd[84232]: Reached target Basic System. Nov 23 03:25:41 localhost systemd[84232]: Reached target Main User Target. Nov 23 03:25:41 localhost systemd[84232]: Startup finished in 156ms. Nov 23 03:25:41 localhost systemd[1]: Started User Manager for UID 0. Nov 23 03:25:41 localhost systemd[1]: Started Session c11 of User root. Nov 23 03:25:43 localhost kernel: tun: Universal TUN/TAP device driver, 1.6 Nov 23 03:25:43 localhost kernel: device tapd3912d14-a3 entered promiscuous mode Nov 23 03:25:43 localhost NetworkManager[5975]: [1763886343.0359] manager: (tapd3912d14-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/13) Nov 23 03:25:43 localhost systemd-udevd[84267]: Network interface NamePolicy= disabled on kernel command line. Nov 23 03:25:43 localhost NetworkManager[5975]: [1763886343.0556] device (tapd3912d14-a3): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Nov 23 03:25:43 localhost NetworkManager[5975]: [1763886343.0564] device (tapd3912d14-a3): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Nov 23 03:25:43 localhost systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Nov 23 03:25:43 localhost systemd[1]: Starting Virtual Machine and Container Registration Service... 
Nov 23 03:25:43 localhost systemd[1]: Started Virtual Machine and Container Registration Service. Nov 23 03:25:43 localhost systemd-machined[84275]: New machine qemu-1-instance-00000002. Nov 23 03:25:43 localhost systemd[1]: Started Virtual Machine qemu-1-instance-00000002. Nov 23 03:25:43 localhost NetworkManager[5975]: [1763886343.2965] manager: (tapbcac49fc-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/14) Nov 23 03:25:43 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapbcac49fc-c1: link becomes ready Nov 23 03:25:43 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapbcac49fc-c0: link becomes ready Nov 23 03:25:43 localhost NetworkManager[5975]: [1763886343.3419] device (tapbcac49fc-c0): carrier: link connected Nov 23 03:25:43 localhost kernel: device tapbcac49fc-c0 entered promiscuous mode Nov 23 03:25:45 localhost podman[84392]: 2025-11-23 08:25:45.049232611 +0000 UTC m=+0.089528275 container create 6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:25:45 localhost systemd[1]: Started libpod-conmon-6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799.scope. Nov 23 03:25:45 localhost podman[84392]: 2025-11-23 08:25:45.008451933 +0000 UTC m=+0.048747667 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Nov 23 03:25:45 localhost systemd[1]: Started libcrun container. Nov 23 03:25:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/086324888e3aef5fa52615eb760fec66e6f8cc0743a0416438a6f968d544d4e3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 03:25:45 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... 
Nov 23 03:25:45 localhost podman[84392]: 2025-11-23 08:25:45.129681304 +0000 UTC m=+0.169976988 container init 6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:25:45 localhost podman[84392]: 2025-11-23 08:25:45.139884628 +0000 UTC m=+0.180180292 container start 6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1) Nov 23 03:25:45 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Nov 23 03:25:45 localhost systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged. Nov 23 03:25:45 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service. Nov 23 03:25:46 localhost setroubleshoot[84409]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. 
For complete SELinux messages run: sealert -l a701375f-2314-41e7-b9ff-3bc5dfd0e157 Nov 23 03:25:46 localhost setroubleshoot[84409]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.

***** Plugin qemu_file_image (98.8 confidence) suggests *******************

If max_map_count is a virtualization target
Then you need to change the label on max_map_count'
Do
# semanage fcontext -a -t virt_image_t 'max_map_count'
# restorecon -v 'max_map_count'

***** Plugin catchall (2.13 confidence) suggests **************************

If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.
Then you should report this as a bug.
You can generate a local policy module to allow this access.
Do
allow this access for now by executing:
# ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm
# semodule -X 300 -i my-qemukvm.pp
 Nov 23 03:25:54 localhost snmpd[67457]: empty variable list in _query Nov 23 03:25:54 localhost snmpd[67457]: empty variable list in _query Nov 23 03:25:55 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully. Nov 23 03:25:56 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. Nov 23 03:26:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:26:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:26:01 localhost systemd[1]: tmp-crun.NFmG9s.mount: Deactivated successfully. 
Nov 23 03:26:01 localhost podman[84475]: 2025-11-23 08:26:01.046982862 +0000 UTC m=+0.094931197 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, 
io.buildah.version=1.41.4, container_name=collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc.) Nov 23 03:26:01 localhost podman[84475]: 2025-11-23 08:26:01.06333273 +0000 UTC m=+0.111281145 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team) Nov 23 03:26:01 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:26:01 localhost podman[84476]: 2025-11-23 08:26:01.138794294 +0000 UTC m=+0.185916284 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z) Nov 23 03:26:01 localhost podman[84476]: 2025-11-23 08:26:01.179686085 +0000 UTC m=+0.226808135 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, distribution-scope=public, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044) Nov 23 03:26:01 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:26:02 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48596 [23/Nov/2025:08:26:01.431] listener listener/metadata 0/0/0/1122/1122 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Nov 23 03:26:02 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48600 [23/Nov/2025:08:26:02.645] listener listener/metadata 0/0/0/18/18 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Nov 23 03:26:02 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48602 [23/Nov/2025:08:26:02.708] listener listener/metadata 0/0/0/10/10 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Nov 23 03:26:02 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48606 [23/Nov/2025:08:26:02.760] listener listener/metadata 0/0/0/9/9 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Nov 23 03:26:02 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48612 [23/Nov/2025:08:26:02.812] listener listener/metadata 0/0/0/8/8 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Nov 23 03:26:02 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48620 [23/Nov/2025:08:26:02.865] listener listener/metadata 0/0/0/11/11 200 132 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Nov 23 03:26:02 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48622 [23/Nov/2025:08:26:02.919] listener listener/metadata 0/0/0/10/10 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Nov 23 03:26:02 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48628 [23/Nov/2025:08:26:02.971] listener listener/metadata 0/0/0/11/11 
200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Nov 23 03:26:03 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48638 [23/Nov/2025:08:26:03.026] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Nov 23 03:26:03 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48642 [23/Nov/2025:08:26:03.084] listener listener/metadata 0/0/0/11/11 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Nov 23 03:26:03 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48658 [23/Nov/2025:08:26:03.142] listener listener/metadata 0/0/0/13/13 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Nov 23 03:26:03 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48664 [23/Nov/2025:08:26:03.185] listener listener/metadata 0/0/0/12/12 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Nov 23 03:26:03 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48680 [23/Nov/2025:08:26:03.229] listener listener/metadata 0/0/0/11/11 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Nov 23 03:26:03 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48692 [23/Nov/2025:08:26:03.268] listener listener/metadata 0/0/0/11/11 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Nov 23 03:26:03 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48704 [23/Nov/2025:08:26:03.320] listener listener/metadata 0/0/0/19/19 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Nov 23 03:26:03 localhost 
haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[84414]: 192.168.0.77:48714 [23/Nov/2025:08:26:03.377] listener listener/metadata 0/0/0/11/11 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Nov 23 03:26:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:26:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:26:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:26:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:26:04 localhost systemd[1]: tmp-crun.4Bv17O.mount: Deactivated successfully. Nov 23 03:26:04 localhost systemd[1]: tmp-crun.B2KBtL.mount: Deactivated successfully. Nov 23 03:26:04 localhost podman[84514]: 2025-11-23 08:26:04.018185452 +0000 UTC m=+0.068634601 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true) Nov 23 03:26:04 localhost podman[84512]: 2025-11-23 08:26:04.045293262 +0000 UTC m=+0.098065230 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, 
description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=) Nov 23 03:26:04 localhost podman[84514]: 2025-11-23 08:26:04.102324575 +0000 UTC m=+0.152773744 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=ceilometer_agent_ipmi, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public) Nov 23 03:26:04 localhost podman[84515]: 2025-11-23 08:26:04.111067397 +0000 UTC m=+0.151761635 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=nova_compute, release=1761123044) Nov 23 03:26:04 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:26:04 localhost podman[84512]: 2025-11-23 08:26:04.131435915 +0000 UTC m=+0.184207883 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond) Nov 23 03:26:04 localhost podman[84515]: 2025-11-23 08:26:04.133420434 +0000 UTC m=+0.174114682 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack 
Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Nov 
23 03:26:04 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:26:04 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:26:04 localhost podman[84513]: 2025-11-23 08:26:04.082925716 +0000 UTC m=+0.134213789 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com) Nov 23 03:26:04 localhost podman[84513]: 2025-11-23 08:26:04.213131244 +0000 UTC m=+0.264419357 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:26:04 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:26:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:26:05 localhost podman[84606]: 2025-11-23 08:26:05.012486638 +0000 UTC m=+0.073859656 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com) Nov 23 03:26:05 localhost podman[84606]: 2025-11-23 08:26:05.408428064 +0000 UTC m=+0.469801042 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:26:05 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:26:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:26:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. 
Nov 23 03:26:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:26:07 localhost systemd[1]: tmp-crun.m6kbau.mount: Deactivated successfully. Nov 23 03:26:07 localhost podman[84630]: 2025-11-23 08:26:07.009126712 +0000 UTC m=+0.070108175 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12) Nov 23 03:26:07 localhost podman[84630]: 2025-11-23 08:26:07.053151707 +0000 UTC m=+0.114133150 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step4, container_name=ovn_metadata_agent, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 23 03:26:07 localhost systemd[1]: tmp-crun.6eHnDr.mount: Deactivated successfully. Nov 23 03:26:07 localhost podman[84631]: 2025-11-23 08:26:07.065114614 +0000 UTC m=+0.120001575 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, container_name=ovn_controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller) Nov 23 03:26:07 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:26:07 localhost podman[84631]: 2025-11-23 08:26:07.11149826 +0000 UTC m=+0.166385031 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, version=17.1.12, architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=ovn_controller) Nov 23 03:26:07 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:26:07 localhost podman[84629]: 2025-11-23 08:26:07.115442547 +0000 UTC m=+0.177588944 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:26:07 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0. Nov 23 03:26:07 localhost podman[84629]: 2025-11-23 08:26:07.304282467 +0000 UTC m=+0.366428864 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, vendor=Red 
Hat, Inc., config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible) Nov 23 03:26:07 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:26:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:26:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:26:32 localhost podman[84783]: 2025-11-23 08:26:32.043296088 +0000 UTC m=+0.061840007 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, 
com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, vcs-type=git, container_name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:26:32 localhost podman[84784]: 2025-11-23 08:26:32.100300182 +0000 UTC m=+0.117139610 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vcs-type=git, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vendor=Red Hat, Inc., container_name=iscsid, batch=17.1_20251118.1) Nov 23 03:26:32 localhost podman[84784]: 2025-11-23 08:26:32.112190026 +0000 UTC m=+0.129029464 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, container_name=iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible) 
Nov 23 03:26:32 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:26:32 localhost podman[84783]: 2025-11-23 08:26:32.132177713 +0000 UTC m=+0.150721712 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-collectd-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4) Nov 23 03:26:32 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:26:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:26:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:26:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:26:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:26:35 localhost podman[84825]: 2025-11-23 08:26:35.022602051 +0000 UTC m=+0.073892178 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:26:35 localhost podman[84823]: 2025-11-23 08:26:35.003172481 +0000 UTC m=+0.061055635 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:26:35 localhost podman[84824]: 2025-11-23 08:26:35.064255485 +0000 UTC m=+0.118101368 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64) Nov 23 03:26:35 localhost podman[84825]: 2025-11-23 08:26:35.067870053 +0000 UTC m=+0.119160220 container 
exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true) Nov 23 03:26:35 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:26:35 localhost podman[84831]: 2025-11-23 08:26:35.115789864 +0000 UTC m=+0.162855555 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, vcs-type=git, container_name=nova_compute, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 
nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:26:35 localhost podman[84824]: 2025-11-23 08:26:35.135361398 +0000 UTC m=+0.189207281 container exec_died 
6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 
17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:26:35 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:26:35 localhost podman[84831]: 2025-11-23 08:26:35.167439727 +0000 UTC m=+0.214505398 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z) Nov 23 03:26:35 
localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:26:35 localhost podman[84823]: 2025-11-23 08:26:35.186477885 +0000 UTC m=+0.244361089 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible) Nov 23 03:26:35 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:26:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:26:36 localhost systemd[1]: tmp-crun.lUHERN.mount: Deactivated successfully. 
Nov 23 03:26:36 localhost podman[84924]: 2025-11-23 08:26:36.020217167 +0000 UTC m=+0.076771824 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, version=17.1.12, container_name=nova_migration_target, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4) Nov 23 03:26:36 localhost podman[84924]: 2025-11-23 08:26:36.41440729 +0000 UTC m=+0.470961937 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=) Nov 23 03:26:36 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:26:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:26:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:26:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:26:38 localhost systemd[1]: tmp-crun.rZVE0d.mount: Deactivated successfully. 
Nov 23 03:26:38 localhost podman[84947]: 2025-11-23 08:26:38.010627475 +0000 UTC m=+0.072355793 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, container_name=metrics_qdr, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 23 03:26:38 localhost podman[84949]: 2025-11-23 08:26:38.019775928 +0000 UTC m=+0.071425725 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller) Nov 23 03:26:38 localhost podman[84949]: 2025-11-23 08:26:38.094627954 +0000 UTC m=+0.146277731 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, tcib_managed=true) Nov 23 03:26:38 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:26:38 localhost podman[84948]: 2025-11-23 08:26:38.104715124 +0000 UTC m=+0.163466153 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public) Nov 23 03:26:38 localhost podman[84948]: 2025-11-23 08:26:38.137184964 +0000 UTC m=+0.195935933 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Nov 23 03:26:38 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:26:38 localhost podman[84947]: 2025-11-23 08:26:38.211208245 +0000 UTC m=+0.272936633 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step1, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:26:38 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:26:46 localhost sshd[85021]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:27:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:27:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:27:03 localhost podman[85070]: 2025-11-23 08:27:03.023039818 +0000 UTC m=+0.075679418 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, version=17.1.12) Nov 23 03:27:03 localhost podman[85069]: 2025-11-23 08:27:03.090962858 +0000 UTC m=+0.143029780 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, container_name=collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:27:03 localhost podman[85069]: 2025-11-23 08:27:03.103508267 +0000 UTC m=+0.155575199 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:27:03 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:27:03 localhost podman[85070]: 2025-11-23 08:27:03.158223737 +0000 UTC m=+0.210863337 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, container_name=iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:27:03 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:27:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:27:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:27:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:27:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:27:06 localhost systemd[1]: tmp-crun.NkWxa4.mount: Deactivated successfully. 
Nov 23 03:27:06 localhost podman[85109]: 2025-11-23 08:27:06.035224782 +0000 UTC m=+0.090651294 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, url=https://www.redhat.com, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, vcs-type=git) Nov 23 03:27:06 localhost podman[85116]: 2025-11-23 08:27:06.046184011 +0000 UTC m=+0.091091408 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_compute, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, architecture=x86_64) Nov 23 03:27:06 localhost podman[85116]: 2025-11-23 08:27:06.071348351 +0000 UTC m=+0.116255798 container exec_died 
e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=) Nov 23 03:27:06 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:27:06 localhost podman[85111]: 2025-11-23 08:27:06.088112904 +0000 UTC m=+0.136860244 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044) Nov 23 03:27:06 localhost podman[85110]: 2025-11-23 08:27:06.125164403 +0000 UTC m=+0.176971680 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:27:06 localhost podman[85111]: 2025-11-23 08:27:06.140221762 +0000 UTC m=+0.188969112 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, container_name=ceilometer_agent_ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com) Nov 
23 03:27:06 localhost podman[85109]: 2025-11-23 08:27:06.14863013 +0000 UTC m=+0.204056592 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Nov 23 03:27:06 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:27:06 localhost podman[85110]: 2025-11-23 08:27:06.156326805 +0000 UTC m=+0.208134052 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container) Nov 23 03:27:06 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:27:06 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:27:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:27:07 localhost podman[85200]: 2025-11-23 08:27:07.01829668 +0000 UTC m=+0.079839860 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., container_name=nova_migration_target, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:27:07 localhost podman[85200]: 2025-11-23 08:27:07.387261525 +0000 UTC m=+0.448804605 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4) Nov 23 03:27:07 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:27:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:27:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:27:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:27:09 localhost podman[85222]: 2025-11-23 08:27:09.033163165 +0000 UTC m=+0.087453873 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, container_name=metrics_qdr, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:27:09 localhost podman[85223]: 2025-11-23 08:27:09.078672653 +0000 UTC m=+0.129702017 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:27:09 localhost podman[85223]: 2025-11-23 08:27:09.122215947 +0000 UTC m=+0.173245331 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
architecture=x86_64, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4) Nov 23 03:27:09 localhost podman[85224]: 2025-11-23 08:27:09.132132003 +0000 UTC m=+0.180581855 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:27:09 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:27:09 localhost podman[85224]: 2025-11-23 08:27:09.178157627 +0000 UTC m=+0.226607449 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, name=rhosp17/openstack-ovn-controller, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64) Nov 23 03:27:09 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:27:09 localhost podman[85222]: 2025-11-23 08:27:09.235221252 +0000 UTC m=+0.289511900 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team) Nov 23 03:27:09 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:27:26 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:27:26 localhost recover_tripleo_nova_virtqemud[85300]: 61756 Nov 23 03:27:26 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:27:26 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:27:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:27:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:27:34 localhost systemd[1]: tmp-crun.FD9ryO.mount: Deactivated successfully. 
Nov 23 03:27:34 localhost podman[85378]: 2025-11-23 08:27:34.044776944 +0000 UTC m=+0.096086138 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container) Nov 23 03:27:34 localhost podman[85379]: 2025-11-23 08:27:34.086171501 +0000 UTC m=+0.135420379 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, build-date=2025-11-18T23:44:13Z, release=1761123044, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, 
io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, container_name=iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true) Nov 23 03:27:34 localhost podman[85379]: 2025-11-23 08:27:34.096878571 +0000 UTC m=+0.146127429 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat 
OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3) Nov 23 
03:27:34 localhost podman[85378]: 2025-11-23 08:27:34.105853877 +0000 UTC m=+0.157163081 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container) Nov 23 03:27:34 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:27:34 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:27:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:27:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:27:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:27:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:27:37 localhost podman[85418]: 2025-11-23 08:27:37.041348971 +0000 UTC m=+0.081756871 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4) Nov 23 03:27:37 localhost podman[85418]: 2025-11-23 08:27:37.076593793 +0000 UTC m=+0.117001713 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4) Nov 23 03:27:37 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:27:37 localhost podman[85417]: 2025-11-23 08:27:37.100011217 +0000 UTC m=+0.145351834 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ceilometer_agent_compute) Nov 23 03:27:37 localhost podman[85417]: 2025-11-23 08:27:37.141752945 +0000 UTC m=+0.187093592 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.openshift.expose-services=, 
url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:27:37 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:27:37 localhost podman[85416]: 2025-11-23 08:27:37.160199672 +0000 UTC m=+0.208574866 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, release=1761123044) Nov 23 03:27:37 localhost podman[85416]: 2025-11-23 08:27:37.179155005 +0000 UTC m=+0.227530199 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true) Nov 23 03:27:37 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:27:37 localhost podman[85422]: 2025-11-23 08:27:37.25132253 +0000 UTC m=+0.289610252 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:27:37 localhost podman[85422]: 2025-11-23 08:27:37.284290658 +0000 UTC m=+0.322578400 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 
17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc.) Nov 23 03:27:37 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:27:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:27:38 localhost podman[85515]: 2025-11-23 08:27:38.018557222 +0000 UTC m=+0.076476833 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, version=17.1.12, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:27:38 localhost podman[85515]: 2025-11-23 08:27:38.417324776 +0000 UTC m=+0.475244327 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, container_name=nova_migration_target, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1) Nov 23 03:27:38 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:27:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:27:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:27:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:27:40 localhost systemd[1]: tmp-crun.kqyZSe.mount: Deactivated successfully. Nov 23 03:27:40 localhost podman[85540]: 2025-11-23 08:27:40.047594428 +0000 UTC m=+0.088195627 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:27:40 localhost podman[85538]: 2025-11-23 08:27:40.085953967 +0000 UTC m=+0.132650359 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, tcib_managed=true, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:27:40 localhost podman[85540]: 2025-11-23 08:27:40.099291462 +0000 UTC m=+0.139892631 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, tcib_managed=true, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12) Nov 23 03:27:40 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:27:40 localhost podman[85539]: 2025-11-23 08:27:40.143362184 +0000 UTC m=+0.184947014 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=ovn_metadata_agent, release=1761123044, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:27:40 localhost podman[85539]: 2025-11-23 08:27:40.223229533 +0000 UTC m=+0.264814363 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, 
vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044) Nov 23 03:27:40 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:27:40 localhost podman[85538]: 2025-11-23 08:27:40.300551593 +0000 UTC m=+0.347247965 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible) Nov 23 03:27:40 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:27:59 localhost sshd[85660]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:28:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:28:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:28:05 localhost podman[85662]: 2025-11-23 08:28:05.025832278 +0000 UTC m=+0.084222269 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1) Nov 23 03:28:05 localhost podman[85662]: 2025-11-23 08:28:05.035452774 +0000 UTC m=+0.093842785 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:28:05 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:28:05 localhost systemd[1]: tmp-crun.PSRpGX.mount: Deactivated successfully. 
Nov 23 03:28:05 localhost podman[85663]: 2025-11-23 08:28:05.141098024 +0000 UTC m=+0.197543143 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step3, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=) Nov 23 03:28:05 localhost podman[85663]: 2025-11-23 08:28:05.152259019 +0000 UTC m=+0.208704098 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container) Nov 23 03:28:05 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:28:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:28:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:28:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:28:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:28:08 localhost podman[85700]: 2025-11-23 08:28:08.030707751 +0000 UTC m=+0.080963617 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=logrotate_crond, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:28:08 localhost podman[85700]: 2025-11-23 08:28:08.043424905 +0000 UTC m=+0.093680751 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12) Nov 23 03:28:08 localhost podman[85706]: 2025-11-23 08:28:08.090303546 +0000 UTC m=+0.132162704 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, container_name=nova_compute, name=rhosp17/openstack-nova-compute, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container) Nov 23 03:28:08 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:28:08 localhost podman[85706]: 2025-11-23 08:28:08.122176719 +0000 UTC m=+0.164035877 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12) Nov 23 03:28:08 localhost podman[85701]: 2025-11-23 08:28:08.073608085 +0000 UTC m=+0.126156764 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git) Nov 23 03:28:08 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:28:08 localhost podman[85702]: 2025-11-23 08:28:08.190053048 +0000 UTC m=+0.235516922 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team) Nov 23 03:28:08 localhost podman[85701]: 2025-11-23 08:28:08.209077573 +0000 UTC m=+0.261626252 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, release=1761123044, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.buildah.version=1.41.4, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container) Nov 23 03:28:08 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:28:08 localhost podman[85702]: 2025-11-23 08:28:08.228209612 +0000 UTC m=+0.273673476 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, architecture=x86_64, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4) Nov 23 03:28:08 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:28:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:28:09 localhost podman[85795]: 2025-11-23 08:28:09.027091091 +0000 UTC m=+0.084670434 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, release=1761123044, vcs-type=git, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible) Nov 23 03:28:09 localhost podman[85795]: 2025-11-23 08:28:09.39334484 +0000 UTC m=+0.450924233 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, container_name=nova_migration_target, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com) Nov 23 03:28:09 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:28:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:28:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:28:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:28:11 localhost systemd[1]: tmp-crun.ftwYuc.mount: Deactivated successfully. 
Nov 23 03:28:11 localhost podman[85819]: 2025-11-23 08:28:11.028133665 +0000 UTC m=+0.081973858 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1) Nov 23 03:28:11 localhost systemd[1]: tmp-crun.tyO4QI.mount: Deactivated successfully. 
Nov 23 03:28:11 localhost podman[85820]: 2025-11-23 08:28:11.079426247 +0000 UTC m=+0.128238070 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, release=1761123044, vcs-type=git) Nov 23 03:28:11 localhost podman[85818]: 2025-11-23 08:28:11.132063562 +0000 UTC m=+0.184948214 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:28:11 localhost podman[85819]: 2025-11-23 08:28:11.153612026 +0000 UTC m=+0.207452339 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=ovn_metadata_agent, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, release=1761123044, architecture=x86_64) Nov 23 
03:28:11 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:28:11 localhost podman[85820]: 2025-11-23 08:28:11.184586641 +0000 UTC m=+0.233398524 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:34:05Z) Nov 23 03:28:11 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:28:11 localhost podman[85818]: 2025-11-23 08:28:11.333341763 +0000 UTC m=+0.386226445 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:28:11 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:28:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:28:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:28:36 localhost systemd[1]: tmp-crun.57wHdX.mount: Deactivated successfully. 
Nov 23 03:28:36 localhost podman[85971]: 2025-11-23 08:28:36.052960686 +0000 UTC m=+0.103904865 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, architecture=x86_64, 
distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:28:36 localhost podman[85972]: 2025-11-23 08:28:36.093718853 +0000 UTC m=+0.142954078 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044) Nov 23 03:28:36 localhost podman[85971]: 2025-11-23 08:28:36.115690162 +0000 UTC m=+0.166634391 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, container_name=collectd, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red 
Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git) Nov 23 03:28:36 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:28:36 localhost podman[85972]: 2025-11-23 08:28:36.129474451 +0000 UTC m=+0.178709716 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, tcib_managed=true, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 23 03:28:36 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:28:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:28:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:28:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:28:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:28:39 localhost systemd[1]: tmp-crun.kcDgjR.mount: Deactivated successfully. 
Nov 23 03:28:39 localhost podman[86010]: 2025-11-23 08:28:39.021475492 +0000 UTC m=+0.078728345 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, name=rhosp17/openstack-cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:28:39 localhost podman[86015]: 2025-11-23 08:28:39.04120064 +0000 UTC m=+0.085984076 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public) Nov 23 03:28:39 localhost systemd[1]: tmp-crun.i2TqBp.mount: Deactivated successfully. 
Nov 23 03:28:39 localhost podman[86018]: 2025-11-23 08:28:39.086405807 +0000 UTC m=+0.132220656 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Nov 23 03:28:39 localhost podman[86011]: 2025-11-23 08:28:39.099538255 +0000 UTC m=+0.147255105 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, 
tcib_managed=true, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) 
Nov 23 03:28:39 localhost podman[86018]: 2025-11-23 08:28:39.109184732 +0000 UTC m=+0.154999551 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-nova-compute, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Nov 23 03:28:39 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:28:39 localhost podman[86015]: 2025-11-23 08:28:39.128214737 +0000 UTC m=+0.172998193 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, build-date=2025-11-19T00:12:45Z, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:28:39 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:28:39 localhost podman[86011]: 2025-11-23 08:28:39.151226269 +0000 UTC m=+0.198943129 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12) Nov 23 03:28:39 localhost podman[86010]: 2025-11-23 08:28:39.163360645 +0000 UTC m=+0.220613508 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, container_name=logrotate_crond, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:28:39 localhost 
systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:28:39 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:28:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:28:40 localhost podman[86105]: 2025-11-23 08:28:40.031529508 +0000 UTC m=+0.087775173 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Nov 23 03:28:40 localhost podman[86105]: 2025-11-23 08:28:40.404332956 +0000 UTC m=+0.460578601 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 23 03:28:40 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:28:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:28:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:28:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:28:42 localhost systemd[1]: tmp-crun.uzmJtn.mount: Deactivated successfully. Nov 23 03:28:42 localhost podman[86129]: 2025-11-23 08:28:42.027059618 +0000 UTC m=+0.080842573 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible) Nov 23 03:28:42 localhost podman[86128]: 2025-11-23 08:28:42.075359024 +0000 UTC m=+0.130552053 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, release=1761123044, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, distribution-scope=public, 
description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:28:42 localhost podman[86129]: 2025-11-23 08:28:42.095142673 +0000 UTC m=+0.148925638 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:14:25Z, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 23 03:28:42 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:28:42 localhost podman[86130]: 2025-11-23 08:28:42.180686184 +0000 UTC m=+0.228574181 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, version=17.1.12, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 23 03:28:42 localhost podman[86130]: 2025-11-23 08:28:42.195690921 +0000 UTC m=+0.243578888 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, tcib_managed=true, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:28:42 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:28:42 localhost podman[86128]: 2025-11-23 08:28:42.319969424 +0000 UTC m=+0.375162393 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, config_id=tripleo_step1, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:28:42 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:29:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:29:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:29:06 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:29:06 localhost recover_tripleo_nova_virtqemud[86262]: 61756 Nov 23 03:29:06 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:29:06 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 23 03:29:07 localhost systemd[1]: tmp-crun.Z8Ddif.mount: Deactivated successfully. Nov 23 03:29:07 localhost podman[86249]: 2025-11-23 08:29:07.02353872 +0000 UTC m=+0.079554222 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64) Nov 23 03:29:07 localhost podman[86249]: 2025-11-23 08:29:07.036205082 +0000 UTC m=+0.092220564 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, version=17.1.12, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64) Nov 23 03:29:07 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:29:07 localhost podman[86250]: 2025-11-23 08:29:07.120331358 +0000 UTC m=+0.172491357 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, release=1761123044, name=rhosp17/openstack-iscsid, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, io.openshift.expose-services=) Nov 23 03:29:07 localhost podman[86250]: 2025-11-23 08:29:07.13013808 +0000 UTC m=+0.182298029 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=iscsid, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12) Nov 23 03:29:07 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:29:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:29:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:29:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:29:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:29:10 localhost systemd[1]: tmp-crun.SW359e.mount: Deactivated successfully. 
Nov 23 03:29:10 localhost podman[86291]: 2025-11-23 08:29:10.04972829 +0000 UTC m=+0.104090982 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, release=1761123044, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:29:10 localhost systemd[1]: tmp-crun.J5vcdh.mount: Deactivated successfully. Nov 23 03:29:10 localhost podman[86291]: 2025-11-23 08:29:10.134430454 +0000 UTC m=+0.188793186 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., distribution-scope=public, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 23 03:29:10 localhost podman[86294]: 2025-11-23 08:29:10.083449522 +0000 UTC m=+0.129669785 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12) Nov 23 03:29:10 localhost podman[86292]: 2025-11-23 08:29:10.145510786 +0000 UTC m=+0.197398009 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, release=1761123044, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:29:10 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:29:10 localhost podman[86293]: 2025-11-23 08:29:10.112782336 +0000 UTC m=+0.161429486 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public) Nov 23 03:29:10 localhost podman[86292]: 2025-11-23 08:29:10.182673539 +0000 UTC m=+0.234560772 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, version=17.1.12, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:29:10 localhost podman[86293]: 2025-11-23 08:29:10.192572853 +0000 UTC m=+0.241219973 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4) Nov 23 03:29:10 
localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:29:10 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:29:10 localhost podman[86294]: 2025-11-23 08:29:10.265119631 +0000 UTC m=+0.311339974 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Nov 23 03:29:10 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:29:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:29:11 localhost podman[86386]: 2025-11-23 08:29:11.063291567 +0000 UTC m=+0.083974202 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z) Nov 23 03:29:11 localhost podman[86386]: 2025-11-23 08:29:11.434432181 +0000 UTC m=+0.455114836 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:29:11 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:29:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:29:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:29:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:29:13 localhost podman[86409]: 2025-11-23 08:29:13.018441132 +0000 UTC m=+0.071495375 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:29:13 localhost podman[86407]: 2025-11-23 08:29:13.070799268 +0000 UTC m=+0.127268440 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, 
batch=17.1_20251118.1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:29:13 localhost podman[86409]: 2025-11-23 08:29:13.091185576 +0000 UTC m=+0.144239879 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-ovn-controller, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 23 03:29:13 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:29:13 localhost podman[86408]: 2025-11-23 08:29:13.167328047 +0000 UTC m=+0.220368280 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044) Nov 23 03:29:13 localhost podman[86408]: 2025-11-23 08:29:13.203154037 +0000 UTC m=+0.256194260 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., 
url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 23 03:29:13 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:29:13 localhost podman[86407]: 2025-11-23 08:29:13.284399311 +0000 UTC m=+0.340868513 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:29:13 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:29:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:29:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:29:38 localhost podman[86558]: 2025-11-23 08:29:38.034753402 +0000 UTC m=+0.087890407 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, io.openshift.expose-services=, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container) Nov 23 03:29:38 localhost podman[86558]: 2025-11-23 08:29:38.044989207 +0000 UTC m=+0.098126222 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, name=rhosp17/openstack-collectd, version=17.1.12, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, architecture=x86_64) Nov 23 03:29:38 localhost systemd[1]: tmp-crun.OcE2eg.mount: Deactivated successfully. 
Nov 23 03:29:38 localhost podman[86559]: 2025-11-23 08:29:38.087148408 +0000 UTC m=+0.138216327 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:29:38 localhost podman[86559]: 2025-11-23 08:29:38.096257119 +0000 UTC m=+0.147324978 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc.) Nov 23 03:29:38 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:29:38 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:29:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:29:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:29:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. 
Nov 23 03:29:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:29:41 localhost systemd[1]: tmp-crun.ICvGG9.mount: Deactivated successfully. Nov 23 03:29:41 localhost podman[86599]: 2025-11-23 08:29:41.03224067 +0000 UTC m=+0.084135217 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com) Nov 23 03:29:41 localhost podman[86598]: 2025-11-23 08:29:41.075698902 +0000 UTC m=+0.129498810 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, 
vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044) Nov 23 03:29:41 localhost podman[86599]: 2025-11-23 08:29:41.081218687 +0000 UTC m=+0.133113214 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, release=1761123044, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z) Nov 23 03:29:41 localhost podman[86598]: 2025-11-23 08:29:41.088994225 +0000 UTC m=+0.142794133 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, architecture=x86_64, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=) Nov 23 03:29:41 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:29:41 localhost podman[86606]: 2025-11-23 08:29:41.048063333 +0000 UTC m=+0.090439227 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute) Nov 23 03:29:41 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:29:41 localhost podman[86606]: 2025-11-23 08:29:41.132356924 +0000 UTC m=+0.174732858 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, version=17.1.12, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1) Nov 23 03:29:41 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:29:41 localhost podman[86600]: 2025-11-23 08:29:41.188955425 +0000 UTC m=+0.237889838 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com) Nov 23 03:29:41 localhost podman[86600]: 2025-11-23 08:29:41.218397131 +0000 UTC m=+0.267331624 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true) Nov 23 03:29:41 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:29:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:29:42 localhost systemd[1]: tmp-crun.WK1jwp.mount: Deactivated successfully. 
Nov 23 03:29:42 localhost podman[86693]: 2025-11-23 08:29:42.024497209 +0000 UTC m=+0.082257126 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, tcib_managed=true) Nov 23 03:29:42 localhost podman[86693]: 2025-11-23 08:29:42.405645672 +0000 UTC m=+0.463405599 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com) Nov 23 03:29:42 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:29:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:29:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:29:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:29:44 localhost podman[86713]: 2025-11-23 08:29:44.033804337 +0000 UTC m=+0.087677059 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, config_id=tripleo_step1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, maintainer=OpenStack TripleO Team) Nov 23 03:29:44 localhost systemd[1]: tmp-crun.BmG7sU.mount: Deactivated successfully. Nov 23 03:29:44 localhost podman[86714]: 2025-11-23 08:29:44.104167765 +0000 UTC m=+0.153770622 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=) Nov 23 03:29:44 localhost podman[86714]: 2025-11-23 08:29:44.149247278 +0000 UTC m=+0.198850005 container exec_died 
5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com) Nov 23 03:29:44 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:29:44 localhost podman[86715]: 2025-11-23 08:29:44.203954109 +0000 UTC m=+0.252476781 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, container_name=ovn_controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:29:44 localhost podman[86713]: 2025-11-23 08:29:44.225232096 +0000 UTC m=+0.279104818 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:29:44 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:29:44 localhost podman[86715]: 2025-11-23 08:29:44.279680477 +0000 UTC m=+0.328203189 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-type=git, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, 
vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z) Nov 23 03:29:44 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:29:45 localhost systemd[1]: tmp-crun.o6A1LZ.mount: Deactivated successfully. Nov 23 03:30:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:30:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:30:09 localhost podman[86833]: 2025-11-23 08:30:09.024827212 +0000 UTC m=+0.080684887 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, container_name=collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:30:09 localhost podman[86833]: 2025-11-23 08:30:09.035195302 +0000 UTC m=+0.091052947 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4) Nov 23 03:30:09 localhost podman[86834]: 2025-11-23 08:30:08.992044899 +0000 UTC m=+0.052871291 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1761123044, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, version=17.1.12) Nov 23 03:30:09 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:30:09 localhost podman[86834]: 2025-11-23 08:30:09.074228804 +0000 UTC m=+0.135055196 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z) Nov 23 03:30:09 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:30:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:30:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:30:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:30:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:30:12 localhost systemd[1]: tmp-crun.RhdQvq.mount: Deactivated successfully. 
Nov 23 03:30:12 localhost podman[86873]: 2025-11-23 08:30:12.029830808 +0000 UTC m=+0.078400194 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, 
tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:30:12 localhost podman[86883]: 2025-11-23 08:30:12.058030805 +0000 UTC m=+0.104936909 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, container_name=nova_compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:30:12 localhost podman[86883]: 2025-11-23 08:30:12.078170766 +0000 UTC m=+0.125076880 container 
exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, config_id=tripleo_step5, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute) Nov 23 03:30:12 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:30:12 localhost podman[86871]: 2025-11-23 08:30:12.009640506 +0000 UTC m=+0.067299731 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.expose-services=, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, version=17.1.12, vcs-type=git) Nov 23 03:30:12 localhost podman[86873]: 2025-11-23 08:30:12.128642441 +0000 UTC m=+0.177211807 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true) Nov 23 03:30:12 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:30:12 localhost podman[86871]: 2025-11-23 08:30:12.138277168 +0000 UTC m=+0.195936363 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-cron-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4) Nov 23 03:30:12 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:30:12 localhost podman[86872]: 2025-11-23 08:30:12.201372064 +0000 UTC m=+0.255790726 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, tcib_managed=true, vcs-type=git) Nov 23 03:30:12 localhost podman[86872]: 2025-11-23 08:30:12.224170969 +0000 UTC m=+0.278589651 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
container_name=ceilometer_agent_compute, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container) Nov 23 03:30:12 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:30:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:30:13 localhost podman[86968]: 2025-11-23 08:30:13.017416509 +0000 UTC m=+0.078870430 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible) Nov 23 03:30:13 localhost podman[86968]: 2025-11-23 08:30:13.362358 +0000 UTC m=+0.423811911 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:36:58Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target) Nov 23 03:30:13 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:30:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:30:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:30:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:30:15 localhost systemd[1]: tmp-crun.4vfuL6.mount: Deactivated successfully. Nov 23 03:30:15 localhost systemd[1]: tmp-crun.U3xnrA.mount: Deactivated successfully. Nov 23 03:30:15 localhost podman[86989]: 2025-11-23 08:30:15.082205661 +0000 UTC m=+0.139583770 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12) Nov 23 03:30:15 localhost podman[86991]: 2025-11-23 08:30:15.129280648 +0000 UTC m=+0.180552433 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': 
{'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, release=1761123044, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:30:15 localhost podman[86990]: 2025-11-23 08:30:15.047952841 +0000 UTC m=+0.102691636 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ovn_metadata_agent, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 23 03:30:15 localhost podman[86991]: 2025-11-23 08:30:15.177362707 +0000 UTC m=+0.228634492 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044) Nov 23 03:30:15 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:30:15 localhost podman[86990]: 2025-11-23 08:30:15.22929927 +0000 UTC m=+0.284038055 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:30:15 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:30:15 localhost podman[86989]: 2025-11-23 08:30:15.28686064 +0000 UTC m=+0.344238669 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:30:15 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:30:36 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:30:36 localhost recover_tripleo_nova_virtqemud[87143]: 61756 Nov 23 03:30:36 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:30:36 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:30:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:30:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:30:40 localhost systemd[1]: tmp-crun.4E3HNn.mount: Deactivated successfully. 
Nov 23 03:30:40 localhost podman[87145]: 2025-11-23 08:30:40.083287495 +0000 UTC m=+0.137964038 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, version=17.1.12, vcs-type=git, container_name=iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:30:40 localhost podman[87145]: 2025-11-23 08:30:40.097314942 +0000 UTC m=+0.151991555 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc.) Nov 23 03:30:40 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:30:40 localhost podman[87144]: 2025-11-23 08:30:40.084349699 +0000 UTC m=+0.138648631 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_id=tripleo_step3, distribution-scope=public, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, managed_by=tripleo_ansible, vcs-type=git) Nov 23 03:30:40 localhost podman[87144]: 2025-11-23 08:30:40.164250681 +0000 UTC m=+0.218549653 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 23 03:30:40 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:30:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:30:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:30:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:30:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:30:43 localhost systemd[84232]: Created slice User Background Tasks Slice. Nov 23 03:30:43 localhost systemd[84232]: Starting Cleanup of User's Temporary Files and Directories... Nov 23 03:30:43 localhost systemd[1]: tmp-crun.wBmeyO.mount: Deactivated successfully. Nov 23 03:30:43 localhost systemd[84232]: Finished Cleanup of User's Temporary Files and Directories. Nov 23 03:30:43 localhost podman[87183]: 2025-11-23 08:30:43.017131478 +0000 UTC m=+0.073467747 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=logrotate_crond, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:30:43 localhost podman[87185]: 2025-11-23 08:30:43.046870994 +0000 UTC m=+0.099250078 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi) Nov 23 
03:30:43 localhost podman[87183]: 2025-11-23 08:30:43.052216295 +0000 UTC m=+0.108552584 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, release=1761123044, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4) Nov 23 03:30:43 localhost podman[87184]: 2025-11-23 08:30:43.059163375 +0000 UTC m=+0.113460529 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, url=https://www.redhat.com) Nov 23 03:30:43 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:30:43 localhost podman[87185]: 2025-11-23 08:30:43.066571141 +0000 UTC m=+0.118950335 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 23 03:30:43 localhost podman[87191]: 2025-11-23 08:30:43.026828597 +0000 UTC m=+0.072137636 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, 
container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:30:43 localhost systemd[1]: 
d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:30:43 localhost podman[87184]: 2025-11-23 08:30:43.110153147 +0000 UTC m=+0.164450291 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=ceilometer_agent_compute) Nov 23 03:30:43 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:30:43 localhost podman[87191]: 2025-11-23 08:30:43.160515738 +0000 UTC m=+0.205824787 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, vcs-type=git, config_id=tripleo_step5, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 
nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:30:43 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:30:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:30:44 localhost systemd[1]: tmp-crun.qXnwkn.mount: Deactivated successfully. Nov 23 03:30:44 localhost podman[87278]: 2025-11-23 08:30:44.035627003 +0000 UTC m=+0.093331090 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_migration_target, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container) Nov 23 03:30:44 localhost podman[87278]: 2025-11-23 08:30:44.429455528 +0000 UTC m=+0.487159605 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1) Nov 23 03:30:44 localhost systemd[1]: 
e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:30:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:30:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:30:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:30:46 localhost podman[87302]: 2025-11-23 08:30:46.010159335 +0000 UTC m=+0.069492992 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4) Nov 23 03:30:46 localhost systemd[1]: tmp-crun.I7CIu0.mount: Deactivated successfully. 
Nov 23 03:30:46 localhost podman[87303]: 2025-11-23 08:30:46.06757929 +0000 UTC m=+0.124428028 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:30:46 localhost podman[87301]: 2025-11-23 08:30:46.0817032 +0000 UTC m=+0.138518187 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, config_id=tripleo_step1, 
io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044) Nov 23 03:30:46 localhost podman[87302]: 2025-11-23 08:30:46.089906731 +0000 UTC m=+0.149240408 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12) Nov 23 03:30:46 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:30:46 localhost podman[87303]: 2025-11-23 08:30:46.109346389 +0000 UTC m=+0.166195187 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, distribution-scope=public, container_name=ovn_controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc.) Nov 23 03:30:46 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:30:46 localhost podman[87301]: 2025-11-23 08:30:46.251829291 +0000 UTC m=+0.308644358 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:30:46 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:31:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:31:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:31:11 localhost podman[87419]: 2025-11-23 08:31:11.023794847 +0000 UTC m=+0.082745443 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible) Nov 23 03:31:11 localhost podman[87419]: 2025-11-23 08:31:11.063513515 +0000 UTC m=+0.122464111 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=collectd, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, version=17.1.12, vcs-type=git, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3) Nov 23 03:31:11 localhost podman[87420]: 2025-11-23 08:31:11.078004481 +0000 UTC m=+0.134371914 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, container_name=iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:31:11 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:31:11 localhost podman[87420]: 2025-11-23 08:31:11.090388249 +0000 UTC m=+0.146755672 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step3, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, maintainer=OpenStack TripleO Team) Nov 23 03:31:11 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:31:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:31:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:31:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:31:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:31:14 localhost systemd[1]: tmp-crun.lYiY8x.mount: Deactivated successfully. Nov 23 03:31:14 localhost systemd[1]: tmp-crun.hP44WH.mount: Deactivated successfully. 
Nov 23 03:31:14 localhost podman[87459]: 2025-11-23 08:31:14.010248854 +0000 UTC m=+0.060543509 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, tcib_managed=true, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, batch=17.1_20251118.1) Nov 23 03:31:14 localhost podman[87460]: 2025-11-23 08:31:14.070400249 +0000 UTC m=+0.120325582 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:31:14 localhost podman[87458]: 2025-11-23 08:31:14.038642758 +0000 UTC m=+0.085214903 container 
health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:31:14 localhost podman[87458]: 2025-11-23 08:31:14.120265703 +0000 UTC m=+0.166837838 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:31:14 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:31:14 localhost podman[87460]: 2025-11-23 08:31:14.141359641 +0000 UTC m=+0.191284984 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 23 03:31:14 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:31:14 localhost podman[87457]: 2025-11-23 08:31:14.124084126 +0000 UTC m=+0.176487739 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron) Nov 23 03:31:14 localhost podman[87459]: 2025-11-23 08:31:14.195360709 +0000 UTC m=+0.245655454 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4) Nov 23 03:31:14 localhost podman[87457]: 2025-11-23 08:31:14.208302425 +0000 UTC m=+0.260706048 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-18T22:49:32Z, tcib_managed=true, version=17.1.12) Nov 23 03:31:14 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:31:14 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:31:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:31:15 localhost podman[87552]: 2025-11-23 08:31:15.00896817 +0000 UTC m=+0.070514980 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, release=1761123044, version=17.1.12, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, container_name=nova_migration_target, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com) Nov 23 03:31:15 localhost podman[87552]: 2025-11-23 08:31:15.385927746 +0000 UTC m=+0.447474616 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, release=1761123044, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:31:15 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:31:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:31:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. 
Nov 23 03:31:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:31:17 localhost podman[87575]: 2025-11-23 08:31:17.021675334 +0000 UTC m=+0.079132427 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Nov 23 03:31:17 localhost podman[87577]: 2025-11-23 08:31:17.047106022 +0000 UTC m=+0.096121324 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., 
distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4) Nov 23 03:31:17 localhost podman[87577]: 2025-11-23 08:31:17.068228121 +0000 UTC m=+0.117243393 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=) Nov 23 03:31:17 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:31:17 localhost podman[87576]: 2025-11-23 08:31:17.134869395 +0000 UTC m=+0.186450679 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, release=1761123044, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 23 03:31:17 localhost podman[87576]: 2025-11-23 08:31:17.179255193 +0000 UTC m=+0.230836497 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:31:17 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:31:17 localhost podman[87575]: 2025-11-23 08:31:17.209161025 +0000 UTC m=+0.266618098 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z) Nov 23 03:31:17 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:31:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:31:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:31:42 localhost podman[87730]: 2025-11-23 08:31:42.033918684 +0000 UTC m=+0.085987348 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-collectd-container, container_name=collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:31:42 localhost podman[87730]: 2025-11-23 08:31:42.048449971 +0000 UTC m=+0.100518635 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, container_name=collectd, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:31:42 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:31:42 localhost systemd[1]: tmp-crun.0xRh27.mount: Deactivated successfully. 
Nov 23 03:31:42 localhost podman[87731]: 2025-11-23 08:31:42.165013871 +0000 UTC m=+0.215789233 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 23 03:31:42 localhost podman[87731]: 2025-11-23 08:31:42.203290772 +0000 UTC m=+0.254066104 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 23 03:31:42 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:31:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:31:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:31:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:31:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:31:45 localhost systemd[1]: tmp-crun.NWKL3x.mount: Deactivated successfully. Nov 23 03:31:45 localhost podman[87768]: 2025-11-23 08:31:45.009199391 +0000 UTC m=+0.062343876 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, name=rhosp17/openstack-cron, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 
17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:31:45 localhost podman[87770]: 2025-11-23 08:31:45.05922843 +0000 UTC m=+0.107276301 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, 
architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc.) 
Nov 23 03:31:45 localhost podman[87768]: 2025-11-23 08:31:45.092676736 +0000 UTC m=+0.145821271 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible) Nov 23 03:31:45 localhost podman[87770]: 2025-11-23 08:31:45.100380374 +0000 UTC m=+0.148428165 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git) Nov 23 03:31:45 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:31:45 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:31:45 localhost podman[87776]: 2025-11-23 08:31:45.141869309 +0000 UTC m=+0.183298968 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:31:45 localhost podman[87769]: 2025-11-23 08:31:45.168358791 +0000 UTC m=+0.219262284 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, container_name=ceilometer_agent_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 23 03:31:45 localhost podman[87769]: 2025-11-23 08:31:45.221243202 +0000 UTC m=+0.272146735 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12) Nov 23 03:31:45 localhost podman[87776]: 2025-11-23 08:31:45.227339988 +0000 UTC m=+0.268769607 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, version=17.1.12, name=rhosp17/openstack-nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Nov 
23 03:31:45 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:31:45 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:31:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:31:46 localhost systemd[1]: tmp-crun.dxy7YN.mount: Deactivated successfully. Nov 23 03:31:46 localhost podman[87869]: 2025-11-23 08:31:46.023325813 +0000 UTC m=+0.079398335 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:31:46 localhost podman[87869]: 2025-11-23 08:31:46.39629879 +0000 UTC m=+0.452371282 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, vcs-type=git, container_name=nova_migration_target, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com) Nov 23 03:31:46 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. 
Nov 23 03:31:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:31:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:31:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:31:48 localhost systemd[1]: tmp-crun.YIp3G2.mount: Deactivated successfully. Nov 23 03:31:48 localhost podman[87894]: 2025-11-23 08:31:48.043794827 +0000 UTC m=+0.082290378 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible) Nov 23 03:31:48 localhost podman[87894]: 2025-11-23 08:31:48.092355589 +0000 UTC m=+0.130851200 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:31:48 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:31:48 localhost podman[87892]: 2025-11-23 08:31:48.096162581 +0000 UTC m=+0.144656674 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, batch=17.1_20251118.1, container_name=metrics_qdr, version=17.1.12, 
maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:31:48 localhost podman[87893]: 2025-11-23 08:31:48.155663885 +0000 UTC m=+0.199069494 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Nov 23 03:31:48 localhost podman[87893]: 2025-11-23 08:31:48.191046364 +0000 UTC m=+0.234451963 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 23 03:31:48 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:31:48 localhost podman[87892]: 2025-11-23 08:31:48.267174312 +0000 UTC m=+0.315668355 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, tcib_managed=true, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 23 03:31:48 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:32:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 03:32:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 398 writes, 1428 keys, 398 commit groups, 1.0 writes per commit group, ingest: 1.75 MB, 0.00 MB/s#012Interval WAL: 398 writes, 167 syncs, 2.38 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 03:32:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 03:32:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 5736 writes, 25K keys, 5736 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 540 writes, 2288 keys, 540 commit groups, 1.0 writes per commit group, ingest: 2.77 MB, 0.00 MB/s#012Interval WAL: 540 writes, 176 syncs, 3.07 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 03:32:06 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:32:06 localhost recover_tripleo_nova_virtqemud[88011]: 61756 Nov 23 03:32:06 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:32:06 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 23 03:32:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:32:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:32:13 localhost podman[88013]: 2025-11-23 08:32:13.026027834 +0000 UTC m=+0.075663875 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true) Nov 23 03:32:13 localhost podman[88013]: 2025-11-23 08:32:13.062503577 +0000 UTC m=+0.112139648 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4) Nov 23 03:32:13 localhost podman[88012]: 2025-11-23 08:32:13.090098425 +0000 UTC m=+0.139709036 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, container_name=collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:32:13 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:32:13 localhost podman[88012]: 2025-11-23 08:32:13.12848061 +0000 UTC m=+0.178091201 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, config_id=tripleo_step3, io.openshift.expose-services=, container_name=collectd, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible) Nov 23 03:32:13 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:32:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:32:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:32:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. 
Nov 23 03:32:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:32:16 localhost systemd[1]: tmp-crun.fEoeEy.mount: Deactivated successfully. Nov 23 03:32:16 localhost podman[88055]: 2025-11-23 08:32:16.032856106 +0000 UTC m=+0.078904600 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, architecture=x86_64, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, tcib_managed=true) Nov 23 03:32:16 localhost podman[88054]: 2025-11-23 08:32:16.078308738 +0000 UTC m=+0.128228756 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., architecture=x86_64, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Nov 23 03:32:16 localhost podman[88055]: 2025-11-23 08:32:16.085292203 +0000 UTC m=+0.131340657 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:32:16 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:32:16 localhost podman[88053]: 2025-11-23 08:32:16.139957432 +0000 UTC m=+0.190447918 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:32:16 localhost podman[88054]: 2025-11-23 08:32:16.152187015 +0000 UTC m=+0.202107093 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 23 03:32:16 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:32:16 localhost podman[88053]: 2025-11-23 08:32:16.178309685 +0000 UTC m=+0.228800161 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z) Nov 23 03:32:16 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:32:16 localhost podman[88052]: 2025-11-23 08:32:16.237994115 +0000 UTC m=+0.290118214 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, 
build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:32:16 localhost podman[88052]: 2025-11-23 08:32:16.275363677 +0000 UTC m=+0.327487776 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com) Nov 23 03:32:16 localhost systemd[1]: 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:32:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:32:16 localhost podman[88152]: 2025-11-23 08:32:16.991468693 +0000 UTC m=+0.052872202 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 23 03:32:17 localhost podman[88152]: 2025-11-23 08:32:17.386256802 +0000 UTC m=+0.447660271 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, container_name=nova_migration_target, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible) Nov 23 03:32:17 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:32:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:32:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:32:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:32:19 localhost podman[88175]: 2025-11-23 08:32:19.017260597 +0000 UTC m=+0.074502418 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, container_name=metrics_qdr, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc.) Nov 23 03:32:19 localhost podman[88177]: 2025-11-23 08:32:19.063372239 +0000 UTC m=+0.112959984 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z) Nov 23 03:32:19 localhost systemd[1]: tmp-crun.15pflp.mount: Deactivated successfully. 
Nov 23 03:32:19 localhost podman[88176]: 2025-11-23 08:32:19.136548903 +0000 UTC m=+0.188391611 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, 
build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true) Nov 23 03:32:19 localhost podman[88177]: 2025-11-23 08:32:19.158646034 +0000 UTC m=+0.208233809 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=ovn_controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4) Nov 23 03:32:19 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:32:19 localhost podman[88176]: 2025-11-23 08:32:19.178328847 +0000 UTC m=+0.230171625 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, version=17.1.12) Nov 23 03:32:19 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:32:19 localhost podman[88175]: 2025-11-23 08:32:19.254428825 +0000 UTC m=+0.311670716 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, release=1761123044, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, config_id=tripleo_step1, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-qdrouterd-container) Nov 23 03:32:19 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:32:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:32:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:32:44 localhost systemd[1]: tmp-crun.AaS38S.mount: Deactivated successfully. 
Nov 23 03:32:44 localhost podman[88325]: 2025-11-23 08:32:44.021065743 +0000 UTC m=+0.080491600 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Nov 23 03:32:44 localhost podman[88325]: 2025-11-23 08:32:44.038365829 +0000 UTC m=+0.097791706 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid) Nov 23 03:32:44 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:32:44 localhost podman[88324]: 2025-11-23 08:32:44.040400455 +0000 UTC m=+0.097653112 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, container_name=collectd, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, tcib_managed=true) Nov 23 03:32:44 localhost podman[88324]: 2025-11-23 08:32:44.12479948 +0000 UTC m=+0.182052107 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, container_name=collectd, release=1761123044, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd) Nov 23 03:32:44 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:32:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:32:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:32:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:32:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:32:47 localhost systemd[1]: tmp-crun.9nhUUL.mount: Deactivated successfully. Nov 23 03:32:47 localhost systemd[1]: tmp-crun.IPBnPM.mount: Deactivated successfully. Nov 23 03:32:47 localhost podman[88365]: 2025-11-23 08:32:47.098164037 +0000 UTC m=+0.148674083 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:11:48Z, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:32:47 localhost podman[88365]: 2025-11-23 08:32:47.127280333 +0000 UTC m=+0.177790379 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, 
architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:32:47 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:32:47 localhost podman[88367]: 2025-11-23 08:32:47.06004222 +0000 UTC m=+0.101485755 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, tcib_managed=true, distribution-scope=public, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:32:47 localhost podman[88364]: 2025-11-23 08:32:47.131563692 +0000 UTC m=+0.182752311 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack 
Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=logrotate_crond, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.openshift.expose-services=, version=17.1.12, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack 
TripleO Team, build-date=2025-11-18T22:49:32Z) Nov 23 03:32:47 localhost podman[88366]: 2025-11-23 08:32:47.190818967 +0000 UTC m=+0.238717639 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, architecture=x86_64, 
tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:32:47 localhost podman[88364]: 2025-11-23 08:32:47.210454609 +0000 UTC m=+0.261643258 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, release=1761123044, config_id=tripleo_step4, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:32:47 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:32:47 localhost podman[88366]: 2025-11-23 08:32:47.239217984 +0000 UTC m=+0.287116646 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12) Nov 23 03:32:47 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:32:47 localhost podman[88367]: 2025-11-23 08:32:47.294363568 +0000 UTC m=+0.335807123 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 
nova-compute) Nov 23 03:32:47 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:32:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:32:48 localhost podman[88461]: 2025-11-23 08:32:48.019850615 +0000 UTC m=+0.077623988 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:32:48 localhost podman[88461]: 2025-11-23 08:32:48.397041528 +0000 UTC m=+0.454814941 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=nova_migration_target, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=) Nov 23 03:32:48 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:32:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:32:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. 
Nov 23 03:32:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:32:49 localhost podman[88483]: 2025-11-23 08:32:49.993051668 +0000 UTC m=+0.055087203 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:32:50 localhost systemd[1]: tmp-crun.X5V4nq.mount: Deactivated successfully. Nov 23 03:32:50 localhost podman[88484]: 2025-11-23 08:32:50.029976856 +0000 UTC m=+0.084098186 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:32:50 localhost podman[88485]: 
2025-11-23 08:32:50.090831264 +0000 UTC m=+0.139141348 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-18T23:34:05Z) Nov 23 03:32:50 localhost podman[88485]: 2025-11-23 08:32:50.114795605 +0000 UTC m=+0.163105779 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, tcib_managed=true, architecture=x86_64) Nov 23 03:32:50 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:32:50 localhost podman[88484]: 2025-11-23 08:32:50.14201488 +0000 UTC m=+0.196136200 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:32:50 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:32:50 localhost podman[88483]: 2025-11-23 08:32:50.207021721 +0000 UTC m=+0.269057246 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:32:50 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:32:50 localhost systemd[1]: tmp-crun.vgEYd0.mount: Deactivated successfully. Nov 23 03:33:08 localhost sshd[88602]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:33:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:33:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:33:15 localhost podman[88604]: 2025-11-23 08:33:15.032854667 +0000 UTC m=+0.087616419 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, container_name=collectd, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true) Nov 23 03:33:15 localhost podman[88604]: 2025-11-23 08:33:15.042018802 +0000 UTC m=+0.096780574 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, container_name=collectd, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 23 03:33:15 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:33:15 localhost systemd[1]: tmp-crun.qoVCcp.mount: Deactivated successfully. 
Nov 23 03:33:15 localhost podman[88605]: 2025-11-23 08:33:15.134080194 +0000 UTC m=+0.186350796 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, 
release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true) Nov 23 03:33:15 localhost podman[88605]: 2025-11-23 08:33:15.167591911 +0000 UTC m=+0.219862533 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid) Nov 23 03:33:15 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:33:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:33:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:33:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:33:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:33:18 localhost podman[88644]: 2025-11-23 08:33:18.017914729 +0000 UTC m=+0.071920365 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, version=17.1.12) Nov 23 03:33:18 localhost systemd[1]: tmp-crun.O1jLOz.mount: Deactivated successfully. Nov 23 03:33:18 localhost podman[88644]: 2025-11-23 08:33:18.073107965 +0000 UTC m=+0.127113581 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, 
konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, release=1761123044) Nov 23 03:33:18 localhost podman[88643]: 2025-11-23 08:33:18.080871614 +0000 UTC m=+0.136044707 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc.) Nov 23 03:33:18 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:33:18 localhost podman[88643]: 2025-11-23 08:33:18.110196168 +0000 UTC m=+0.165369261 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:33:18 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:33:18 localhost podman[88642]: 2025-11-23 08:33:18.129165908 +0000 UTC m=+0.186853632 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:33:18 localhost podman[88642]: 2025-11-23 08:33:18.134754128 +0000 UTC m=+0.192441812 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-cron, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4) Nov 23 03:33:18 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:33:18 localhost podman[88645]: 2025-11-23 08:33:18.188354112 +0000 UTC m=+0.238534894 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public) Nov 23 03:33:18 localhost podman[88645]: 2025-11-23 08:33:18.215269867 +0000 UTC m=+0.265450649 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, container_name=nova_compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5) Nov 23 03:33:18 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:33:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:33:19 localhost podman[88743]: 2025-11-23 08:33:19.084233931 +0000 UTC m=+0.062605756 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, name=rhosp17/openstack-nova-compute) Nov 23 03:33:19 localhost podman[88743]: 2025-11-23 08:33:19.469948308 +0000 UTC m=+0.448320093 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, 
description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, container_name=nova_migration_target, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1) Nov 23 03:33:19 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:33:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:33:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:33:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:33:21 localhost systemd[1]: tmp-crun.wSjpeH.mount: Deactivated successfully. Nov 23 03:33:21 localhost podman[88766]: 2025-11-23 08:33:21.009027786 +0000 UTC m=+0.070283131 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public) Nov 23 03:33:21 localhost podman[88767]: 2025-11-23 08:33:21.020520906 +0000 UTC m=+0.076465171 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
container_name=ovn_metadata_agent, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:33:21 localhost podman[88768]: 2025-11-23 08:33:21.087246792 +0000 UTC m=+0.140013104 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 
17.1_20251118.1, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4) Nov 23 03:33:21 localhost podman[88767]: 2025-11-23 08:33:21.10768501 +0000 UTC m=+0.163629305 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team) Nov 23 03:33:21 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:33:21 localhost podman[88768]: 2025-11-23 08:33:21.159435774 +0000 UTC m=+0.212202146 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, release=1761123044, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container) Nov 23 03:33:21 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:33:21 localhost podman[88766]: 2025-11-23 08:33:21.186292378 +0000 UTC m=+0.247547743 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:33:21 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:33:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:33:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:33:46 localhost podman[88971]: 2025-11-23 08:33:46.033672358 +0000 UTC m=+0.077960109 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, container_name=iscsid, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container) Nov 23 03:33:46 localhost podman[88971]: 2025-11-23 08:33:46.069047235 +0000 UTC m=+0.113334966 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-iscsid-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public) Nov 23 03:33:46 localhost podman[88970]: 2025-11-23 08:33:46.086841948 +0000 UTC m=+0.134225189 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, 
vendor=Red Hat, Inc.) Nov 23 03:33:46 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:33:46 localhost podman[88970]: 2025-11-23 08:33:46.094149012 +0000 UTC m=+0.141532253 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12) Nov 23 03:33:46 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:33:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:33:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:33:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:33:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:33:49 localhost systemd[1]: tmp-crun.vHSG9w.mount: Deactivated successfully. 
Nov 23 03:33:49 localhost podman[89017]: 2025-11-23 08:33:49.086734466 +0000 UTC m=+0.131643906 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=nova_compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, distribution-scope=public) Nov 23 03:33:49 localhost podman[89009]: 2025-11-23 08:33:49.053906941 +0000 UTC m=+0.111675574 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=logrotate_crond, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:33:49 localhost podman[89009]: 2025-11-23 08:33:49.137271412 +0000 UTC m=+0.195040045 container exec_died 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:33:49 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:33:49 localhost podman[89011]: 2025-11-23 08:33:49.138385238 +0000 UTC m=+0.188945740 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-type=git) Nov 23 03:33:49 localhost podman[89010]: 2025-11-23 08:33:49.194938387 +0000 UTC m=+0.246604044 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, distribution-scope=public, vcs-type=git) Nov 23 03:33:49 localhost podman[89017]: 2025-11-23 
08:33:49.213597168 +0000 UTC m=+0.258506608 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:33:49 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:33:49 localhost podman[89010]: 2025-11-23 08:33:49.269723773 +0000 UTC m=+0.321389420 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044) Nov 23 03:33:49 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:33:49 localhost podman[89011]: 2025-11-23 08:33:49.320955581 +0000 UTC m=+0.371516123 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z) Nov 23 03:33:49 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:33:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:33:50 localhost podman[89104]: 2025-11-23 08:33:50.023149009 +0000 UTC m=+0.075606553 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:33:50 localhost podman[89104]: 2025-11-23 08:33:50.407239064 +0000 UTC m=+0.459696568 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true) Nov 23 03:33:50 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:33:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:33:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:33:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:33:52 localhost systemd[1]: tmp-crun.KkkSE0.mount: Deactivated successfully. Nov 23 03:33:52 localhost podman[89127]: 2025-11-23 08:33:52.027010038 +0000 UTC m=+0.086855385 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, tcib_managed=true, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible) Nov 23 03:33:52 localhost podman[89128]: 2025-11-23 08:33:52.037508616 +0000 UTC m=+0.092654422 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible) Nov 23 03:33:52 localhost podman[89129]: 2025-11-23 08:33:52.091042008 +0000 UTC m=+0.145248694 container health_status 
99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, tcib_managed=true, distribution-scope=public, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:33:52 localhost podman[89128]: 2025-11-23 08:33:52.102257268 +0000 UTC m=+0.157403074 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vcs-type=git) Nov 23 03:33:52 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:33:52 localhost podman[89129]: 2025-11-23 08:33:52.116368192 +0000 UTC m=+0.170574908 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, 
config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4) Nov 23 03:33:52 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:33:52 localhost podman[89127]: 2025-11-23 08:33:52.237077745 +0000 UTC m=+0.296923062 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc.) Nov 23 03:33:52 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:33:58 localhost sshd[89201]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:34:00 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:34:00 localhost recover_tripleo_nova_virtqemud[89227]: 61756 Nov 23 03:34:00 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:34:00 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:34:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:34:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:34:17 localhost systemd[1]: tmp-crun.xirYir.mount: Deactivated successfully. 
Nov 23 03:34:17 localhost podman[89228]: 2025-11-23 08:34:17.025454607 +0000 UTC m=+0.083836708 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:34:17 localhost podman[89228]: 2025-11-23 08:34:17.059662467 +0000 UTC m=+0.118044508 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, 
maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3) Nov 23 03:34:17 localhost podman[89229]: 2025-11-23 08:34:17.071056903 +0000 UTC m=+0.123680479 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:34:17 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:34:17 localhost podman[89229]: 2025-11-23 08:34:17.103382623 +0000 UTC m=+0.156006259 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, url=https://www.redhat.com) Nov 23 03:34:17 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:34:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:34:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:34:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:34:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:34:20 localhost systemd[1]: tmp-crun.X9pnuX.mount: Deactivated successfully. 
Nov 23 03:34:20 localhost podman[89268]: 2025-11-23 08:34:20.025341923 +0000 UTC m=+0.079302172 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:34:20 localhost podman[89267]: 2025-11-23 08:34:20.067931064 +0000 UTC m=+0.121502520 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:34:20 localhost podman[89268]: 2025-11-23 08:34:20.072064056 +0000 UTC m=+0.126024325 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., 
architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public) Nov 23 03:34:20 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:34:20 localhost podman[89267]: 2025-11-23 08:34:20.104334405 +0000 UTC m=+0.157905881 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, container_name=logrotate_crond, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Nov 23 03:34:20 localhost podman[89271]: 2025-11-23 08:34:20.110821353 +0000 UTC m=+0.153818769 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, 
Inc., batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute) Nov 23 03:34:20 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:34:20 localhost podman[89271]: 2025-11-23 08:34:20.138261666 +0000 UTC m=+0.181259122 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container) Nov 23 03:34:20 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:34:20 localhost podman[89269]: 2025-11-23 08:34:20.151409688 +0000 UTC m=+0.198167295 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, version=17.1.12, 
distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:34:20 localhost podman[89269]: 2025-11-23 08:34:20.178625314 +0000 UTC m=+0.225382911 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc.) Nov 23 03:34:20 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:34:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:34:21 localhost podman[89367]: 2025-11-23 08:34:21.025998162 +0000 UTC m=+0.082078132 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, release=1761123044) Nov 23 03:34:21 localhost podman[89367]: 2025-11-23 08:34:21.395826579 +0000 UTC m=+0.451906569 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target) Nov 23 03:34:21 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:34:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:34:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:34:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:34:22 localhost systemd[1]: tmp-crun.tAariD.mount: Deactivated successfully. 
Nov 23 03:34:23 localhost podman[89390]: 2025-11-23 08:34:22.996629192 +0000 UTC m=+0.058681458 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:34:23 localhost podman[89389]: 2025-11-23 08:34:23.05067551 +0000 UTC m=+0.113713468 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, container_name=metrics_qdr, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:34:23 localhost podman[89391]: 2025-11-23 08:34:23.02173125 +0000 UTC m=+0.080524612 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-type=git) Nov 23 03:34:23 localhost podman[89390]: 2025-11-23 08:34:23.085365917 +0000 UTC m=+0.147418213 
container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ovn_metadata_agent, batch=17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 23 03:34:23 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:34:23 localhost podman[89391]: 2025-11-23 08:34:23.106366472 +0000 UTC m=+0.165159894 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ovn_controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:34:23 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:34:23 localhost podman[89389]: 2025-11-23 08:34:23.246541951 +0000 UTC m=+0.309579879 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:34:23 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:34:23 localhost systemd[1]: tmp-crun.dFbzus.mount: Deactivated successfully. Nov 23 03:34:39 localhost sshd[89466]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:34:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:34:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:34:48 localhost systemd[1]: tmp-crun.fPjKmo.mount: Deactivated successfully. 
Nov 23 03:34:48 localhost podman[89543]: 2025-11-23 08:34:48.051139024 +0000 UTC m=+0.104389350 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, config_id=tripleo_step3, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, 
batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:34:48 localhost podman[89544]: 2025-11-23 08:34:48.010115254 +0000 UTC m=+0.064569878 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 23 03:34:48 localhost podman[89544]: 2025-11-23 08:34:48.094162067 +0000 UTC m=+0.148616661 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step3, version=17.1.12, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 23 03:34:48 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:34:48 localhost podman[89543]: 2025-11-23 08:34:48.115216334 +0000 UTC m=+0.168466740 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, name=rhosp17/openstack-collectd, 
version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044) Nov 23 03:34:48 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:34:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:34:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:34:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:34:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:34:51 localhost systemd[1]: tmp-crun.3LYz0X.mount: Deactivated successfully. 
Nov 23 03:34:51 localhost podman[89581]: 2025-11-23 08:34:51.103730317 +0000 UTC m=+0.145727348 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi) Nov 23 03:34:51 localhost podman[89580]: 2025-11-23 08:34:51.149045475 +0000 UTC m=+0.192530834 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., version=17.1.12) Nov 23 03:34:51 localhost podman[89581]: 2025-11-23 08:34:51.159384518 +0000 UTC m=+0.201381559 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:34:51 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:34:51 localhost podman[89579]: 2025-11-23 08:34:51.068556206 +0000 UTC m=+0.116679614 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=) Nov 23 03:34:51 localhost podman[89580]: 2025-11-23 08:34:51.186256383 +0000 UTC m=+0.229741742 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-19T00:11:48Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1) Nov 23 03:34:51 localhost podman[89585]: 2025-11-23 08:34:51.199706595 +0000 UTC m=+0.235400293 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 23 03:34:51 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:34:51 localhost podman[89585]: 2025-11-23 08:34:51.224495292 +0000 UTC m=+0.260189030 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, 
io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, config_id=tripleo_step5) Nov 23 03:34:51 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:34:51 localhost podman[89579]: 2025-11-23 08:34:51.250129167 +0000 UTC m=+0.298252495 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-cron-container, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64) Nov 23 03:34:51 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:34:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:34:52 localhost podman[89675]: 2025-11-23 08:34:52.023802174 +0000 UTC m=+0.080484760 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, io.buildah.version=1.41.4) Nov 23 03:34:52 localhost systemd[1]: tmp-crun.45Ly8Y.mount: Deactivated successfully. Nov 23 03:34:52 localhost podman[89675]: 2025-11-23 08:34:52.418709397 +0000 UTC m=+0.475391973 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-type=git, tcib_managed=true, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=) Nov 23 03:34:52 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:34:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:34:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:34:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:34:54 localhost podman[89699]: 2025-11-23 08:34:54.037830011 +0000 UTC m=+0.094688187 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.buildah.version=1.41.4, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team) Nov 23 03:34:54 localhost podman[89701]: 2025-11-23 08:34:54.07636993 +0000 UTC m=+0.128745323 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:34:54 localhost podman[89700]: 2025-11-23 08:34:54.130551053 +0000 UTC m=+0.184570518 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z) Nov 23 03:34:54 localhost podman[89701]: 2025-11-23 08:34:54.15594282 +0000 UTC m=+0.208318213 
container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=ovn_controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git) Nov 23 03:34:54 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:34:54 localhost podman[89700]: 2025-11-23 08:34:54.195264605 +0000 UTC m=+0.249284030 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:34:54 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:34:54 localhost podman[89699]: 2025-11-23 08:34:54.264348477 +0000 UTC m=+0.321206623 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=) Nov 23 03:34:54 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:35:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:35:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:35:18 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:35:18 localhost recover_tripleo_nova_virtqemud[89802]: 61756 Nov 23 03:35:18 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:35:18 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:35:19 localhost systemd[1]: tmp-crun.OjCDGM.mount: Deactivated successfully. 
Nov 23 03:35:19 localhost podman[89795]: 2025-11-23 08:35:19.043158492 +0000 UTC m=+0.098098170 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, build-date=2025-11-18T23:44:13Z, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:35:19 localhost podman[89794]: 2025-11-23 08:35:19.0861396 +0000 UTC m=+0.143645206 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 23 03:35:19 localhost podman[89795]: 2025-11-23 08:35:19.110342613 +0000 UTC m=+0.165282241 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=iscsid, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, config_id=tripleo_step3) Nov 23 03:35:19 localhost 
systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:35:19 localhost podman[89794]: 2025-11-23 08:35:19.120989739 +0000 UTC m=+0.178495315 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
name=rhosp17/openstack-collectd, container_name=collectd, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step3, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:35:19 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:35:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:35:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:35:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:35:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:35:22 localhost podman[89836]: 2025-11-23 08:35:22.026842581 +0000 UTC m=+0.081354446 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git) Nov 23 03:35:22 localhost podman[89836]: 2025-11-23 08:35:22.040306214 +0000 UTC m=+0.094818119 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true) Nov 23 03:35:22 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:35:22 localhost podman[89843]: 2025-11-23 08:35:22.099227181 +0000 UTC m=+0.144213874 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, config_id=tripleo_step5, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:35:22 localhost podman[89838]: 2025-11-23 08:35:22.136968468 +0000 UTC m=+0.186013425 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 23 03:35:22 localhost podman[89837]: 2025-11-23 08:35:22.182972409 +0000 UTC 
m=+0.232270194 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
release=1761123044, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, architecture=x86_64, tcib_managed=true) Nov 23 03:35:22 localhost podman[89838]: 2025-11-23 08:35:22.19440682 +0000 UTC m=+0.243451797 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12) Nov 23 03:35:22 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:35:22 localhost podman[89843]: 2025-11-23 08:35:22.213597608 +0000 UTC m=+0.258584301 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, distribution-scope=public, name=rhosp17/openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:35:22 localhost podman[89837]: 2025-11-23 08:35:22.236068797 +0000 UTC m=+0.285366522 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., 
io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, tcib_managed=true, architecture=x86_64) Nov 23 03:35:22 
localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:35:22 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:35:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:35:23 localhost podman[89935]: 2025-11-23 08:35:23.001734801 +0000 UTC m=+0.060564449 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z) Nov 23 03:35:23 localhost podman[89935]: 2025-11-23 08:35:23.35742784 +0000 UTC m=+0.416257518 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team) Nov 23 03:35:23 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. 
Nov 23 03:35:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:35:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:35:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:35:25 localhost systemd[1]: tmp-crun.tQGMWw.mount: Deactivated successfully. Nov 23 03:35:25 localhost podman[89958]: 2025-11-23 08:35:25.026950284 +0000 UTC m=+0.085770352 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Nov 23 03:35:25 localhost podman[89960]: 2025-11-23 08:35:25.064300349 +0000 UTC m=+0.115924886 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:35:25 localhost podman[89960]: 2025-11-23 08:35:25.109725672 +0000 UTC m=+0.161350219 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 23 03:35:25 localhost podman[89959]: 2025-11-23 08:35:25.117018196 +0000 UTC m=+0.172341297 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_id=tripleo_step4, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:35:25 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:35:25 localhost podman[89959]: 2025-11-23 08:35:25.149160512 +0000 UTC m=+0.204483623 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:35:25 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:35:25 localhost podman[89958]: 2025-11-23 08:35:25.211333979 +0000 UTC m=+0.270154037 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, version=17.1.12, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd) Nov 23 03:35:25 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:35:43 localhost systemd[1]: tmp-crun.MnPHTp.mount: Deactivated successfully. Nov 23 03:35:43 localhost podman[90134]: 2025-11-23 08:35:43.456082132 +0000 UTC m=+0.061833888 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , ceph=True, name=rhceph, distribution-scope=public, GIT_CLEAN=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat 
Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, RELEASE=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=) Nov 23 03:35:43 localhost podman[90134]: 2025-11-23 08:35:43.544392901 +0000 UTC m=+0.150144707 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , architecture=x86_64, distribution-scope=public, GIT_CLEAN=True, io.openshift.expose-services=, name=rhceph, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12) Nov 23 03:35:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:35:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:35:50 localhost podman[90279]: 2025-11-23 08:35:50.017851284 +0000 UTC m=+0.073893868 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 
17.1_20251118.1, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12) Nov 23 03:35:50 localhost podman[90279]: 2025-11-23 08:35:50.033363539 +0000 UTC m=+0.089406233 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:35:50 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:35:50 localhost systemd[1]: tmp-crun.vIACbQ.mount: Deactivated successfully. 
Nov 23 03:35:50 localhost podman[90278]: 2025-11-23 08:35:50.126999292 +0000 UTC m=+0.184216961 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:35:50 localhost podman[90278]: 2025-11-23 08:35:50.135692368 +0000 UTC m=+0.192910037 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-collectd) Nov 23 03:35:50 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:35:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:35:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:35:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:35:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:35:53 localhost podman[90316]: 2025-11-23 08:35:53.042777819 +0000 UTC m=+0.096035566 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, 
name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=) Nov 23 03:35:53 localhost podman[90316]: 2025-11-23 08:35:53.052276771 +0000 UTC m=+0.105534558 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, architecture=x86_64, container_name=logrotate_crond, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z) Nov 23 03:35:53 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:35:53 localhost podman[90318]: 2025-11-23 08:35:53.09397641 +0000 UTC m=+0.140867612 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 23 03:35:53 localhost podman[90318]: 2025-11-23 08:35:53.123474375 +0000 UTC m=+0.170365527 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi) Nov 23 03:35:53 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:35:53 localhost podman[90317]: 2025-11-23 08:35:53.12919101 +0000 UTC m=+0.179020882 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, version=17.1.12, config_id=tripleo_step4) Nov 23 03:35:53 localhost podman[90319]: 2025-11-23 08:35:53.189117468 +0000 UTC m=+0.232685247 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step5, container_name=nova_compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, distribution-scope=public) Nov 23 03:35:53 localhost podman[90317]: 2025-11-23 08:35:53.213357361 +0000 UTC m=+0.263187293 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-19T00:11:48Z, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 23 03:35:53 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:35:53 localhost podman[90319]: 2025-11-23 08:35:53.245377444 +0000 UTC m=+0.288945253 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git) Nov 23 03:35:53 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:35:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:35:54 localhost podman[90415]: 2025-11-23 08:35:54.021064504 +0000 UTC m=+0.078589941 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, distribution-scope=public, version=17.1.12, container_name=nova_migration_target, vcs-type=git) Nov 23 03:35:54 localhost podman[90415]: 2025-11-23 08:35:54.413532431 +0000 UTC m=+0.471057858 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, tcib_managed=true, distribution-scope=public, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4) Nov 23 03:35:54 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:35:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:35:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:35:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:35:56 localhost systemd[1]: tmp-crun.YiNM0b.mount: Deactivated successfully. Nov 23 03:35:56 localhost podman[90438]: 2025-11-23 08:35:56.016940038 +0000 UTC m=+0.078585292 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, container_name=metrics_qdr, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:35:56 localhost podman[90445]: 2025-11-23 08:35:56.078226547 +0000 UTC m=+0.129707568 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:35:56 localhost podman[90439]: 2025-11-23 08:35:56.044147802 +0000 UTC m=+0.097457610 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, 
managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Nov 23 03:35:56 localhost 
podman[90445]: 2025-11-23 08:35:56.096301191 +0000 UTC m=+0.147782252 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, io.buildah.version=1.41.4, distribution-scope=public) Nov 23 03:35:56 localhost podman[90439]: 2025-11-23 08:35:56.133249605 +0000 UTC m=+0.186559393 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=ovn_metadata_agent, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 23 03:35:56 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:35:56 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:35:56 localhost podman[90438]: 2025-11-23 08:35:56.1953507 +0000 UTC m=+0.256995964 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:35:56 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:35:57 localhost systemd[1]: tmp-crun.Fyex4t.mount: Deactivated successfully. Nov 23 03:36:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:36:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:36:21 localhost systemd[1]: tmp-crun.iPLPhF.mount: Deactivated successfully. 
Nov 23 03:36:21 localhost podman[90533]: 2025-11-23 08:36:21.037063384 +0000 UTC m=+0.092583120 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3) Nov 23 03:36:21 localhost systemd[1]: tmp-crun.gR5Ff3.mount: Deactivated successfully. Nov 23 03:36:21 localhost podman[90534]: 2025-11-23 08:36:21.076217575 +0000 UTC m=+0.128767920 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.4) Nov 23 03:36:21 localhost podman[90533]: 2025-11-23 08:36:21.099617333 +0000 UTC m=+0.155137079 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 23 03:36:21 localhost podman[90534]: 2025-11-23 08:36:21.113405956 +0000 UTC m=+0.165956302 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, architecture=x86_64, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3) Nov 23 03:36:21 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:36:21 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:36:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:36:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:36:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:36:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:36:24 localhost systemd[1]: tmp-crun.83HA05.mount: Deactivated successfully. 
Nov 23 03:36:24 localhost podman[90577]: 2025-11-23 08:36:24.046401793 +0000 UTC m=+0.083629586 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=nova_compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, tcib_managed=true) Nov 23 03:36:24 localhost podman[90577]: 2025-11-23 08:36:24.101148301 +0000 UTC m=+0.138376054 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:36:24 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:36:24 localhost podman[90575]: 2025-11-23 08:36:24.15164557 +0000 UTC m=+0.193700402 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:36:24 localhost podman[90575]: 2025-11-23 08:36:24.181352722 +0000 UTC m=+0.223407574 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:36:24 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:36:24 localhost podman[90576]: 2025-11-23 08:36:24.197910349 +0000 UTC m=+0.239524567 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64) Nov 23 03:36:24 localhost podman[90574]: 2025-11-23 08:36:24.104529865 +0000 UTC m=+0.150687732 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:36:24 localhost podman[90576]: 2025-11-23 08:36:24.231207631 +0000 UTC m=+0.272821839 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:36:24 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:36:24 localhost podman[90574]: 2025-11-23 08:36:24.241999901 +0000 UTC m=+0.288157858 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vcs-type=git, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:36:24 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:36:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:36:25 localhost podman[90670]: 2025-11-23 08:36:25.016677631 +0000 UTC m=+0.078452097 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, 
Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 23 03:36:25 localhost systemd[1]: tmp-crun.Dbhz9h.mount: Deactivated successfully. Nov 23 03:36:25 localhost podman[90670]: 2025-11-23 08:36:25.415475102 +0000 UTC m=+0.477249608 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:36:25 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:36:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:36:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:36:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:36:26 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:36:26 localhost recover_tripleo_nova_virtqemud[90708]: 61756 Nov 23 03:36:26 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:36:26 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:36:27 localhost systemd[1]: tmp-crun.EkF1r9.mount: Deactivated successfully. Nov 23 03:36:27 localhost podman[90693]: 2025-11-23 08:36:27.037700026 +0000 UTC m=+0.096249863 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, container_name=metrics_qdr, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, build-date=2025-11-18T22:49:46Z, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:36:27 localhost podman[90695]: 2025-11-23 08:36:27.088398521 +0000 UTC m=+0.139339234 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': 
'/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Nov 23 03:36:27 localhost podman[90695]: 2025-11-23 08:36:27.131667808 +0000 UTC m=+0.182608521 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com) Nov 23 03:36:27 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:36:27 localhost podman[90694]: 2025-11-23 08:36:27.16723522 +0000 UTC m=+0.223547738 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, release=1761123044, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4) Nov 23 03:36:27 localhost podman[90694]: 2025-11-23 08:36:27.200571751 +0000 UTC m=+0.256884299 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, 
konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64) Nov 23 03:36:27 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:36:27 localhost podman[90693]: 2025-11-23 08:36:27.23313331 +0000 UTC m=+0.291683117 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step1, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12) Nov 23 03:36:27 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:36:28 localhost systemd[1]: tmp-crun.5PRTtI.mount: Deactivated successfully. Nov 23 03:36:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:36:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:36:52 localhost systemd[1]: tmp-crun.1djAn2.mount: Deactivated successfully. 
Nov 23 03:36:52 localhost podman[90848]: 2025-11-23 08:36:52.058289645 +0000 UTC m=+0.116865665 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., 
config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, version=17.1.12) Nov 23 03:36:52 localhost podman[90849]: 2025-11-23 08:36:52.103286465 +0000 UTC m=+0.159044819 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, container_name=iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, release=1761123044) Nov 23 03:36:52 localhost podman[90848]: 2025-11-23 08:36:52.118213693 +0000 UTC m=+0.176789703 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, architecture=x86_64, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO 
Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044) Nov 23 03:36:52 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:36:52 localhost podman[90849]: 2025-11-23 08:36:52.137863136 +0000 UTC m=+0.193621450 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true) Nov 23 03:36:52 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:36:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:36:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:36:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:36:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:36:55 localhost systemd[1]: tmp-crun.8zCyb8.mount: Deactivated successfully. Nov 23 03:36:55 localhost systemd[1]: tmp-crun.sKvl5s.mount: Deactivated successfully. 
Nov 23 03:36:55 localhost podman[90899]: 2025-11-23 08:36:55.051602601 +0000 UTC m=+0.098554274 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com) Nov 23 03:36:55 localhost podman[90888]: 2025-11-23 08:36:55.083710676 +0000 UTC m=+0.140524361 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack 
Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12) Nov 23 03:36:55 localhost podman[90899]: 2025-11-23 08:36:55.100157811 +0000 UTC m=+0.147109504 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, container_name=nova_compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team) Nov 23 03:36:55 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:36:55 localhost podman[90890]: 2025-11-23 08:36:55.148456862 +0000 UTC m=+0.199273963 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public) Nov 23 03:36:55 localhost podman[90889]: 2025-11-23 08:36:55.018252858 +0000 UTC m=+0.075717074 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:36:55 localhost podman[90890]: 2025-11-23 08:36:55.174181371 +0000 UTC m=+0.224998482 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public) Nov 23 03:36:55 localhost systemd[1]: 
d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:36:55 localhost podman[90889]: 2025-11-23 08:36:55.198563639 +0000 UTC m=+0.256027855 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 23 03:36:55 localhost podman[90888]: 2025-11-23 08:36:55.221073529 +0000 UTC m=+0.277887204 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:36:55 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:36:55 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:36:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:36:56 localhost systemd[1]: tmp-crun.P6RBAC.mount: Deactivated successfully. 
Nov 23 03:36:56 localhost podman[90988]: 2025-11-23 08:36:56.030609598 +0000 UTC m=+0.085230085 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, tcib_managed=true) Nov 23 03:36:56 localhost podman[90988]: 2025-11-23 08:36:56.407163647 +0000 UTC m=+0.461784074 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Nov 23 03:36:56 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:36:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:36:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:36:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:36:58 localhost systemd[1]: tmp-crun.wpbGj5.mount: Deactivated successfully. 
Nov 23 03:36:58 localhost podman[91015]: 2025-11-23 08:36:58.034409795 +0000 UTC m=+0.089611940 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:36:58 localhost podman[91016]: 2025-11-23 08:36:58.00915152 +0000 UTC m=+0.064689025 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, io.openshift.expose-services=, architecture=x86_64) Nov 23 03:36:58 localhost podman[91014]: 2025-11-23 08:36:58.069800291 +0000 UTC m=+0.129113921 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, tcib_managed=true, version=17.1.12, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr) Nov 23 03:36:58 localhost podman[91016]: 2025-11-23 08:36:58.095287352 +0000 UTC m=+0.150824867 container exec_died 
99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, tcib_managed=true, 
maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z) Nov 23 03:36:58 localhost podman[91015]: 2025-11-23 08:36:58.103273617 +0000 UTC m=+0.158475642 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, url=https://www.redhat.com) Nov 23 03:36:58 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:36:58 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:36:58 localhost podman[91014]: 2025-11-23 08:36:58.243265691 +0000 UTC m=+0.302579291 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:36:58 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:37:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:37:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:37:23 localhost podman[91088]: 2025-11-23 08:37:23.033753231 +0000 UTC m=+0.085431182 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, 
vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:37:23 localhost podman[91088]: 2025-11-23 08:37:23.046198892 +0000 UTC m=+0.097876863 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:37:23 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:37:23 localhost systemd[1]: tmp-crun.eSETIk.mount: Deactivated successfully. 
Nov 23 03:37:23 localhost podman[91089]: 2025-11-23 08:37:23.143674172 +0000 UTC m=+0.192263327 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:37:23 localhost podman[91089]: 2025-11-23 08:37:23.157262689 +0000 UTC m=+0.205851844 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, release=1761123044, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:37:23 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:37:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:37:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:37:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:37:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:37:26 localhost podman[91128]: 2025-11-23 08:37:26.035199466 +0000 UTC m=+0.090728143 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond) Nov 23 03:37:26 localhost podman[91136]: 2025-11-23 08:37:26.042393697 +0000 UTC m=+0.087179535 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=nova_compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:37:26 localhost podman[91136]: 2025-11-23 
08:37:26.062539315 +0000 UTC m=+0.107325183 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step5, container_name=nova_compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:37:26 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:37:26 localhost podman[91128]: 2025-11-23 08:37:26.07119347 +0000 UTC m=+0.126722107 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, container_name=logrotate_crond, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team) Nov 23 03:37:26 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:37:26 localhost podman[91130]: 2025-11-23 08:37:26.127433145 +0000 UTC m=+0.175222305 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:37:26 localhost podman[91129]: 2025-11-23 08:37:26.14356923 +0000 UTC m=+0.194803585 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 23 03:37:26 localhost podman[91130]: 2025-11-23 08:37:26.177274204 +0000 UTC m=+0.225063424 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true) Nov 23 03:37:26 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:37:26 localhost podman[91129]: 2025-11-23 08:37:26.199264738 +0000 UTC m=+0.250499103 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4) Nov 23 03:37:26 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:37:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:37:27 localhost podman[91227]: 2025-11-23 08:37:27.026128408 +0000 UTC m=+0.082622944 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, 
build-date=2025-11-19T00:36:58Z, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-nova-compute-container) Nov 23 03:37:27 localhost podman[91227]: 2025-11-23 08:37:27.42037068 +0000 UTC m=+0.476865206 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:37:27 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:37:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:37:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:37:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:37:29 localhost podman[91250]: 2025-11-23 08:37:29.022125297 +0000 UTC m=+0.077470697 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:37:29 localhost systemd[1]: tmp-crun.n8Udr8.mount: Deactivated successfully. Nov 23 03:37:29 localhost podman[91251]: 2025-11-23 08:37:29.083040994 +0000 UTC m=+0.133397751 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:37:29 localhost podman[91252]: 2025-11-23 08:37:29.05546394 +0000 UTC m=+0.103012751 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, 
version=17.1.12) Nov 23 03:37:29 localhost podman[91251]: 2025-11-23 08:37:29.114096667 +0000 UTC m=+0.164453424 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:37:29 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:37:29 localhost podman[91252]: 2025-11-23 08:37:29.141182448 +0000 UTC m=+0.188731229 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-type=git) Nov 23 03:37:29 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:37:29 localhost podman[91250]: 2025-11-23 08:37:29.228323971 +0000 UTC m=+0.283669411 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team) Nov 23 03:37:29 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:37:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:37:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:37:54 localhost systemd[1]: tmp-crun.CEAxQD.mount: Deactivated successfully. 
Nov 23 03:37:54 localhost podman[91403]: 2025-11-23 08:37:54.074453858 +0000 UTC m=+0.126039267 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, distribution-scope=public, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 23 03:37:54 localhost podman[91404]: 2025-11-23 08:37:54.049206034 +0000 UTC m=+0.100998179 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, architecture=x86_64, com.redhat.component=openstack-iscsid-container, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
iscsid, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:37:54 localhost podman[91403]: 2025-11-23 08:37:54.112282999 +0000 UTC m=+0.163868448 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z) Nov 23 03:37:54 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:37:54 localhost podman[91404]: 2025-11-23 08:37:54.131447577 +0000 UTC m=+0.183239752 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, distribution-scope=public, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 23 03:37:54 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:37:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:37:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:37:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:37:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:37:57 localhost podman[91440]: 2025-11-23 08:37:57.04252658 +0000 UTC m=+0.093280211 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64) Nov 23 03:37:57 localhost podman[91440]: 2025-11-23 08:37:57.057506809 +0000 UTC m=+0.108260490 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, vcs-type=git, name=rhosp17/openstack-cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.openshift.expose-services=) Nov 23 03:37:57 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:37:57 localhost systemd[1]: tmp-crun.qr5Qbr.mount: Deactivated successfully. 
Nov 23 03:37:57 localhost podman[91442]: 2025-11-23 08:37:57.161559201 +0000 UTC m=+0.204400470 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=ceilometer_agent_ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, build-date=2025-11-19T00:12:45Z, tcib_managed=true, managed_by=tripleo_ansible) Nov 23 03:37:57 localhost podman[91441]: 2025-11-23 08:37:57.213936887 +0000 UTC m=+0.257278492 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z) Nov 23 03:37:57 localhost podman[91441]: 2025-11-23 08:37:57.24921476 +0000 UTC m=+0.292556365 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 
'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, 
url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:37:57 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:37:57 localhost podman[91442]: 2025-11-23 08:37:57.272491583 +0000 UTC m=+0.315332842 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:37:57 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:37:57 localhost podman[91447]: 2025-11-23 08:37:57.352200508 +0000 UTC m=+0.391567210 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:37:57 localhost podman[91447]: 2025-11-23 08:37:57.387423748 +0000 UTC m=+0.426790510 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-type=git, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12) Nov 23 03:37:57 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:37:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:37:58 localhost podman[91538]: 2025-11-23 08:37:58.024609731 +0000 UTC m=+0.077560119 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, distribution-scope=public, batch=17.1_20251118.1, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:37:58 localhost systemd[1]: tmp-crun.ZYSkNp.mount: Deactivated successfully. Nov 23 03:37:58 localhost podman[91538]: 2025-11-23 08:37:58.398328924 +0000 UTC m=+0.451279312 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Nov 23 03:37:58 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:37:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:37:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:37:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:38:00 localhost systemd[1]: tmp-crun.2nJEwu.mount: Deactivated successfully. Nov 23 03:38:00 localhost podman[91561]: 2025-11-23 08:38:00.048578149 +0000 UTC m=+0.103668641 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=metrics_qdr, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=) Nov 23 03:38:00 localhost podman[91562]: 2025-11-23 08:38:00.098032956 +0000 UTC m=+0.150597300 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:38:00 localhost podman[91563]: 2025-11-23 08:38:00.152066803 +0000 UTC m=+0.199181910 container health_status 
99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container) Nov 23 03:38:00 localhost podman[91562]: 2025-11-23 08:38:00.174226953 +0000 UTC m=+0.226791307 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 23 03:38:00 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:38:00 localhost podman[91563]: 2025-11-23 08:38:00.226211576 +0000 UTC m=+0.273326773 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true) Nov 23 03:38:00 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:38:00 localhost podman[91561]: 2025-11-23 08:38:00.303424295 +0000 UTC m=+0.358514777 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:38:00 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:38:01 localhost systemd[1]: tmp-crun.GmnmPR.mount: Deactivated successfully. Nov 23 03:38:16 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:38:16 localhost recover_tripleo_nova_virtqemud[91638]: 61756 Nov 23 03:38:16 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:38:16 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:38:17 localhost sshd[91639]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:38:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:38:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:38:25 localhost systemd[1]: tmp-crun.fh5b1E.mount: Deactivated successfully. 
Nov 23 03:38:25 localhost podman[91642]: 2025-11-23 08:38:25.044118202 +0000 UTC m=+0.095337956 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, tcib_managed=true, 
description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid) Nov 23 03:38:25 localhost podman[91642]: 2025-11-23 08:38:25.083302693 +0000 UTC m=+0.134522387 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.4, tcib_managed=true) Nov 23 03:38:25 localhost podman[91641]: 2025-11-23 08:38:25.094512427 +0000 UTC m=+0.145666999 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, distribution-scope=public, 
architecture=x86_64, tcib_managed=true) Nov 23 03:38:25 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:38:25 localhost podman[91641]: 2025-11-23 08:38:25.107564887 +0000 UTC m=+0.158719459 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step3, name=rhosp17/openstack-collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git) Nov 23 03:38:25 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:38:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:38:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:38:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:38:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:38:28 localhost systemd[1]: tmp-crun.OJODhX.mount: Deactivated successfully. 
Nov 23 03:38:28 localhost podman[91681]: 2025-11-23 08:38:28.03786026 +0000 UTC m=+0.084527323 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z) Nov 23 03:38:28 localhost podman[91682]: 2025-11-23 08:38:28.065276951 +0000 UTC m=+0.107395244 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:38:28 localhost podman[91681]: 2025-11-23 08:38:28.135557167 +0000 UTC m=+0.182224230 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, 
com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team) Nov 23 03:38:28 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:38:28 localhost podman[91682]: 2025-11-23 08:38:28.148969638 +0000 UTC m=+0.191087921 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4) Nov 23 03:38:28 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:38:28 localhost podman[91680]: 2025-11-23 08:38:28.140677964 +0000 UTC m=+0.190820703 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20251118.1) Nov 23 03:38:28 localhost podman[91683]: 2025-11-23 08:38:28.100166441 +0000 UTC m=+0.140079517 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step5, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:38:28 localhost podman[91680]: 2025-11-23 08:38:28.225282069 +0000 UTC m=+0.275424878 container exec_died 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat 
OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Nov 23 03:38:28 localhost podman[91683]: 2025-11-23 08:38:28.232273194 +0000 UTC m=+0.272186320 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:38:28 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:38:28 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:38:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:38:29 localhost podman[91778]: 2025-11-23 08:38:29.022350225 +0000 UTC m=+0.079225871 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, 
build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:38:29 localhost systemd[1]: tmp-crun.8XXO5q.mount: Deactivated successfully. Nov 23 03:38:29 localhost podman[91778]: 2025-11-23 08:38:29.423728256 +0000 UTC m=+0.480603872 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, 
config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:38:29 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:38:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:38:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. 
Nov 23 03:38:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:38:31 localhost systemd[1]: tmp-crun.iQBf3Z.mount: Deactivated successfully. Nov 23 03:38:31 localhost podman[91803]: 2025-11-23 08:38:31.03617966 +0000 UTC m=+0.092139626 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:38:31 localhost podman[91803]: 2025-11-23 08:38:31.079627913 +0000 UTC m=+0.135587879 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:38:31 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:38:31 localhost podman[91804]: 2025-11-23 08:38:31.130653927 +0000 UTC m=+0.183650093 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:38:31 localhost podman[91802]: 2025-11-23 08:38:31.080719236 +0000 UTC m=+0.140117209 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Nov 23 03:38:31 localhost podman[91804]: 2025-11-23 08:38:31.210694933 +0000 UTC m=+0.263691059 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, architecture=x86_64, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true) Nov 23 03:38:31 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:38:31 localhost podman[91802]: 2025-11-23 08:38:31.25953237 +0000 UTC m=+0.318930273 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step1, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:38:31 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:38:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:38:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:38:56 localhost systemd[1]: tmp-crun.mZgwQR.mount: Deactivated successfully. 
Nov 23 03:38:56 localhost podman[91955]: 2025-11-23 08:38:56.052090615 +0000 UTC m=+0.099354807 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-type=git, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64) Nov 23 03:38:56 localhost podman[91954]: 2025-11-23 08:38:56.082500428 +0000 UTC m=+0.131795023 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, release=1761123044, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:38:56 localhost podman[91954]: 2025-11-23 08:38:56.097697235 +0000 UTC m=+0.146991820 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T22:51:28Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step3, vcs-type=git, name=rhosp17/openstack-collectd) Nov 23 03:38:56 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:38:56 localhost podman[91955]: 2025-11-23 08:38:56.181875127 +0000 UTC m=+0.229139339 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, container_name=iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 23 03:38:56 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:38:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:38:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:38:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:38:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:38:59 localhost systemd[1]: tmp-crun.h7k7EM.mount: Deactivated successfully. 
Nov 23 03:38:59 localhost podman[91995]: 2025-11-23 08:38:59.043535453 +0000 UTC m=+0.094881171 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, container_name=ceilometer_agent_compute, vcs-type=git, url=https://www.redhat.com) Nov 23 03:38:59 localhost podman[91996]: 2025-11-23 08:38:59.07798921 +0000 UTC m=+0.128410919 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4) Nov 23 03:38:59 localhost podman[91995]: 2025-11-23 08:38:59.090752702 +0000 UTC m=+0.142098400 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, 
tcib_managed=true, maintainer=OpenStack TripleO Team) Nov 23 03:38:59 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:38:59 localhost podman[91996]: 2025-11-23 08:38:59.135524025 +0000 UTC m=+0.185945714 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO 
Team, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:38:59 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:38:59 localhost podman[91994]: 2025-11-23 08:38:59.136562877 +0000 UTC m=+0.188981217 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, release=1761123044, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1) Nov 23 03:38:59 localhost podman[91997]: 2025-11-23 08:38:59.192346437 +0000 UTC m=+0.240202028 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red 
Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team) Nov 23 03:38:59 localhost podman[91997]: 2025-11-23 08:38:59.213011692 +0000 UTC m=+0.260867253 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:38:59 localhost 
systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:38:59 localhost podman[91994]: 2025-11-23 08:38:59.267116911 +0000 UTC m=+0.319535271 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, container_name=logrotate_crond, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, 
url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-cron) Nov 23 03:38:59 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:38:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:39:00 localhost podman[92091]: 2025-11-23 08:39:00.017001121 +0000 UTC m=+0.076251611 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_migration_target) Nov 23 03:39:00 localhost podman[92091]: 2025-11-23 08:39:00.408983253 +0000 UTC m=+0.468233733 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:39:00 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:39:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:39:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:39:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:39:02 localhost systemd[1]: tmp-crun.mnfebG.mount: Deactivated successfully. Nov 23 03:39:02 localhost systemd[1]: tmp-crun.HPOVpi.mount: Deactivated successfully. 
Nov 23 03:39:02 localhost podman[92117]: 2025-11-23 08:39:02.092492947 +0000 UTC m=+0.145056429 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, distribution-scope=public, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 23 03:39:02 localhost podman[92118]: 2025-11-23 08:39:02.144467092 +0000 UTC m=+0.196043764 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=ovn_controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller) Nov 23 03:39:02 localhost podman[92116]: 2025-11-23 08:39:02.059840576 +0000 UTC m=+0.114899265 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, container_name=metrics_qdr, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd) Nov 23 03:39:02 localhost podman[92117]: 2025-11-23 
08:39:02.173349117 +0000 UTC m=+0.225912559 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc.) Nov 23 03:39:02 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:39:02 localhost podman[92118]: 2025-11-23 08:39:02.194436814 +0000 UTC m=+0.246013496 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ovn_controller, 
managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public) Nov 23 03:39:02 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:39:02 localhost podman[92116]: 2025-11-23 08:39:02.259999805 +0000 UTC m=+0.315058564 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:39:02 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:39:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:39:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:39:27 localhost podman[92190]: 2025-11-23 08:39:27.029833527 +0000 UTC m=+0.085448311 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:39:27 localhost podman[92190]: 2025-11-23 08:39:27.065383478 +0000 UTC m=+0.120998252 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:39:27 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:39:27 localhost podman[92191]: 2025-11-23 08:39:27.073842807 +0000 UTC m=+0.125840800 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:39:27 localhost podman[92191]: 2025-11-23 08:39:27.156372538 +0000 UTC m=+0.208370611 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, version=17.1.12, vcs-type=git, name=rhosp17/openstack-iscsid, tcib_managed=true, container_name=iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 23 03:39:27 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:39:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:39:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:39:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:39:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:39:30 localhost systemd[1]: tmp-crun.DqxTrD.mount: Deactivated successfully. 
Nov 23 03:39:30 localhost podman[92233]: 2025-11-23 08:39:30.032202259 +0000 UTC m=+0.078372475 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com) Nov 23 03:39:30 localhost podman[92233]: 2025-11-23 08:39:30.097560883 +0000 UTC m=+0.143731139 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., release=1761123044, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public) Nov 23 03:39:30 localhost systemd[1]: tmp-crun.p3syHU.mount: Deactivated successfully. Nov 23 03:39:30 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:39:30 localhost podman[92234]: 2025-11-23 08:39:30.145780103 +0000 UTC m=+0.189551145 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Nov 23 03:39:30 localhost podman[92237]: 2025-11-23 08:39:30.101852665 +0000 UTC m=+0.142595244 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:39:30 localhost podman[92232]: 
2025-11-23 08:39:30.184927962 +0000 UTC m=+0.236503894 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, vcs-type=git, name=rhosp17/openstack-cron, vendor=Red Hat, Inc.) Nov 23 03:39:30 localhost podman[92234]: 2025-11-23 08:39:30.202200103 +0000 UTC m=+0.245971105 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, managed_by=tripleo_ansible) Nov 23 03:39:30 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:39:30 localhost podman[92232]: 2025-11-23 08:39:30.22105358 +0000 UTC m=+0.272629502 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, description=Red Hat 
OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:39:30 localhost podman[92237]: 2025-11-23 08:39:30.236422962 +0000 UTC m=+0.277165531 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, architecture=x86_64, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:39:30 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:39:30 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:39:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:39:31 localhost podman[92328]: 2025-11-23 08:39:31.02865715 +0000 UTC m=+0.081711367 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044) Nov 23 03:39:31 localhost podman[92328]: 2025-11-23 08:39:31.403257898 +0000 UTC m=+0.456312125 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=nova_migration_target, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:39:31 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:39:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:39:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:39:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:39:33 localhost systemd[1]: tmp-crun.CmdiHl.mount: Deactivated successfully. Nov 23 03:39:33 localhost systemd[1]: tmp-crun.63lwyq.mount: Deactivated successfully. Nov 23 03:39:33 localhost podman[92353]: 2025-11-23 08:39:33.052374268 +0000 UTC m=+0.105325402 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, container_name=ovn_controller, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Nov 23 03:39:33 localhost podman[92351]: 2025-11-23 08:39:33.02014599 +0000 UTC m=+0.082299816 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, container_name=metrics_qdr, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:39:33 localhost podman[92353]: 2025-11-23 08:39:33.078342664 +0000 UTC m=+0.131293758 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 23 03:39:33 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. 
Nov 23 03:39:33 localhost podman[92352]: 2025-11-23 08:39:33.154511551 +0000 UTC m=+0.209822917 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:39:33 localhost podman[92352]: 2025-11-23 08:39:33.197383765 +0000 UTC m=+0.252695161 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:39:33 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:39:33 localhost podman[92351]: 2025-11-23 08:39:33.208090604 +0000 UTC m=+0.270244480 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:39:33 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:39:34 localhost systemd[1]: tmp-crun.qeVNoq.mount: Deactivated successfully. Nov 23 03:39:38 localhost sshd[92425]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:39:50 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:39:50 localhost recover_tripleo_nova_virtqemud[92443]: 61756 Nov 23 03:39:50 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:39:50 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:39:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:39:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:39:58 localhost systemd[1]: tmp-crun.ILbzxL.mount: Deactivated successfully. 
Nov 23 03:39:58 localhost podman[92507]: 2025-11-23 08:39:58.029563236 +0000 UTC m=+0.081459337 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Nov 23 03:39:58 localhost podman[92507]: 2025-11-23 08:39:58.038977945 +0000 UTC m=+0.090874036 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) Nov 23 03:39:58 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:39:58 localhost systemd[1]: tmp-crun.QQG5Sn.mount: Deactivated successfully. 
Nov 23 03:39:58 localhost podman[92506]: 2025-11-23 08:39:58.091153441 +0000 UTC m=+0.142502625 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.openshift.expose-services=, container_name=collectd) Nov 23 03:39:58 localhost podman[92506]: 2025-11-23 08:39:58.098327902 +0000 UTC m=+0.149677066 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, config_id=tripleo_step3, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Nov 23 03:39:58 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:40:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:40:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:40:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:40:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:40:01 localhost systemd[1]: tmp-crun.DjXWpP.mount: Deactivated successfully. Nov 23 03:40:01 localhost podman[92547]: 2025-11-23 08:40:01.023498498 +0000 UTC m=+0.077678141 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12) Nov 23 03:40:01 localhost podman[92547]: 2025-11-23 08:40:01.076048065 +0000 UTC m=+0.130227688 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, release=1761123044, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vendor=Red Hat, Inc.) Nov 23 03:40:01 localhost systemd[1]: tmp-crun.9UwZs9.mount: Deactivated successfully. 
Nov 23 03:40:01 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:40:01 localhost podman[92545]: 2025-11-23 08:40:01.07882616 +0000 UTC m=+0.137292714 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
com.redhat.component=openstack-cron-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:40:01 localhost podman[92548]: 2025-11-23 08:40:01.134464991 +0000 UTC m=+0.186834198 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, version=17.1.12, container_name=nova_compute, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container) Nov 23 03:40:01 localhost podman[92548]: 
2025-11-23 08:40:01.164150155 +0000 UTC m=+0.216519322 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:40:01 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:40:01 localhost podman[92546]: 2025-11-23 08:40:01.180840178 +0000 UTC m=+0.236435534 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, tcib_managed=true) Nov 23 03:40:01 localhost podman[92546]: 2025-11-23 08:40:01.203693391 +0000 UTC m=+0.259288747 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Nov 23 03:40:01 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:40:01 localhost podman[92545]: 2025-11-23 08:40:01.215398051 +0000 UTC m=+0.273864625 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:40:01 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:40:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:40:02 localhost systemd[1]: tmp-crun.670c6r.mount: Deactivated successfully. Nov 23 03:40:02 localhost podman[92639]: 2025-11-23 08:40:02.037746519 +0000 UTC m=+0.094264591 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.openshift.expose-services=, 
vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, tcib_managed=true) Nov 23 03:40:02 localhost podman[92639]: 2025-11-23 08:40:02.452115637 +0000 UTC m=+0.508633709 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, 
container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:40:02 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:40:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:40:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:40:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:40:04 localhost podman[92660]: 2025-11-23 08:40:04.035021551 +0000 UTC m=+0.092163686 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:40:04 localhost podman[92661]: 2025-11-23 08:40:04.086139393 +0000 UTC m=+0.142310249 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc.) Nov 23 03:40:04 localhost podman[92662]: 2025-11-23 08:40:04.136501452 +0000 UTC m=+0.188343534 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller) Nov 23 03:40:04 localhost podman[92661]: 2025-11-23 08:40:04.150225335 +0000 UTC m=+0.206396241 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, tcib_managed=true, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, architecture=x86_64) Nov 23 03:40:04 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: 
Deactivated successfully. Nov 23 03:40:04 localhost podman[92662]: 2025-11-23 08:40:04.206816655 +0000 UTC m=+0.258658767 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, 
konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, version=17.1.12) Nov 23 03:40:04 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Deactivated successfully. Nov 23 03:40:04 localhost podman[92660]: 2025-11-23 08:40:04.266679477 +0000 UTC m=+0.323821552 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=metrics_qdr, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.) Nov 23 03:40:04 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:40:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:40:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:40:29 localhost systemd[1]: tmp-crun.rPL3Gl.mount: Deactivated successfully. 
Nov 23 03:40:29 localhost podman[92735]: 2025-11-23 08:40:29.022360732 +0000 UTC m=+0.076786693 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=iscsid, tcib_managed=true, distribution-scope=public, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=) Nov 23 03:40:29 localhost podman[92735]: 2025-11-23 08:40:29.034197756 +0000 UTC m=+0.088623737 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, name=rhosp17/openstack-iscsid, distribution-scope=public, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible) Nov 23 03:40:29 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:40:29 localhost systemd[1]: tmp-crun.vJ8Fr1.mount: Deactivated successfully. 
Nov 23 03:40:29 localhost podman[92734]: 2025-11-23 08:40:29.088039052 +0000 UTC m=+0.144800385 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Nov 23 03:40:29 localhost podman[92734]: 2025-11-23 08:40:29.096480532 +0000 UTC m=+0.153241835 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-collectd) Nov 23 03:40:29 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:40:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:40:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:40:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:40:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:40:32 localhost systemd[1]: tmp-crun.KxR9N6.mount: Deactivated successfully. Nov 23 03:40:32 localhost podman[92773]: 2025-11-23 08:40:32.045531473 +0000 UTC m=+0.100943616 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, distribution-scope=public, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 23 03:40:32 localhost podman[92773]: 2025-11-23 08:40:32.052017012 +0000 UTC m=+0.107429145 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:40:32 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:40:32 localhost podman[92777]: 2025-11-23 08:40:32.093440377 +0000 UTC m=+0.139986097 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z) Nov 23 03:40:32 localhost podman[92775]: 2025-11-23 08:40:32.136666936 +0000 UTC m=+0.184615560 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:40:32 localhost podman[92777]: 2025-11-23 08:40:32.144280531 +0000 UTC m=+0.190826291 container 
exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:36:58Z, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:40:32 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:40:32 localhost podman[92774]: 2025-11-23 08:40:32.193235346 +0000 UTC m=+0.245853043 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12) Nov 23 03:40:32 localhost podman[92775]: 2025-11-23 08:40:32.217325718 +0000 UTC m=+0.265274292 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:40:32 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:40:32 localhost podman[92774]: 2025-11-23 08:40:32.269040788 +0000 UTC m=+0.321658495 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute) Nov 23 03:40:32 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:40:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:40:33 localhost podman[92872]: 2025-11-23 08:40:33.01834259 +0000 UTC m=+0.075547125 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red 
Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044) Nov 23 03:40:33 localhost podman[92872]: 2025-11-23 08:40:33.355003666 +0000 UTC m=+0.412208161 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, container_name=nova_migration_target, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:40:33 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:40:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:40:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:40:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:40:35 localhost podman[92896]: 2025-11-23 08:40:35.029853039 +0000 UTC m=+0.081797267 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:40:35 localhost podman[92895]: 2025-11-23 08:40:35.090158534 +0000 UTC m=+0.144191267 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, vcs-type=git) Nov 23 03:40:35 localhost podman[92896]: 2025-11-23 08:40:35.102405491 +0000 UTC m=+0.154349699 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, 
name=ovn_metadata_agent, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Nov 23 03:40:35 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. Nov 23 03:40:35 localhost podman[92897]: 2025-11-23 08:40:35.150951024 +0000 UTC m=+0.198782216 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, architecture=x86_64, container_name=ovn_controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com) Nov 23 03:40:35 localhost podman[92897]: 2025-11-23 08:40:35.187653383 +0000 UTC m=+0.235484575 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public) Nov 23 03:40:35 localhost podman[92897]: unhealthy Nov 23 03:40:35 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:40:35 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:40:35 localhost podman[92895]: 2025-11-23 08:40:35.320585443 +0000 UTC m=+0.374618186 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:40:35 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:40:36 localhost systemd[1]: tmp-crun.rKTh85.mount: Deactivated successfully. Nov 23 03:40:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:40:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:41:00 localhost podman[93101]: 2025-11-23 08:41:00.033706838 +0000 UTC m=+0.083628433 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:41:00 localhost podman[93101]: 2025-11-23 08:41:00.039275819 +0000 UTC m=+0.089197394 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step3, container_name=iscsid, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 23 03:41:00 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:41:00 localhost podman[93100]: 2025-11-23 08:41:00.078483785 +0000 UTC m=+0.127167152 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container) Nov 23 03:41:00 localhost podman[93100]: 2025-11-23 08:41:00.09128326 +0000 UTC m=+0.139966677 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, 
distribution-scope=public, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:41:00 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:41:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:41:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:41:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:41:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:41:03 localhost systemd[1]: tmp-crun.5NDzvE.mount: Deactivated successfully. Nov 23 03:41:03 localhost systemd[1]: tmp-crun.u0dC1O.mount: Deactivated successfully. Nov 23 03:41:03 localhost podman[93139]: 2025-11-23 08:41:03.09166191 +0000 UTC m=+0.146174348 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron) Nov 23 03:41:03 localhost podman[93139]: 2025-11-23 08:41:03.097754397 +0000 UTC m=+0.152266815 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, 
distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, architecture=x86_64) Nov 23 03:41:03 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:41:03 localhost podman[93140]: 2025-11-23 08:41:03.139733488 +0000 UTC m=+0.192384529 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=ceilometer_agent_compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:41:03 localhost podman[93142]: 2025-11-23 08:41:03.051638219 +0000 UTC m=+0.100416631 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, url=https://www.redhat.com, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, container_name=nova_compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute) Nov 23 03:41:03 localhost podman[93142]: 2025-11-23 08:41:03.180775541 +0000 UTC m=+0.229553943 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 
'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:41:03 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:41:03 localhost podman[93140]: 2025-11-23 08:41:03.191944425 +0000 UTC m=+0.244595466 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, vcs-type=git, 
com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true) Nov 23 03:41:03 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:41:03 localhost podman[93141]: 2025-11-23 08:41:03.186696313 +0000 UTC m=+0.237535568 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com) Nov 23 03:41:03 localhost podman[93141]: 2025-11-23 08:41:03.270334386 +0000 UTC m=+0.321173691 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:12:45Z, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044) Nov 23 03:41:03 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:41:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:41:04 localhost podman[93237]: 2025-11-23 08:41:04.020121411 +0000 UTC m=+0.075664248 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, version=17.1.12) Nov 23 03:41:04 localhost podman[93237]: 2025-11-23 08:41:04.361616727 +0000 UTC m=+0.417159524 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:41:04 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:41:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:41:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:41:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:41:06 localhost systemd[1]: tmp-crun.2KcJ25.mount: Deactivated successfully. Nov 23 03:41:06 localhost podman[93260]: 2025-11-23 08:41:06.037158781 +0000 UTC m=+0.095494459 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, com.redhat.component=openstack-qdrouterd-container) Nov 23 03:41:06 localhost podman[93262]: 2025-11-23 08:41:06.076114389 +0000 UTC m=+0.127755460 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 
'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:41:06 localhost podman[93261]: 2025-11-23 08:41:06.134260578 +0000 UTC m=+0.189316025 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 23 03:41:06 localhost podman[93262]: 2025-11-23 08:41:06.159748442 +0000 UTC m=+0.211389523 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ovn_controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=) Nov 23 03:41:06 localhost podman[93262]: unhealthy Nov 23 03:41:06 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:41:06 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:41:06 localhost podman[93261]: 2025-11-23 08:41:06.207154611 +0000 UTC m=+0.262210068 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Nov 23 03:41:06 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Deactivated successfully. 
Nov 23 03:41:06 localhost podman[93260]: 2025-11-23 08:41:06.265777414 +0000 UTC m=+0.324113022 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, tcib_managed=true) Nov 23 03:41:06 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:41:30 localhost sshd[93338]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:41:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:41:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:41:30 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:41:30 localhost recover_tripleo_nova_virtqemud[93353]: 61756 Nov 23 03:41:30 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:41:30 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:41:30 localhost systemd[1]: tmp-crun.YGdY9t.mount: Deactivated successfully. 
Nov 23 03:41:30 localhost podman[93341]: 2025-11-23 08:41:30.447148049 +0000 UTC m=+0.092934240 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, batch=17.1_20251118.1) Nov 23 03:41:30 localhost systemd[1]: tmp-crun.LzdEvV.mount: Deactivated successfully. Nov 23 03:41:30 localhost podman[93341]: 2025-11-23 08:41:30.487242792 +0000 UTC m=+0.133028933 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, distribution-scope=public, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, vendor=Red Hat, Inc.) Nov 23 03:41:30 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:41:30 localhost podman[93340]: 2025-11-23 08:41:30.491143713 +0000 UTC m=+0.138948746 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 23 03:41:30 localhost podman[93340]: 2025-11-23 08:41:30.571219945 +0000 UTC m=+0.219024988 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:41:30 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:41:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:41:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:41:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:41:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:41:34 localhost podman[93384]: 2025-11-23 08:41:34.033284208 +0000 UTC m=+0.081594621 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
name=rhosp17/openstack-ceilometer-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container) Nov 23 03:41:34 localhost podman[93384]: 2025-11-23 08:41:34.055146201 +0000 UTC m=+0.103456604 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4) Nov 23 03:41:34 localhost systemd[1]: 
6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:41:34 localhost podman[93385]: 2025-11-23 08:41:34.127914839 +0000 UTC m=+0.169355111 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Nov 23 03:41:34 localhost podman[93385]: 2025-11-23 08:41:34.148140611 +0000 UTC m=+0.189580863 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack 
TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z) Nov 23 03:41:34 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:41:34 localhost podman[93389]: 2025-11-23 08:41:34.18610263 +0000 UTC m=+0.224784277 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Nov 23 03:41:34 localhost podman[93389]: 2025-11-23 08:41:34.20271624 +0000 UTC m=+0.241397877 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com) Nov 23 03:41:34 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:41:34 localhost podman[93383]: 2025-11-23 08:41:34.244972851 +0000 UTC m=+0.291412376 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, container_name=logrotate_crond, release=1761123044, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, name=rhosp17/openstack-cron, io.buildah.version=1.41.4) Nov 23 03:41:34 localhost podman[93383]: 2025-11-23 08:41:34.252281395 +0000 UTC m=+0.298720990 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-cron, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, version=17.1.12) Nov 23 03:41:34 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:41:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:41:35 localhost podman[93482]: 2025-11-23 08:41:35.013498873 +0000 UTC m=+0.074919066 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container) Nov 23 03:41:35 localhost systemd[1]: tmp-crun.aQF0qe.mount: Deactivated successfully. Nov 23 03:41:35 localhost podman[93482]: 2025-11-23 08:41:35.393239744 +0000 UTC m=+0.454659977 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, 
com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:41:35 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:41:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:41:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. 
Nov 23 03:41:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:41:37 localhost systemd[1]: tmp-crun.weqJQz.mount: Deactivated successfully. Nov 23 03:41:37 localhost podman[93507]: 2025-11-23 08:41:37.091362864 +0000 UTC m=+0.129352471 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1761123044, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 23 03:41:37 localhost podman[93505]: 2025-11-23 08:41:37.061763793 +0000 UTC m=+0.106388004 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:41:37 localhost podman[93507]: 2025-11-23 08:41:37.136285186 +0000 UTC m=+0.174274813 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, 
io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1761123044, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:41:37 localhost podman[93507]: unhealthy Nov 23 03:41:37 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:41:37 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:41:37 localhost podman[93506]: 2025-11-23 08:41:37.144925111 +0000 UTC m=+0.185250039 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 23 03:41:37 localhost podman[93506]: 2025-11-23 08:41:37.231262267 +0000 UTC m=+0.271587125 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, batch=17.1_20251118.1) Nov 23 03:41:37 localhost podman[93506]: unhealthy Nov 23 03:41:37 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:41:37 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:41:37 localhost podman[93505]: 2025-11-23 08:41:37.312457345 +0000 UTC m=+0.357081576 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T22:49:46Z, release=1761123044, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64) Nov 23 03:41:37 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:41:38 localhost systemd[1]: tmp-crun.i0oLgG.mount: Deactivated successfully. 
Nov 23 03:41:45 localhost sshd[93574]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:42:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 03:42:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 03:42:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:42:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:42:01 localhost systemd[1]: tmp-crun.V3v1P3.mount: Deactivated successfully. 
Nov 23 03:42:01 localhost podman[93637]: 2025-11-23 08:42:01.035671518 +0000 UTC m=+0.088751682 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack 
TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3) Nov 23 03:42:01 localhost podman[93637]: 2025-11-23 08:42:01.045157409 +0000 UTC m=+0.098237593 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:42:01 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:42:01 localhost podman[93638]: 2025-11-23 08:42:01.109910401 +0000 UTC m=+0.163154490 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, 
managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 23 03:42:01 localhost podman[93638]: 2025-11-23 08:42:01.121155777 +0000 UTC m=+0.174399886 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, vcs-type=git, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=) Nov 23 03:42:01 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:42:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 03:42:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 5736 writes, 25K keys, 5736 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 03:42:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:42:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:42:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:42:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:42:05 localhost podman[93695]: 2025-11-23 08:42:05.02203042 +0000 UTC m=+0.068632123 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-type=git) Nov 23 03:42:05 localhost podman[93692]: 2025-11-23 08:42:05.082751388 +0000 UTC m=+0.129000949 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc.) 
Nov 23 03:42:05 localhost podman[93695]: 2025-11-23 08:42:05.098013627 +0000 UTC m=+0.144615380 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:42:05 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:42:05 localhost podman[93692]: 2025-11-23 08:42:05.111269755 +0000 UTC m=+0.157519346 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com) Nov 23 03:42:05 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:42:05 localhost podman[93690]: 2025-11-23 08:42:05.194442093 +0000 UTC m=+0.247133113 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-cron-container) Nov 23 03:42:05 localhost podman[93690]: 2025-11-23 08:42:05.23137462 +0000 UTC m=+0.284065670 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc.) Nov 23 03:42:05 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:42:05 localhost podman[93691]: 2025-11-23 08:42:05.24566737 +0000 UTC m=+0.295162552 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, 
config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container) Nov 23 03:42:05 localhost podman[93691]: 2025-11-23 08:42:05.302384844 +0000 UTC m=+0.351880016 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-type=git, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:42:05 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:42:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:42:06 localhost systemd[1]: tmp-crun.A1BgES.mount: Deactivated successfully. 
Nov 23 03:42:06 localhost podman[93787]: 2025-11-23 08:42:06.022644122 +0000 UTC m=+0.076109113 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, container_name=nova_migration_target, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:42:06 localhost podman[93787]: 2025-11-23 08:42:06.386287318 +0000 UTC m=+0.439752269 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=) Nov 23 03:42:06 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:42:08 localhost systemd[1]: tmp-crun.V1MuL1.mount: Deactivated successfully. 
Nov 23 03:42:08 localhost podman[93810]: 2025-11-23 08:42:08.03511766 +0000 UTC m=+0.086637585 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Nov 23 03:42:08 localhost podman[93809]: 2025-11-23 08:42:08.105533757 +0000 UTC m=+0.156661920 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, config_id=tripleo_step1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:42:08 localhost podman[93810]: 2025-11-23 08:42:08.131542447 +0000 UTC m=+0.183062402 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., 
konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, config_id=tripleo_step4, 
maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 23 03:42:08 localhost podman[93810]: unhealthy Nov 23 03:42:08 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:42:08 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:42:08 localhost podman[93811]: 2025-11-23 08:42:08.198465656 +0000 UTC m=+0.243942005 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 23 03:42:08 localhost podman[93811]: 2025-11-23 08:42:08.241394386 +0000 UTC m=+0.286870745 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:42:08 localhost podman[93811]: unhealthy Nov 23 03:42:08 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:42:08 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:42:08 localhost podman[93809]: 2025-11-23 08:42:08.322646756 +0000 UTC m=+0.373774849 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, release=1761123044, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:42:08 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:42:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:42:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:42:31 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Nov 23 03:42:31 localhost recover_tripleo_nova_virtqemud[93889]: 61756 Nov 23 03:42:31 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:42:31 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:42:32 localhost systemd[1]: tmp-crun.hUTLgH.mount: Deactivated successfully. Nov 23 03:42:32 localhost podman[93877]: 2025-11-23 08:42:32.013926075 +0000 UTC m=+0.072286905 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vcs-type=git, version=17.1.12, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:42:32 localhost podman[93877]: 2025-11-23 08:42:32.023114337 +0000 UTC m=+0.081475217 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64) Nov 23 03:42:32 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:42:32 localhost podman[93876]: 2025-11-23 08:42:32.090683516 +0000 UTC m=+0.147740915 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Nov 23 03:42:32 localhost podman[93876]: 2025-11-23 08:42:32.102226602 +0000 UTC m=+0.159284041 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:42:32 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:42:34 localhost sshd[93915]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:42:34 localhost sshd[93916]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:42:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:42:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:42:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:42:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:42:36 localhost systemd[1]: tmp-crun.6nE2KU.mount: Deactivated successfully. Nov 23 03:42:36 localhost podman[93918]: 2025-11-23 08:42:36.047864949 +0000 UTC m=+0.100718469 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, 
distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 23 03:42:36 localhost podman[93917]: 2025-11-23 08:42:36.103295915 +0000 UTC m=+0.156114074 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 
17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) Nov 23 03:42:36 localhost podman[93917]: 2025-11-23 08:42:36.110157915 +0000 UTC m=+0.162976034 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, release=1761123044, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:42:36 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:42:36 localhost podman[93919]: 2025-11-23 08:42:36.077999827 +0000 UTC m=+0.132176467 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:42:36 localhost podman[93919]: 2025-11-23 08:42:36.158237424 +0000 UTC m=+0.212414004 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, build-date=2025-11-19T00:12:45Z, 
config_id=tripleo_step4) Nov 23 03:42:36 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:42:36 localhost podman[93918]: 2025-11-23 08:42:36.179602912 +0000 UTC m=+0.232456432 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-type=git, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z) Nov 23 03:42:36 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:42:36 localhost podman[93920]: 2025-11-23 08:42:36.233739507 +0000 UTC m=+0.281643725 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:42:36 localhost podman[93920]: 2025-11-23 08:42:36.276339078 +0000 UTC m=+0.324243356 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step5, version=17.1.12, tcib_managed=true, 
maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, name=rhosp17/openstack-nova-compute) Nov 23 03:42:36 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:42:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:42:37 localhost podman[94013]: 2025-11-23 08:42:37.023170572 +0000 UTC m=+0.079869257 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:42:37 localhost podman[94013]: 2025-11-23 08:42:37.431272027 +0000 UTC m=+0.487970652 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Nov 23 03:42:37 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:42:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:42:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:42:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:42:39 localhost podman[94037]: 2025-11-23 08:42:39.026438358 +0000 UTC m=+0.076808473 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Nov 23 03:42:39 localhost podman[94037]: 2025-11-23 08:42:39.044495285 +0000 UTC m=+0.094865350 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64) Nov 23 03:42:39 localhost podman[94037]: unhealthy Nov 23 03:42:39 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:42:39 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:42:39 localhost podman[94036]: 2025-11-23 08:42:39.131408638 +0000 UTC m=+0.185795707 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64) Nov 23 03:42:39 localhost podman[94036]: 2025-11-23 08:42:39.145380448 +0000 UTC m=+0.199767507 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044) Nov 23 03:42:39 localhost podman[94036]: unhealthy Nov 23 03:42:39 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:42:39 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:42:39 localhost podman[94035]: 2025-11-23 08:42:39.234584252 +0000 UTC m=+0.292311234 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-type=git) Nov 23 03:42:39 localhost podman[94035]: 2025-11-23 08:42:39.454366193 +0000 UTC m=+0.512093155 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, version=17.1.12, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Nov 23 03:42:39 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:43:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. 
Nov 23 03:43:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:43:03 localhost systemd[1]: tmp-crun.0Ki2Jy.mount: Deactivated successfully. Nov 23 03:43:03 localhost podman[94104]: 2025-11-23 08:43:03.034011809 +0000 UTC m=+0.087132841 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, container_name=iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:43:03 localhost podman[94104]: 2025-11-23 08:43:03.070245314 +0000 UTC m=+0.123366326 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, container_name=iscsid, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4) Nov 23 03:43:03 localhost podman[94103]: 2025-11-23 08:43:03.089982431 +0000 UTC m=+0.144666062 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3) Nov 23 03:43:03 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:43:03 localhost podman[94103]: 2025-11-23 08:43:03.106249241 +0000 UTC m=+0.160932862 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, release=1761123044, architecture=x86_64, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:43:03 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:43:04 localhost podman[94272]: Nov 23 03:43:04 localhost podman[94272]: 2025-11-23 08:43:04.623166716 +0000 UTC m=+0.073000047 container create 4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_keldysh, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux , name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, release=553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 23 03:43:04 localhost podman[94272]: 2025-11-23 08:43:04.595585098 +0000 UTC m=+0.045418449 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 03:43:04 localhost systemd[1]: Started libpod-conmon-4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2.scope. Nov 23 03:43:04 localhost systemd[1]: Started libcrun container. 
Nov 23 03:43:04 localhost podman[94272]: 2025-11-23 08:43:04.746255803 +0000 UTC m=+0.196089134 container init 4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_keldysh, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, RELEASE=main, release=553, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, name=rhceph, distribution-scope=public, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , version=7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 03:43:04 localhost podman[94272]: 2025-11-23 08:43:04.756967922 +0000 UTC m=+0.206801263 container start 4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_keldysh, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, distribution-scope=public, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.buildah.version=1.33.12, architecture=x86_64, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc.) Nov 23 03:43:04 localhost podman[94272]: 2025-11-23 08:43:04.757388505 +0000 UTC m=+0.207221876 container attach 4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_keldysh, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 03:43:04 localhost stupefied_keldysh[94288]: 167 167 Nov 23 03:43:04 localhost systemd[1]: libpod-4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2.scope: 
Deactivated successfully. Nov 23 03:43:04 localhost podman[94272]: 2025-11-23 08:43:04.761082409 +0000 UTC m=+0.210915760 container died 4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_keldysh, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, RELEASE=main, release=553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, architecture=x86_64) Nov 23 03:43:04 localhost podman[94293]: 2025-11-23 08:43:04.856524745 +0000 UTC m=+0.081982284 container remove 4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_keldysh, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base 
image., release=553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux , ceph=True, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.openshift.tags=rhceph ceph) Nov 23 03:43:04 localhost systemd[1]: libpod-conmon-4e0742e7dc13c432c806c08abbb5e3b0687813acdf1eeb058a7d2aba3d943de2.scope: Deactivated successfully. Nov 23 03:43:05 localhost podman[94314]: Nov 23 03:43:05 localhost podman[94314]: 2025-11-23 08:43:05.069052503 +0000 UTC m=+0.067143777 container create 443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_wing, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, RELEASE=main, vendor=Red Hat, Inc., name=rhceph, distribution-scope=public, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , 
GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 03:43:05 localhost systemd[1]: Started libpod-conmon-443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79.scope. Nov 23 03:43:05 localhost systemd[1]: Started libcrun container. Nov 23 03:43:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4c4c1c09313aa86fa93f58fae073da26b52fe973d8a4aeabe14717e0714870c/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 23 03:43:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4c4c1c09313aa86fa93f58fae073da26b52fe973d8a4aeabe14717e0714870c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 03:43:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4c4c1c09313aa86fa93f58fae073da26b52fe973d8a4aeabe14717e0714870c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 23 03:43:05 localhost podman[94314]: 2025-11-23 08:43:05.134758014 +0000 UTC m=+0.132849298 container init 443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_wing, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, architecture=x86_64, release=553, maintainer=Guillaume Abrioux , version=7, description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, 
vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_BRANCH=main, name=rhceph, io.openshift.expose-services=) Nov 23 03:43:05 localhost podman[94314]: 2025-11-23 08:43:05.037594525 +0000 UTC m=+0.035685839 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 03:43:05 localhost podman[94314]: 2025-11-23 08:43:05.144517835 +0000 UTC m=+0.142609109 container start 443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_wing, build-date=2025-09-24T08:57:55, vcs-type=git, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_CLEAN=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, ceph=True, release=553, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container) Nov 23 03:43:05 localhost podman[94314]: 2025-11-23 08:43:05.144787273 +0000 UTC m=+0.142878617 container attach 443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_wing, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, 
maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, name=rhceph, architecture=x86_64, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, release=553) Nov 23 03:43:05 localhost systemd[1]: var-lib-containers-storage-overlay-5458e3939bc568714f5f971b053d115fd385613127ac957a70b6a71eb4798fc7-merged.mount: Deactivated successfully. 
Nov 23 03:43:06 localhost priceless_wing[94330]: [ Nov 23 03:43:06 localhost priceless_wing[94330]: { Nov 23 03:43:06 localhost priceless_wing[94330]: "available": false, Nov 23 03:43:06 localhost priceless_wing[94330]: "ceph_device": false, Nov 23 03:43:06 localhost priceless_wing[94330]: "device_id": "QEMU_DVD-ROM_QM00001", Nov 23 03:43:06 localhost priceless_wing[94330]: "lsm_data": {}, Nov 23 03:43:06 localhost priceless_wing[94330]: "lvs": [], Nov 23 03:43:06 localhost priceless_wing[94330]: "path": "/dev/sr0", Nov 23 03:43:06 localhost priceless_wing[94330]: "rejected_reasons": [ Nov 23 03:43:06 localhost priceless_wing[94330]: "Insufficient space (<5GB)", Nov 23 03:43:06 localhost priceless_wing[94330]: "Has a FileSystem" Nov 23 03:43:06 localhost priceless_wing[94330]: ], Nov 23 03:43:06 localhost priceless_wing[94330]: "sys_api": { Nov 23 03:43:06 localhost priceless_wing[94330]: "actuators": null, Nov 23 03:43:06 localhost priceless_wing[94330]: "device_nodes": "sr0", Nov 23 03:43:06 localhost priceless_wing[94330]: "human_readable_size": "482.00 KB", Nov 23 03:43:06 localhost priceless_wing[94330]: "id_bus": "ata", Nov 23 03:43:06 localhost priceless_wing[94330]: "model": "QEMU DVD-ROM", Nov 23 03:43:06 localhost priceless_wing[94330]: "nr_requests": "2", Nov 23 03:43:06 localhost priceless_wing[94330]: "partitions": {}, Nov 23 03:43:06 localhost priceless_wing[94330]: "path": "/dev/sr0", Nov 23 03:43:06 localhost priceless_wing[94330]: "removable": "1", Nov 23 03:43:06 localhost priceless_wing[94330]: "rev": "2.5+", Nov 23 03:43:06 localhost priceless_wing[94330]: "ro": "0", Nov 23 03:43:06 localhost priceless_wing[94330]: "rotational": "1", Nov 23 03:43:06 localhost priceless_wing[94330]: "sas_address": "", Nov 23 03:43:06 localhost priceless_wing[94330]: "sas_device_handle": "", Nov 23 03:43:06 localhost priceless_wing[94330]: "scheduler_mode": "mq-deadline", Nov 23 03:43:06 localhost priceless_wing[94330]: "sectors": 0, Nov 23 03:43:06 localhost 
priceless_wing[94330]: "sectorsize": "2048", Nov 23 03:43:06 localhost priceless_wing[94330]: "size": 493568.0, Nov 23 03:43:06 localhost priceless_wing[94330]: "support_discard": "0", Nov 23 03:43:06 localhost priceless_wing[94330]: "type": "disk", Nov 23 03:43:06 localhost priceless_wing[94330]: "vendor": "QEMU" Nov 23 03:43:06 localhost priceless_wing[94330]: } Nov 23 03:43:06 localhost priceless_wing[94330]: } Nov 23 03:43:06 localhost priceless_wing[94330]: ] Nov 23 03:43:06 localhost systemd[1]: libpod-443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79.scope: Deactivated successfully. Nov 23 03:43:06 localhost systemd[1]: libpod-443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79.scope: Consumed 1.041s CPU time. Nov 23 03:43:06 localhost podman[94314]: 2025-11-23 08:43:06.161247911 +0000 UTC m=+1.159339175 container died 443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_wing, vendor=Red Hat, Inc., release=553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , distribution-scope=public, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, 
io.openshift.expose-services=) Nov 23 03:43:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:43:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:43:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:43:06 localhost systemd[1]: var-lib-containers-storage-overlay-a4c4c1c09313aa86fa93f58fae073da26b52fe973d8a4aeabe14717e0714870c-merged.mount: Deactivated successfully. Nov 23 03:43:06 localhost podman[96444]: 2025-11-23 08:43:06.250072814 +0000 UTC m=+0.081083946 container remove 443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_wing, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux , name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 23 03:43:06 localhost systemd[1]: libpod-conmon-443bf8ba85ccc0ce3173dbd6d7a2d97577296bee11bc2bdd354807b11c88ff79.scope: Deactivated successfully. Nov 23 03:43:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:43:06 localhost podman[96454]: 2025-11-23 08:43:06.325831784 +0000 UTC m=+0.129261637 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:43:06 localhost podman[96452]: 2025-11-23 08:43:06.329829278 +0000 UTC m=+0.133015753 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, distribution-scope=public, build-date=2025-11-18T22:49:32Z, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12) Nov 23 03:43:06 localhost podman[96452]: 2025-11-23 08:43:06.335730169 +0000 UTC m=+0.138916614 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, 
summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, release=1761123044) Nov 
23 03:43:06 localhost podman[96454]: 2025-11-23 08:43:06.344951373 +0000 UTC m=+0.148381256 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public) Nov 23 03:43:06 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:43:06 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:43:06 localhost podman[96499]: 2025-11-23 08:43:06.405508526 +0000 UTC m=+0.070576093 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.) Nov 23 03:43:06 localhost podman[96499]: 2025-11-23 08:43:06.425775039 +0000 UTC m=+0.090842606 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, vcs-type=git, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Nov 23 03:43:06 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:43:06 localhost podman[96476]: 2025-11-23 08:43:06.473313882 +0000 UTC m=+0.224838718 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:43:06 localhost podman[96476]: 2025-11-23 08:43:06.491630275 +0000 UTC m=+0.243155021 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=) Nov 23 03:43:06 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:43:06 localhost systemd[1]: tmp-crun.w0MmA0.mount: Deactivated successfully. Nov 23 03:43:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:43:08 localhost systemd[1]: tmp-crun.zoNRhs.mount: Deactivated successfully. 
Nov 23 03:43:08 localhost podman[96571]: 2025-11-23 08:43:08.020976412 +0000 UTC m=+0.079205548 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12) Nov 23 03:43:08 localhost podman[96571]: 2025-11-23 08:43:08.390305553 +0000 UTC m=+0.448534719 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:43:08 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:43:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:43:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:43:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:43:10 localhost podman[96593]: 2025-11-23 08:43:10.030071288 +0000 UTC m=+0.088622188 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=metrics_qdr, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, config_id=tripleo_step1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 23 03:43:10 localhost systemd[1]: tmp-crun.TuYjPP.mount: Deactivated successfully. Nov 23 03:43:10 localhost podman[96600]: 2025-11-23 08:43:10.140818244 +0000 UTC m=+0.186870750 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, release=1761123044, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public) Nov 23 03:43:10 localhost podman[96600]: 2025-11-23 08:43:10.154155025 +0000 UTC m=+0.200207551 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, release=1761123044, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ovn-controller-container) Nov 23 03:43:10 localhost podman[96600]: unhealthy Nov 23 03:43:10 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:43:10 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:43:10 localhost podman[96594]: 2025-11-23 08:43:10.11826808 +0000 UTC m=+0.168210285 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 23 03:43:10 localhost podman[96594]: 2025-11-23 08:43:10.197990912 +0000 UTC m=+0.247933117 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack 
TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4) Nov 23 03:43:10 localhost podman[96594]: unhealthy Nov 23 03:43:10 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:43:10 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:43:10 localhost podman[96593]: 2025-11-23 08:43:10.255741789 +0000 UTC m=+0.314292669 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64) Nov 23 03:43:10 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:43:11 localhost systemd[1]: tmp-crun.r6xsUY.mount: Deactivated successfully. Nov 23 03:43:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:43:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:43:34 localhost podman[96662]: 2025-11-23 08:43:34.044668274 +0000 UTC m=+0.097960395 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, tcib_managed=true, batch=17.1_20251118.1) Nov 23 03:43:34 localhost podman[96662]: 2025-11-23 08:43:34.05625189 +0000 UTC m=+0.109544001 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, container_name=collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 23 03:43:34 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:43:34 localhost podman[96663]: 2025-11-23 08:43:34.143060781 +0000 UTC m=+0.191616636 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 
iscsid, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044) Nov 23 03:43:34 localhost podman[96663]: 2025-11-23 08:43:34.156269187 +0000 UTC m=+0.204825052 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, container_name=iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:43:34 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:43:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:43:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:43:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:43:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:43:37 localhost podman[96701]: 2025-11-23 08:43:37.054262317 +0000 UTC m=+0.097927204 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=logrotate_crond, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:43:37 localhost podman[96701]: 2025-11-23 08:43:37.092620197 +0000 UTC m=+0.136285084 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12) Nov 23 03:43:37 localhost podman[96702]: 2025-11-23 08:43:37.106621088 +0000 UTC m=+0.146334352 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:43:37 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:43:37 localhost podman[96704]: 2025-11-23 08:43:37.16192448 +0000 UTC m=+0.194423313 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public) Nov 23 03:43:37 localhost podman[96702]: 2025-11-23 08:43:37.189329632 +0000 UTC m=+0.229042836 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, 
com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) 
Nov 23 03:43:37 localhost podman[96703]: 2025-11-23 08:43:37.216106726 +0000 UTC m=+0.253288992 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:43:37 localhost podman[96704]: 2025-11-23 08:43:37.219310634 +0000 UTC m=+0.251809527 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:43:37 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:43:37 localhost podman[96703]: 2025-11-23 08:43:37.248340027 +0000 UTC m=+0.285522333 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:43:37 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:43:37 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:43:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:43:39 localhost systemd[1]: tmp-crun.ebH6Rh.mount: Deactivated successfully. 
Nov 23 03:43:39 localhost podman[96802]: 2025-11-23 08:43:39.030348738 +0000 UTC m=+0.087543725 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, 
Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 23 03:43:39 localhost podman[96802]: 2025-11-23 08:43:39.469248329 +0000 UTC m=+0.526443346 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z) Nov 23 03:43:39 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:43:40 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Nov 23 03:43:40 localhost recover_tripleo_nova_virtqemud[96844]: 61756 Nov 23 03:43:40 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:43:40 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:43:41 localhost systemd[1]: tmp-crun.BV2T1r.mount: Deactivated successfully. Nov 23 03:43:41 localhost podman[96827]: 2025-11-23 08:43:41.040188066 +0000 UTC m=+0.083385067 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 23 03:43:41 localhost systemd[1]: tmp-crun.Rx4fON.mount: Deactivated successfully. Nov 23 03:43:41 localhost podman[96827]: 2025-11-23 08:43:41.086417648 +0000 UTC m=+0.129614639 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z) Nov 23 03:43:41 localhost podman[96825]: 2025-11-23 08:43:41.089342988 +0000 UTC m=+0.139812122 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com) Nov 23 03:43:41 localhost podman[96826]: 2025-11-23 08:43:41.124920022 +0000 UTC m=+0.171363113 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, version=17.1.12, distribution-scope=public) Nov 23 03:43:41 localhost podman[96827]: unhealthy Nov 23 03:43:41 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:43:41 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:43:41 localhost podman[96826]: 2025-11-23 08:43:41.191739138 +0000 UTC m=+0.238182249 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1) Nov 23 03:43:41 localhost podman[96826]: unhealthy Nov 23 03:43:41 localhost systemd[1]: 
5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:43:41 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:43:41 localhost podman[96825]: 2025-11-23 08:43:41.260527094 +0000 UTC m=+0.310996208 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
container_name=metrics_qdr, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:43:41 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:43:54 localhost sshd[96895]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:44:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:44:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:44:05 localhost podman[96897]: 2025-11-23 08:44:05.048118945 +0000 UTC m=+0.099027847 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 23 03:44:05 localhost podman[96897]: 2025-11-23 08:44:05.055376509 +0000 UTC m=+0.106285411 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, config_id=tripleo_step3) Nov 23 03:44:05 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:44:05 localhost podman[96898]: 2025-11-23 08:44:05.138453224 +0000 UTC m=+0.189535002 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z) Nov 23 03:44:05 localhost podman[96898]: 2025-11-23 08:44:05.148326088 +0000 UTC m=+0.199407876 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:44:05 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:44:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:44:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:44:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:44:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:44:07 localhost podman[96949]: 2025-11-23 08:44:07.671717865 +0000 UTC m=+0.096007865 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z) Nov 23 03:44:07 localhost podman[96948]: 2025-11-23 08:44:07.726626654 +0000 UTC m=+0.154447452 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:44:07 localhost podman[96949]: 2025-11-23 08:44:07.730303667 +0000 UTC m=+0.154593627 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, 
com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team) Nov 
23 03:44:07 localhost podman[96948]: 2025-11-23 08:44:07.739225612 +0000 UTC m=+0.167046390 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:44:07 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:44:07 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:44:07 localhost podman[96950]: 2025-11-23 08:44:07.836615747 +0000 UTC m=+0.258232164 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:44:07 localhost podman[96952]: 2025-11-23 08:44:07.884469799 +0000 UTC m=+0.302377022 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, 
release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Nov 23 03:44:07 localhost podman[96950]: 2025-11-23 08:44:07.901515015 +0000 UTC m=+0.323131402 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:44:07 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:44:07 localhost podman[96952]: 2025-11-23 08:44:07.915193825 +0000 UTC m=+0.333100998 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 23 03:44:07 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:44:08 localhost systemd[1]: tmp-crun.Lm25ff.mount: Deactivated successfully. Nov 23 03:44:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:44:09 localhost podman[97152]: 2025-11-23 08:44:09.687876127 +0000 UTC m=+0.074923835 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Nov 23 03:44:10 localhost podman[97152]: 2025-11-23 08:44:10.096290832 +0000 UTC m=+0.483338510 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Nov 23 03:44:10 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:44:12 localhost systemd[1]: tmp-crun.hTgjrD.mount: Deactivated successfully. Nov 23 03:44:12 localhost podman[97176]: 2025-11-23 08:44:12.034319661 +0000 UTC m=+0.092328321 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, config_id=tripleo_step1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container) Nov 23 03:44:12 localhost podman[97178]: 2025-11-23 08:44:12.082370599 +0000 UTC m=+0.135590232 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z) Nov 23 03:44:12 localhost podman[97178]: 2025-11-23 08:44:12.096641948 +0000 UTC m=+0.149861601 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T23:34:05Z, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller) Nov 23 03:44:12 localhost podman[97178]: unhealthy Nov 23 03:44:12 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:44:12 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:44:12 localhost podman[97177]: 2025-11-23 08:44:12.180715194 +0000 UTC m=+0.236278329 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 23 03:44:12 localhost podman[97176]: 2025-11-23 08:44:12.215245626 +0000 UTC m=+0.273254336 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:44:12 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:44:12 localhost podman[97177]: 2025-11-23 08:44:12.272977723 +0000 UTC m=+0.328540868 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 23 03:44:12 localhost podman[97177]: unhealthy Nov 23 03:44:12 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:44:12 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:44:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:44:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:44:35 localhost podman[97243]: 2025-11-23 08:44:35.992453089 +0000 UTC m=+0.054204188 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, container_name=collectd, config_id=tripleo_step3, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=) Nov 23 03:44:36 localhost podman[97243]: 2025-11-23 08:44:36.002149288 +0000 UTC m=+0.063900397 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12) Nov 23 03:44:36 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:44:36 localhost podman[97244]: 2025-11-23 08:44:36.058337656 +0000 UTC m=+0.116076232 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, name=rhosp17/openstack-iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 23 03:44:36 localhost podman[97244]: 2025-11-23 08:44:36.097263183 +0000 UTC m=+0.155001789 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat 
OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, tcib_managed=true, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:44:36 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:44:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:44:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:44:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:44:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:44:38 localhost systemd[1]: tmp-crun.MGs51J.mount: Deactivated successfully. 
Nov 23 03:44:38 localhost podman[97291]: 2025-11-23 08:44:38.026349007 +0000 UTC m=+0.072802201 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, distribution-scope=public, config_id=tripleo_step5, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true) Nov 23 03:44:38 localhost podman[97291]: 2025-11-23 08:44:38.076377056 +0000 UTC m=+0.122830320 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=nova_compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:44:38 localhost systemd[1]: tmp-crun.1A097M.mount: Deactivated successfully. Nov 23 03:44:38 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:44:38 localhost podman[97284]: 2025-11-23 08:44:38.088518269 +0000 UTC m=+0.138495561 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, vendor=Red Hat, Inc.) 
Nov 23 03:44:38 localhost podman[97283]: 2025-11-23 08:44:38.150994462 +0000 UTC m=+0.204484702 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.buildah.version=1.41.4, container_name=logrotate_crond, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:44:38 localhost podman[97283]: 2025-11-23 08:44:38.161127823 +0000 UTC m=+0.214618093 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com) Nov 23 03:44:38 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:44:38 localhost podman[97284]: 2025-11-23 08:44:38.172128582 +0000 UTC m=+0.222105874 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute) Nov 23 03:44:38 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:44:38 localhost podman[97285]: 2025-11-23 08:44:38.141606632 +0000 UTC m=+0.187396845 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.12) Nov 23 03:44:38 localhost podman[97285]: 2025-11-23 08:44:38.225508084 +0000 UTC m=+0.271298357 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, 
release=1761123044, vcs-type=git) Nov 23 03:44:38 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:44:39 localhost systemd[1]: tmp-crun.nCzh6z.mount: Deactivated successfully. Nov 23 03:44:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:44:41 localhost systemd[1]: tmp-crun.Zw6zDn.mount: Deactivated successfully. Nov 23 03:44:41 localhost podman[97377]: 2025-11-23 08:44:41.023379024 +0000 UTC m=+0.073659416 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:44:41 localhost podman[97377]: 2025-11-23 08:44:41.389319082 +0000 UTC m=+0.439599504 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, 
release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible) Nov 23 03:44:41 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. 
Nov 23 03:44:42 localhost sshd[97402]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:44:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:44:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:44:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:44:43 localhost podman[97404]: 2025-11-23 08:44:43.032994525 +0000 UTC m=+0.091965161 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:44:43 localhost systemd[1]: tmp-crun.pvjTiP.mount: Deactivated successfully. 
Nov 23 03:44:43 localhost podman[97405]: 2025-11-23 08:44:43.081844368 +0000 UTC m=+0.137579633 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044) Nov 23 03:44:43 localhost podman[97405]: 2025-11-23 08:44:43.126518272 +0000 UTC m=+0.182253557 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc.) Nov 23 03:44:43 localhost podman[97405]: unhealthy Nov 23 03:44:43 localhost podman[97406]: 2025-11-23 08:44:43.138646775 +0000 UTC m=+0.191040507 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller) Nov 23 03:44:43 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:44:43 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:44:43 localhost podman[97406]: 2025-11-23 08:44:43.160372614 +0000 UTC m=+0.212766396 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': 
{'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:44:43 localhost podman[97406]: unhealthy Nov 23 03:44:43 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:44:43 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:44:43 localhost podman[97404]: 2025-11-23 08:44:43.219376109 +0000 UTC m=+0.278346705 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:44:43 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:45:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:45:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:45:06 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:45:06 localhost recover_tripleo_nova_virtqemud[97488]: 61756 Nov 23 03:45:06 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:45:06 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 23 03:45:07 localhost podman[97475]: 2025-11-23 08:45:07.034643863 +0000 UTC m=+0.092943470 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:51:28Z, distribution-scope=public, name=rhosp17/openstack-collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, container_name=collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:45:07 localhost podman[97475]: 2025-11-23 08:45:07.073294012 +0000 UTC m=+0.131593629 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, architecture=x86_64, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:45:07 localhost systemd[1]: tmp-crun.DmpLfx.mount: Deactivated successfully. Nov 23 03:45:07 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:45:07 localhost podman[97476]: 2025-11-23 08:45:07.086947693 +0000 UTC m=+0.141370011 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, container_name=iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:45:07 localhost podman[97476]: 2025-11-23 08:45:07.094293238 +0000 UTC m=+0.148715536 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, version=17.1.12) Nov 23 03:45:07 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:45:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:45:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:45:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:45:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:45:09 localhost podman[97516]: 2025-11-23 08:45:09.018519733 +0000 UTC m=+0.075890786 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, tcib_managed=true, managed_by=tripleo_ansible, 
batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, vcs-type=git, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:45:09 localhost podman[97516]: 2025-11-23 08:45:09.044198312 +0000 UTC m=+0.101569365 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z) Nov 23 03:45:09 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:45:09 localhost podman[97517]: 2025-11-23 08:45:09.079598072 +0000 UTC m=+0.133756616 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-type=git, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:45:09 localhost podman[97515]: 2025-11-23 08:45:09.128114534 +0000 UTC m=+0.187167258 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1761123044, container_name=ceilometer_agent_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute) Nov 23 03:45:09 localhost podman[97517]: 2025-11-23 08:45:09.133279893 +0000 UTC m=+0.187438457 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, tcib_managed=true) Nov 23 03:45:09 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:45:09 localhost podman[97514]: 2025-11-23 08:45:09.180810805 +0000 UTC m=+0.236742693 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., 
com.redhat.component=openstack-cron-container, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:45:09 localhost podman[97515]: 2025-11-23 08:45:09.204609988 +0000 UTC m=+0.263662652 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public) Nov 23 03:45:09 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:45:09 localhost podman[97514]: 2025-11-23 08:45:09.217336039 +0000 UTC m=+0.273268007 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, 
url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc.) Nov 23 03:45:09 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:45:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:45:12 localhost podman[97692]: 2025-11-23 08:45:12.011266308 +0000 UTC m=+0.070160950 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:45:12 localhost podman[97692]: 2025-11-23 08:45:12.397313183 +0000 UTC m=+0.456207785 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, 
managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:45:12 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:45:14 localhost systemd[1]: tmp-crun.ysB0Cv.mount: Deactivated successfully. Nov 23 03:45:14 localhost podman[97718]: 2025-11-23 08:45:14.083317849 +0000 UTC m=+0.140098721 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 23 03:45:14 localhost podman[97718]: 2025-11-23 08:45:14.095541826 +0000 UTC m=+0.152322678 container exec_died 
5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:45:14 localhost podman[97718]: unhealthy Nov 23 03:45:14 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:45:14 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:45:14 localhost podman[97719]: 2025-11-23 08:45:14.013084679 +0000 UTC m=+0.064493005 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, 
name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=) Nov 23 03:45:14 localhost podman[97719]: 2025-11-23 08:45:14.146444152 +0000 UTC m=+0.197852598 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-18T23:34:05Z, 
tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible) Nov 23 03:45:14 localhost podman[97719]: unhealthy Nov 23 03:45:14 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:45:14 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:45:14 localhost podman[97717]: 2025-11-23 08:45:14.048971943 +0000 UTC m=+0.104894119 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=metrics_qdr) Nov 23 03:45:14 localhost podman[97717]: 2025-11-23 08:45:14.260544181 +0000 UTC m=+0.316466367 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:45:14 localhost systemd[1]: 
019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:45:15 localhost systemd[1]: tmp-crun.MkHjYp.mount: Deactivated successfully. Nov 23 03:45:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:45:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:45:38 localhost podman[97787]: 2025-11-23 08:45:38.020646018 +0000 UTC m=+0.071139059 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, architecture=x86_64, container_name=iscsid, com.redhat.component=openstack-iscsid-container) Nov 23 03:45:38 localhost podman[97787]: 2025-11-23 08:45:38.032324417 +0000 UTC m=+0.082817488 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack 
Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, batch=17.1_20251118.1) Nov 23 03:45:38 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:45:38 localhost podman[97786]: 2025-11-23 08:45:38.075665221 +0000 UTC m=+0.126810553 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, container_name=collectd, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.) Nov 23 03:45:38 localhost podman[97786]: 2025-11-23 08:45:38.086201875 +0000 UTC m=+0.137347127 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:45:38 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:45:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:45:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:45:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:45:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:45:40 localhost systemd[1]: tmp-crun.fneATk.mount: Deactivated successfully. Nov 23 03:45:40 localhost podman[97825]: 2025-11-23 08:45:40.03708275 +0000 UTC m=+0.090992511 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, container_name=logrotate_crond, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4) Nov 23 03:45:40 localhost podman[97825]: 2025-11-23 08:45:40.078251866 +0000 UTC m=+0.132161627 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true) Nov 23 03:45:40 localhost systemd[1]: tmp-crun.JetZ4J.mount: Deactivated successfully. Nov 23 03:45:40 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:45:40 localhost podman[97826]: 2025-11-23 08:45:40.133451304 +0000 UTC m=+0.184441125 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 23 03:45:40 localhost podman[97826]: 2025-11-23 08:45:40.158228647 +0000 UTC m=+0.209218428 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=ceilometer_agent_compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:45:40 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:45:40 localhost podman[97827]: 2025-11-23 08:45:40.084076236 +0000 UTC m=+0.133209999 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 23 03:45:40 localhost podman[97827]: 2025-11-23 08:45:40.221334988 +0000 UTC m=+0.270468811 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com) Nov 23 03:45:40 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:45:40 localhost podman[97828]: 2025-11-23 08:45:40.236539116 +0000 UTC m=+0.282216653 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step5, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:45:40 localhost podman[97828]: 2025-11-23 08:45:40.26462626 +0000 UTC m=+0.310303787 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, 
tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:45:40 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:45:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:45:43 localhost podman[97916]: 2025-11-23 08:45:43.010300705 +0000 UTC m=+0.070254472 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:45:43 localhost podman[97916]: 2025-11-23 08:45:43.393297927 +0000 UTC m=+0.453251694 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12) Nov 23 03:45:43 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:45:45 localhost systemd[1]: tmp-crun.n3mz8a.mount: Deactivated successfully. Nov 23 03:45:45 localhost podman[97941]: 2025-11-23 08:45:45.024192328 +0000 UTC m=+0.080610311 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, 
com.redhat.component=openstack-qdrouterd-container, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:45:45 localhost podman[97942]: 2025-11-23 08:45:45.079239191 +0000 UTC m=+0.134189419 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=) Nov 23 03:45:45 localhost podman[97942]: 2025-11-23 08:45:45.090555239 +0000 UTC m=+0.145505497 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc.) Nov 23 03:45:45 localhost podman[97942]: unhealthy Nov 23 03:45:45 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:45:45 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:45:45 localhost systemd[1]: tmp-crun.dScank.mount: Deactivated successfully. 
Nov 23 03:45:45 localhost podman[97943]: 2025-11-23 08:45:45.141123615 +0000 UTC m=+0.192945207 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
release=1761123044, vcs-type=git, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 23 03:45:45 localhost podman[97943]: 2025-11-23 08:45:45.148304675 +0000 UTC m=+0.200126267 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, vcs-type=git) Nov 23 03:45:45 localhost podman[97943]: unhealthy Nov 23 03:45:45 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:45:45 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:45:45 localhost podman[97941]: 2025-11-23 08:45:45.239245463 +0000 UTC m=+0.295663446 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:45:45 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:46:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:46:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:46:09 localhost systemd[1]: tmp-crun.uibBqM.mount: Deactivated successfully. 
Nov 23 03:46:09 localhost podman[98011]: 2025-11-23 08:46:09.019129779 +0000 UTC m=+0.081550980 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=) Nov 23 03:46:09 localhost systemd[1]: tmp-crun.gsRQgT.mount: Deactivated successfully. Nov 23 03:46:09 localhost podman[98011]: 2025-11-23 08:46:09.035174493 +0000 UTC m=+0.097595684 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public) Nov 23 03:46:09 localhost podman[98012]: 2025-11-23 08:46:09.0422195 +0000 UTC m=+0.097889562 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, tcib_managed=true) Nov 23 03:46:09 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:46:09 localhost podman[98012]: 2025-11-23 08:46:09.053169846 +0000 UTC m=+0.108839908 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 23 03:46:09 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:46:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:46:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:46:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:46:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:46:11 localhost systemd[1]: tmp-crun.zvAFiR.mount: Deactivated successfully. 
Nov 23 03:46:11 localhost podman[98052]: 2025-11-23 08:46:11.004855936 +0000 UTC m=+0.063301378 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
container_name=ceilometer_agent_compute, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:46:11 localhost systemd[1]: tmp-crun.PuCOO3.mount: Deactivated successfully. Nov 23 03:46:11 localhost podman[98053]: 2025-11-23 08:46:11.047129426 +0000 UTC m=+0.098634495 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1) Nov 23 03:46:11 localhost podman[98052]: 2025-11-23 08:46:11.06121222 +0000 UTC m=+0.119657702 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, 
com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, version=17.1.12, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vcs-type=git) Nov 23 03:46:11 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:46:11 localhost podman[98064]: 2025-11-23 08:46:11.10151186 +0000 UTC m=+0.147926702 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, version=17.1.12, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 23 03:46:11 localhost podman[98051]: 2025-11-23 08:46:11.065798261 +0000 UTC m=+0.123991765 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:46:11 localhost podman[98064]: 2025-11-23 08:46:11.124198778 +0000 UTC m=+0.170613620 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, config_id=tripleo_step5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:46:11 localhost podman[98053]: 2025-11-23 08:46:11.130176621 +0000 UTC m=+0.181681700 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12) Nov 23 03:46:11 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:46:11 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:46:11 localhost podman[98051]: 2025-11-23 08:46:11.145357648 +0000 UTC m=+0.203551202 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public) Nov 23 03:46:11 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:46:12 localhost podman[98247]: 2025-11-23 08:46:12.215526459 +0000 UTC m=+0.085232992 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=553, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, version=7, maintainer=Guillaume Abrioux , ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, distribution-scope=public, architecture=x86_64) Nov 23 03:46:12 localhost podman[98247]: 2025-11-23 08:46:12.354358441 +0000 UTC m=+0.224064944 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, 
com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, maintainer=Guillaume Abrioux ) Nov 23 03:46:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:46:14 localhost podman[98377]: 2025-11-23 08:46:14.037132838 +0000 UTC m=+0.090592839 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1) Nov 23 03:46:14 localhost podman[98377]: 2025-11-23 08:46:14.434622975 +0000 UTC m=+0.488082936 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, distribution-scope=public, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4) Nov 23 03:46:14 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:46:16 localhost systemd[1]: tmp-crun.LdRzsv.mount: Deactivated successfully. Nov 23 03:46:16 localhost podman[98418]: 2025-11-23 08:46:16.081308642 +0000 UTC m=+0.130725052 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:46:16 localhost podman[98416]: 2025-11-23 08:46:16.049187145 +0000 UTC m=+0.104101464 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=metrics_qdr, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container) Nov 23 03:46:16 localhost podman[98418]: 2025-11-23 08:46:16.118549519 +0000 UTC m=+0.167965899 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, 
managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 23 03:46:16 localhost podman[98418]: unhealthy Nov 23 03:46:16 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main 
process exited, code=exited, status=1/FAILURE Nov 23 03:46:16 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:46:16 localhost podman[98417]: 2025-11-23 08:46:16.185301721 +0000 UTC m=+0.238292311 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1) Nov 23 03:46:16 localhost podman[98417]: 2025-11-23 08:46:16.201331675 +0000 UTC m=+0.254322265 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 23 03:46:16 localhost podman[98417]: unhealthy Nov 23 03:46:16 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:46:16 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:46:16 localhost podman[98416]: 2025-11-23 08:46:16.266215941 +0000 UTC m=+0.321130310 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.buildah.version=1.41.4, container_name=metrics_qdr, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:46:16 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:46:36 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:46:36 localhost recover_tripleo_nova_virtqemud[98480]: 61756 Nov 23 03:46:36 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:46:36 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 23 03:46:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:46:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:46:40 localhost podman[98482]: 2025-11-23 08:46:40.03999585 +0000 UTC m=+0.091509766 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.expose-services=, container_name=iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:46:40 localhost podman[98481]: 2025-11-23 08:46:40.092816665 +0000 UTC m=+0.144407114 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:46:40 localhost podman[98481]: 2025-11-23 08:46:40.104383731 +0000 UTC m=+0.155974220 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, 
name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z) Nov 23 03:46:40 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:46:40 localhost podman[98482]: 2025-11-23 08:46:40.120275829 +0000 UTC m=+0.171789795 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, architecture=x86_64, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 23 03:46:40 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:46:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:46:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:46:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:46:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:46:42 localhost podman[98519]: 2025-11-23 08:46:42.030643607 +0000 UTC m=+0.087230564 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:46:42 localhost podman[98519]: 2025-11-23 08:46:42.042400929 +0000 UTC m=+0.098987946 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, distribution-scope=public, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., tcib_managed=true) Nov 23 03:46:42 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:46:42 localhost podman[98527]: 2025-11-23 08:46:42.089796827 +0000 UTC m=+0.134176069 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 23 03:46:42 localhost podman[98527]: 2025-11-23 08:46:42.147517153 +0000 UTC m=+0.191896405 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, 
container_name=nova_compute, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_id=tripleo_step5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1) Nov 23 03:46:42 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:46:42 localhost podman[98520]: 2025-11-23 08:46:42.127734384 +0000 UTC m=+0.180264276 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 23 03:46:42 localhost podman[98521]: 2025-11-23 08:46:42.153465926 +0000 UTC m=+0.201687316 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:46:42 localhost podman[98521]: 2025-11-23 08:46:42.236523261 +0000 UTC m=+0.284744631 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:46:42 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:46:42 localhost podman[98520]: 2025-11-23 08:46:42.257265339 +0000 UTC m=+0.309795261 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:46:42 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:46:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:46:45 localhost podman[98617]: 2025-11-23 08:46:45.018963716 +0000 UTC m=+0.079013061 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, architecture=x86_64, container_name=nova_migration_target, managed_by=tripleo_ansible, config_id=tripleo_step4) Nov 23 03:46:45 localhost podman[98617]: 2025-11-23 08:46:45.417189227 +0000 UTC m=+0.477238492 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:46:45 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:46:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:46:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:46:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:46:47 localhost podman[98641]: 2025-11-23 08:46:47.023407649 +0000 UTC m=+0.078648711 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, vcs-type=git) Nov 23 03:46:47 localhost podman[98641]: 2025-11-23 08:46:47.03516306 +0000 UTC m=+0.090404202 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:46:47 localhost podman[98641]: unhealthy Nov 23 03:46:47 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:46:47 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:46:47 localhost systemd[1]: tmp-crun.tiC81l.mount: Deactivated successfully. Nov 23 03:46:47 localhost podman[98640]: 2025-11-23 08:46:47.096997683 +0000 UTC m=+0.153848734 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container) Nov 23 03:46:47 localhost podman[98642]: 2025-11-23 08:46:47.099986594 +0000 UTC m=+0.150590393 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T23:34:05Z, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, version=17.1.12) Nov 23 03:46:47 localhost podman[98642]: 2025-11-23 08:46:47.183504634 +0000 UTC m=+0.234108513 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ovn_controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com) Nov 23 03:46:47 localhost podman[98642]: unhealthy Nov 23 03:46:47 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:46:47 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:46:47 localhost podman[98640]: 2025-11-23 08:46:47.29001444 +0000 UTC m=+0.346865541 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, summary=Red 
Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:46:47 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:47:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:47:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:47:11 localhost podman[98711]: 2025-11-23 08:47:11.031881847 +0000 UTC m=+0.087111401 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, container_name=collectd) Nov 23 03:47:11 localhost podman[98711]: 2025-11-23 08:47:11.041540984 +0000 UTC m=+0.096770528 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.) Nov 23 03:47:11 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:47:11 localhost podman[98712]: 2025-11-23 08:47:11.12919215 +0000 UTC m=+0.183082573 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, 
url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:47:11 localhost podman[98712]: 2025-11-23 08:47:11.141368235 +0000 UTC m=+0.195258748 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, name=rhosp17/openstack-iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=iscsid, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:47:11 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:47:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:47:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:47:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:47:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:47:13 localhost systemd[1]: tmp-crun.TdSs2B.mount: Deactivated successfully. 
Nov 23 03:47:13 localhost podman[98751]: 2025-11-23 08:47:13.049408471 +0000 UTC m=+0.091488796 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64) Nov 23 03:47:13 localhost podman[98755]: 2025-11-23 08:47:13.104372692 +0000 UTC m=+0.138835802 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step5, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
container_name=nova_compute) Nov 23 03:47:13 localhost podman[98751]: 2025-11-23 08:47:13.132307311 +0000 UTC m=+0.174387656 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:47:13 localhost podman[98755]: 2025-11-23 08:47:13.142858656 +0000 UTC m=+0.177321786 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step5, vcs-type=git) Nov 23 03:47:13 
localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:47:13 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:47:13 localhost podman[98750]: 2025-11-23 08:47:13.212023954 +0000 UTC m=+0.257256865 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red 
Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com) Nov 23 03:47:13 localhost podman[98750]: 2025-11-23 08:47:13.2492915 +0000 UTC m=+0.294524421 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, container_name=logrotate_crond, io.openshift.expose-services=) Nov 23 03:47:13 localhost podman[98752]: 2025-11-23 08:47:13.261212866 +0000 UTC m=+0.299280207 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, 
url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 23 03:47:13 localhost systemd[1]: 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:47:13 localhost podman[98752]: 2025-11-23 08:47:13.316064314 +0000 UTC m=+0.354131655 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:47:13 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:47:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:47:15 localhost podman[98925]: 2025-11-23 08:47:15.886680224 +0000 UTC m=+0.082728147 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044) Nov 23 03:47:16 localhost podman[98925]: 2025-11-23 08:47:16.238740193 +0000 UTC m=+0.434788136 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=) Nov 23 03:47:16 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:47:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:47:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:47:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:47:18 localhost systemd[1]: tmp-crun.xohBtN.mount: Deactivated successfully. 
Nov 23 03:47:18 localhost podman[98947]: 2025-11-23 08:47:18.055060428 +0000 UTC m=+0.101365779 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, maintainer=OpenStack 
TripleO Team, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, com.redhat.component=openstack-qdrouterd-container) Nov 23 03:47:18 localhost podman[98949]: 2025-11-23 08:47:18.091947673 +0000 UTC m=+0.131275140 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ovn_controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container) Nov 23 03:47:18 localhost podman[98949]: 2025-11-23 08:47:18.112203336 +0000 UTC m=+0.151530763 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64) Nov 23 03:47:18 localhost podman[98949]: unhealthy Nov 23 03:47:18 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:47:18 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:47:18 localhost podman[98948]: 2025-11-23 08:47:18.203912047 +0000 UTC m=+0.247473243 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., release=1761123044, version=17.1.12) Nov 23 03:47:18 localhost podman[98948]: 2025-11-23 08:47:18.217801925 +0000 UTC m=+0.261363081 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4) Nov 23 03:47:18 localhost podman[98948]: unhealthy Nov 23 03:47:18 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:47:18 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:47:18 localhost podman[98947]: 2025-11-23 08:47:18.283523046 +0000 UTC m=+0.329828387 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:47:18 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:47:19 localhost systemd[1]: tmp-crun.lGBhFK.mount: Deactivated successfully. Nov 23 03:47:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:47:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:47:42 localhost podman[99012]: 2025-11-23 08:47:42.029159059 +0000 UTC m=+0.077955009 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, tcib_managed=true, version=17.1.12, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 23 03:47:42 localhost podman[99012]: 2025-11-23 08:47:42.04316979 +0000 UTC m=+0.091965670 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, release=1761123044, config_id=tripleo_step3, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12) Nov 23 03:47:42 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:47:42 localhost systemd[1]: tmp-crun.zS9qzU.mount: Deactivated successfully. 
Nov 23 03:47:42 localhost podman[99011]: 2025-11-23 08:47:42.137078708 +0000 UTC m=+0.186397784 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:47:42 localhost podman[99011]: 2025-11-23 08:47:42.169826356 +0000 UTC m=+0.219145432 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-collectd) Nov 23 03:47:42 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:47:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:47:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:47:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:47:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:47:44 localhost podman[99054]: 2025-11-23 08:47:44.04332517 +0000 UTC m=+0.092583089 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 23 03:47:44 localhost podman[99054]: 2025-11-23 08:47:44.075566971 +0000 UTC m=+0.124824860 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, 
vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:47:44 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:47:44 localhost podman[99055]: 2025-11-23 08:47:44.093181143 +0000 UTC m=+0.139014587 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Nov 23 03:47:44 localhost podman[99055]: 2025-11-23 08:47:44.131178192 +0000 UTC m=+0.177011646 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, container_name=nova_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, config_id=tripleo_step5, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 23 03:47:44 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:47:44 localhost podman[99052]: 2025-11-23 08:47:44.134886056 +0000 UTC m=+0.190274235 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:47:44 localhost podman[99052]: 2025-11-23 08:47:44.219474489 +0000 UTC m=+0.274862678 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO 
Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, container_name=logrotate_crond, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:47:44 localhost systemd[1]: 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:47:44 localhost podman[99053]: 2025-11-23 08:47:44.187326319 +0000 UTC m=+0.239357163 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:47:44 localhost podman[99053]: 2025-11-23 08:47:44.267163436 +0000 UTC m=+0.319194340 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, distribution-scope=public, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4) Nov 23 03:47:44 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:47:45 localhost systemd[1]: tmp-crun.rUNSi3.mount: Deactivated successfully. Nov 23 03:47:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:47:47 localhost podman[99155]: 2025-11-23 08:47:47.002539133 +0000 UTC m=+0.055393305 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:47:47 localhost podman[99155]: 2025-11-23 08:47:47.437245436 +0000 UTC m=+0.490099598 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, 
batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:47:47 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:47:48 localhost sshd[99176]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:47:48 localhost systemd[1]: tmp-crun.sRurOu.mount: Deactivated successfully. Nov 23 03:47:48 localhost podman[99178]: 2025-11-23 08:47:48.831070314 +0000 UTC m=+0.100714830 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:47:48 localhost systemd[1]: tmp-crun.KjCCNX.mount: Deactivated successfully. Nov 23 03:47:48 localhost podman[99179]: 2025-11-23 08:47:48.92977948 +0000 UTC m=+0.192279226 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team) Nov 23 03:47:48 localhost podman[99180]: 2025-11-23 08:47:48.886084595 +0000 UTC m=+0.148905251 container health_status 
99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc.) Nov 23 03:47:48 localhost podman[99179]: 2025-11-23 08:47:48.94309046 +0000 UTC m=+0.205590226 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, tcib_managed=true) Nov 23 03:47:48 localhost podman[99179]: unhealthy Nov 23 03:47:48 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:47:48 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:47:48 localhost podman[99180]: 2025-11-23 08:47:48.969539653 +0000 UTC m=+0.232360309 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., 
tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:47:48 localhost podman[99180]: unhealthy Nov 23 03:47:48 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:47:48 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:47:49 localhost podman[99178]: 2025-11-23 08:47:49.01202503 +0000 UTC m=+0.281669546 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, distribution-scope=public) Nov 23 03:47:49 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:48:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:48:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:48:13 localhost podman[99245]: 2025-11-23 08:48:13.014832642 +0000 UTC m=+0.064857246 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Nov 23 03:48:13 localhost podman[99245]: 2025-11-23 08:48:13.021802207 +0000 UTC m=+0.071826761 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, container_name=iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container) Nov 23 03:48:13 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:48:13 localhost systemd[1]: tmp-crun.UESNvA.mount: Deactivated successfully. 
Nov 23 03:48:13 localhost podman[99244]: 2025-11-23 08:48:13.085359672 +0000 UTC m=+0.136064917 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, distribution-scope=public, version=17.1.12, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z) Nov 23 03:48:13 localhost podman[99244]: 2025-11-23 08:48:13.093646217 +0000 UTC m=+0.144351502 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T22:51:28Z, version=17.1.12, container_name=collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:48:13 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:48:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:48:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:48:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:48:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:48:15 localhost systemd[1]: tmp-crun.JqHYyx.mount: Deactivated successfully. Nov 23 03:48:15 localhost podman[99284]: 2025-11-23 08:48:15.04156183 +0000 UTC m=+0.098097018 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true) Nov 23 03:48:15 localhost podman[99284]: 2025-11-23 08:48:15.047482642 +0000 UTC m=+0.104017860 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12, 
konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, tcib_managed=true, container_name=logrotate_crond, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1) Nov 23 03:48:15 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:48:15 localhost podman[99286]: 2025-11-23 08:48:15.14167637 +0000 UTC m=+0.191426720 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 23 03:48:15 localhost podman[99286]: 2025-11-23 08:48:15.172274602 +0000 UTC m=+0.222024942 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, 
managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 23 03:48:15 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:48:15 localhost podman[99292]: 2025-11-23 08:48:15.188964895 +0000 UTC m=+0.235260418 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public) Nov 23 03:48:15 localhost podman[99292]: 2025-11-23 08:48:15.219236836 +0000 UTC m=+0.265532359 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com) Nov 23 03:48:15 localhost podman[99285]: 2025-11-23 08:48:15.241917773 +0000 UTC m=+0.296584644 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute) Nov 23 03:48:15 localhost podman[99285]: 2025-11-23 08:48:15.276301802 +0000 UTC m=+0.330968653 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:48:15 localhost 
systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:48:15 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:48:16 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:48:16 localhost recover_tripleo_nova_virtqemud[99448]: 61756 Nov 23 03:48:16 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:48:16 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:48:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:48:18 localhost podman[99463]: 2025-11-23 08:48:18.131706092 +0000 UTC m=+0.086854543 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:48:18 localhost podman[99463]: 2025-11-23 08:48:18.502380124 +0000 UTC m=+0.457528625 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step4, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, tcib_managed=true) Nov 23 03:48:18 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:48:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:48:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:48:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:48:20 localhost podman[99491]: 2025-11-23 08:48:20.01444279 +0000 UTC m=+0.068809537 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=ovn_controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
architecture=x86_64, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:48:20 localhost podman[99491]: 2025-11-23 08:48:20.031307276 +0000 UTC m=+0.085673923 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, container_name=ovn_controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 23 03:48:20 localhost podman[99491]: unhealthy Nov 23 03:48:20 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:48:20 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:48:20 localhost podman[99490]: 2025-11-23 08:48:20.079234033 +0000 UTC m=+0.134988342 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=ovn_metadata_agent, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, release=1761123044, vendor=Red Hat, Inc.) 
Nov 23 03:48:20 localhost podman[99489]: 2025-11-23 08:48:20.123160477 +0000 UTC m=+0.181126594 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, managed_by=tripleo_ansible, tcib_managed=true, 
maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr) Nov 23 03:48:20 localhost podman[99490]: 2025-11-23 08:48:20.143383926 +0000 UTC m=+0.199138265 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:48:20 localhost podman[99490]: unhealthy Nov 23 03:48:20 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:48:20 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:48:20 localhost podman[99489]: 2025-11-23 08:48:20.296961336 +0000 UTC m=+0.354927473 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, url=https://www.redhat.com, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:48:20 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:48:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:48:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:48:44 localhost systemd[1]: tmp-crun.073sXJ.mount: Deactivated successfully. 
Nov 23 03:48:44 localhost podman[99561]: 2025-11-23 08:48:44.035531573 +0000 UTC m=+0.088074357 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 23 03:48:44 localhost podman[99561]: 2025-11-23 08:48:44.043139855 +0000 UTC m=+0.095682669 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.expose-services=) Nov 23 03:48:44 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:48:44 localhost systemd[1]: tmp-crun.C4fJ7z.mount: Deactivated successfully. 
Nov 23 03:48:44 localhost podman[99560]: 2025-11-23 08:48:44.13413265 +0000 UTC m=+0.187540211 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-type=git, config_id=tripleo_step3, 
version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:51:28Z, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:48:44 localhost podman[99560]: 2025-11-23 08:48:44.171298617 +0000 UTC m=+0.224706138 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, version=17.1.12) Nov 23 03:48:44 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:48:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:48:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:48:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:48:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:48:46 localhost podman[99603]: 2025-11-23 08:48:46.047029693 +0000 UTC m=+0.087513439 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z) Nov 23 03:48:46 localhost podman[99603]: 2025-11-23 08:48:46.09432574 +0000 UTC m=+0.134809476 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 
17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:48:46 localhost podman[99602]: 2025-11-23 08:48:46.094264158 +0000 UTC m=+0.140473850 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, 
build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:48:46 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:48:46 localhost podman[99609]: 2025-11-23 08:48:46.152452068 +0000 UTC m=+0.191489761 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=nova_compute, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5) Nov 23 03:48:46 localhost podman[99602]: 2025-11-23 08:48:46.179376043 +0000 UTC m=+0.225585755 container exec_died 
6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:48:46 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:48:46 localhost podman[99609]: 2025-11-23 08:48:46.204051328 +0000 UTC m=+0.243089061 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step5, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, 
com.redhat.component=openstack-nova-compute-container) Nov 23 03:48:46 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:48:46 localhost podman[99601]: 2025-11-23 08:48:46.190124732 +0000 UTC m=+0.239120569 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, 
io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:48:46 localhost podman[99601]: 2025-11-23 08:48:46.269595754 +0000 UTC m=+0.318591551 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-cron-container, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1) Nov 23 03:48:46 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:48:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:48:49 localhost podman[99700]: 2025-11-23 08:48:49.026175257 +0000 UTC m=+0.080366720 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:48:49 localhost podman[99700]: 2025-11-23 08:48:49.396340516 +0000 UTC m=+0.450531969 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z, distribution-scope=public, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:48:49 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:48:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:48:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:48:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:48:51 localhost systemd[1]: tmp-crun.9ntNS9.mount: Deactivated successfully. Nov 23 03:48:51 localhost podman[99723]: 2025-11-23 08:48:51.070610256 +0000 UTC m=+0.129769633 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=) Nov 23 03:48:51 localhost podman[99724]: 2025-11-23 08:48:51.019222833 +0000 UTC m=+0.078409030 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 23 03:48:51 localhost podman[99725]: 2025-11-23 08:48:51.040103102 +0000 UTC m=+0.093442581 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, 
architecture=x86_64) Nov 23 03:48:51 localhost podman[99724]: 2025-11-23 08:48:51.104193233 +0000 UTC m=+0.163379410 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:48:51 localhost podman[99724]: unhealthy Nov 23 03:48:51 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:48:51 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:48:51 localhost podman[99725]: 2025-11-23 08:48:51.173963319 +0000 UTC m=+0.227302778 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 23 03:48:51 localhost podman[99725]: unhealthy Nov 23 03:48:51 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:48:51 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:48:51 localhost podman[99723]: 2025-11-23 08:48:51.262174088 +0000 UTC m=+0.321333395 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4) Nov 23 03:48:51 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:48:52 localhost systemd[1]: tmp-crun.36qpjl.mount: Deactivated successfully. Nov 23 03:49:04 localhost systemd-logind[761]: Session 28 logged out. Waiting for processes to exit. Nov 23 03:49:04 localhost systemd[1]: session-28.scope: Deactivated successfully. Nov 23 03:49:04 localhost systemd[1]: session-28.scope: Consumed 6min 57.879s CPU time. Nov 23 03:49:04 localhost systemd-logind[761]: Removed session 28. Nov 23 03:49:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. 
Nov 23 03:49:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:49:14 localhost systemd[1]: Stopping User Manager for UID 1003... Nov 23 03:49:14 localhost systemd[36148]: Activating special unit Exit the Session... Nov 23 03:49:14 localhost systemd[36148]: Removed slice User Background Tasks Slice. Nov 23 03:49:14 localhost systemd[36148]: Stopped target Main User Target. Nov 23 03:49:14 localhost systemd[36148]: Stopped target Basic System. Nov 23 03:49:14 localhost systemd[36148]: Stopped target Paths. Nov 23 03:49:14 localhost systemd[36148]: Stopped target Sockets. Nov 23 03:49:14 localhost systemd[36148]: Stopped target Timers. Nov 23 03:49:14 localhost systemd[36148]: Stopped Mark boot as successful after the user session has run 2 minutes. Nov 23 03:49:14 localhost systemd[36148]: Stopped Daily Cleanup of User's Temporary Directories. Nov 23 03:49:14 localhost systemd[36148]: Closed D-Bus User Message Bus Socket. Nov 23 03:49:14 localhost systemd[36148]: Stopped Create User's Volatile Files and Directories. Nov 23 03:49:14 localhost systemd[36148]: Removed slice User Application Slice. Nov 23 03:49:14 localhost systemd[36148]: Reached target Shutdown. Nov 23 03:49:14 localhost systemd[36148]: Finished Exit the Session. Nov 23 03:49:14 localhost systemd[36148]: Reached target Exit the Session. Nov 23 03:49:14 localhost systemd[1]: user@1003.service: Deactivated successfully. Nov 23 03:49:14 localhost systemd[1]: Stopped User Manager for UID 1003. Nov 23 03:49:14 localhost systemd[1]: user@1003.service: Consumed 4.964s CPU time, read 0B from disk, written 7.0K to disk. Nov 23 03:49:14 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Nov 23 03:49:14 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Nov 23 03:49:14 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. 
Nov 23 03:49:14 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Nov 23 03:49:14 localhost systemd[1]: Removed slice User Slice of UID 1003. Nov 23 03:49:14 localhost systemd[1]: user-1003.slice: Consumed 7min 2.873s CPU time. Nov 23 03:49:14 localhost podman[99792]: 2025-11-23 08:49:14.790427342 +0000 UTC m=+0.090937244 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, container_name=iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:49:14 localhost podman[99792]: 2025-11-23 08:49:14.804655667 +0000 UTC m=+0.105165589 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:49:14 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:49:14 localhost systemd[1]: tmp-crun.HS480i.mount: Deactivated successfully. 
Nov 23 03:49:14 localhost podman[99791]: 2025-11-23 08:49:14.863025894 +0000 UTC m=+0.166647081 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true) Nov 23 03:49:14 localhost podman[99791]: 2025-11-23 08:49:14.877448785 +0000 UTC m=+0.181069982 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step3, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, architecture=x86_64, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:49:14 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:49:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:49:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:49:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:49:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:49:17 localhost systemd[1]: tmp-crun.JErjMx.mount: Deactivated successfully. Nov 23 03:49:17 localhost podman[99832]: 2025-11-23 08:49:17.446138637 +0000 UTC m=+0.501846439 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-cron-container) Nov 23 03:49:17 localhost podman[99833]: 2025-11-23 08:49:17.455170754 +0000 UTC m=+0.505108760 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044) Nov 23 03:49:17 localhost podman[99832]: 2025-11-23 08:49:17.459351732 +0000 UTC m=+0.515059534 container exec_died 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
release=1761123044, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=) Nov 23 03:49:17 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:49:17 localhost podman[99833]: 2025-11-23 08:49:17.489930208 +0000 UTC m=+0.539868184 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:11:48Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:49:17 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:49:17 localhost podman[99840]: 2025-11-23 08:49:17.542050642 +0000 UTC m=+0.584709714 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.openshift.expose-services=, container_name=nova_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:49:17 localhost podman[99834]: 2025-11-23 08:49:17.511307482 +0000 UTC m=+0.557054159 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044) Nov 23 03:49:17 localhost podman[99840]: 2025-11-23 08:49:17.56974225 +0000 UTC m=+0.612401322 container exec_died 
e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:49:17 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:49:17 localhost podman[99834]: 2025-11-23 08:49:17.59128904 +0000 UTC m=+0.637035687 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4) Nov 23 03:49:17 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:49:18 localhost systemd[1]: tmp-crun.jyQgbs.mount: Deactivated successfully. Nov 23 03:49:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:49:19 localhost podman[100003]: 2025-11-23 08:49:19.787703669 +0000 UTC m=+0.094432631 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12) Nov 23 03:49:20 localhost podman[100003]: 2025-11-23 08:49:20.164647816 +0000 UTC m=+0.471376738 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, config_id=tripleo_step4, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Nov 23 03:49:20 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:49:22 localhost systemd[1]: tmp-crun.xjAfqZ.mount: Deactivated successfully. Nov 23 03:49:22 localhost systemd[1]: tmp-crun.0jPBj8.mount: Deactivated successfully. 
Nov 23 03:49:22 localhost podman[100027]: 2025-11-23 08:49:22.043827127 +0000 UTC m=+0.093562504 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044) Nov 23 03:49:22 localhost podman[100026]: 2025-11-23 08:49:22.100051267 +0000 UTC m=+0.152743655 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.openshift.expose-services=, container_name=metrics_qdr, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container) Nov 23 03:49:22 localhost podman[100027]: 2025-11-23 08:49:22.12463694 +0000 UTC m=+0.174372267 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z) Nov 23 03:49:22 localhost podman[100028]: 2025-11-23 08:49:22.072753273 +0000 UTC m=+0.116388014 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, release=1761123044, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 23 03:49:22 localhost podman[100027]: unhealthy Nov 23 03:49:22 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:49:22 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:49:22 localhost podman[100028]: 2025-11-23 08:49:22.210259511 +0000 UTC m=+0.253894262 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, version=17.1.12, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4) Nov 23 03:49:22 localhost podman[100028]: unhealthy Nov 23 03:49:22 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:49:22 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:49:22 localhost podman[100026]: 2025-11-23 08:49:22.268314188 +0000 UTC m=+0.321006586 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=) Nov 23 03:49:22 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:49:36 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:49:36 localhost recover_tripleo_nova_virtqemud[100095]: 61756 Nov 23 03:49:36 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:49:36 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:49:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:49:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:49:45 localhost systemd[1]: tmp-crun.tAHTa1.mount: Deactivated successfully. 
Nov 23 03:49:45 localhost podman[100097]: 2025-11-23 08:49:45.047327746 +0000 UTC m=+0.100059673 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, container_name=iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:49:45 localhost podman[100096]: 2025-11-23 08:49:45.089098954 +0000 UTC m=+0.142180402 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, container_name=collectd, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4) Nov 23 03:49:45 localhost podman[100096]: 2025-11-23 08:49:45.102304158 +0000 UTC m=+0.155385596 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, release=1761123044, tcib_managed=true, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=collectd, vcs-type=git) Nov 23 03:49:45 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:49:45 localhost podman[100097]: 2025-11-23 08:49:45.130192512 +0000 UTC m=+0.182924419 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, 
batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, distribution-scope=public, url=https://www.redhat.com, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4) Nov 23 03:49:45 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:49:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:49:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:49:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:49:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:49:48 localhost systemd[1]: tmp-crun.W8V17T.mount: Deactivated successfully. 
Nov 23 03:49:48 localhost podman[100133]: 2025-11-23 08:49:48.044142332 +0000 UTC m=+0.097122374 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:49:48 localhost systemd[1]: tmp-crun.WykkDF.mount: Deactivated successfully. Nov 23 03:49:48 localhost podman[100132]: 2025-11-23 08:49:48.090550811 +0000 UTC m=+0.146697280 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git) Nov 23 03:49:48 localhost podman[100132]: 2025-11-23 08:49:48.104188539 +0000 UTC m=+0.160334998 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 23 03:49:48 localhost systemd[1]: 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:49:48 localhost podman[100134]: 2025-11-23 08:49:48.120733205 +0000 UTC m=+0.175125401 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=) Nov 23 03:49:48 localhost podman[100134]: 2025-11-23 08:49:48.149227557 +0000 UTC m=+0.203619753 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 23 03:49:48 localhost podman[100135]: 2025-11-23 08:49:48.105368675 +0000 UTC m=+0.149882178 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-nova-compute-container, release=1761123044, url=https://www.redhat.com, tcib_managed=true) Nov 23 03:49:48 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:49:48 localhost podman[100133]: 2025-11-23 08:49:48.172861141 +0000 UTC m=+0.225841143 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64) Nov 23 03:49:48 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:49:48 localhost podman[100135]: 2025-11-23 08:49:48.189403337 +0000 UTC m=+0.233916870 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1) Nov 23 03:49:48 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:49:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:49:51 localhost systemd[1]: tmp-crun.T14XBP.mount: Deactivated successfully. 
Nov 23 03:49:51 localhost podman[100229]: 2025-11-23 08:49:51.029940599 +0000 UTC m=+0.086619391 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:49:51 localhost podman[100229]: 2025-11-23 08:49:51.399435978 +0000 UTC m=+0.456114740 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, tcib_managed=true, vcs-type=git) Nov 23 03:49:51 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:49:53 localhost podman[100252]: 2025-11-23 08:49:53.040743179 +0000 UTC m=+0.100060093 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com) Nov 23 03:49:53 localhost podman[100254]: 2025-11-23 08:49:53.09434074 +0000 UTC m=+0.145554436 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, batch=17.1_20251118.1) Nov 23 03:49:53 localhost podman[100254]: 2025-11-23 08:49:53.109282678 +0000 UTC m=+0.160496394 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:49:53 localhost podman[100254]: unhealthy Nov 23 03:49:53 localhost podman[100253]: 2025-11-23 08:49:53.01820161 +0000 UTC m=+0.076068109 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible) Nov 23 03:49:53 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:49:53 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:49:53 localhost podman[100253]: 2025-11-23 08:49:53.151477309 +0000 UTC m=+0.209343838 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, 
build-date=2025-11-19T00:14:25Z, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Nov 23 03:49:53 localhost podman[100253]: unhealthy Nov 23 03:49:53 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:49:53 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:49:53 localhost podman[100252]: 2025-11-23 08:49:53.271369658 +0000 UTC m=+0.330686582 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 
17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com) Nov 23 03:49:53 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:50:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:50:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:50:16 localhost systemd[1]: tmp-crun.uoNyNc.mount: Deactivated successfully. 
Nov 23 03:50:16 localhost podman[100320]: 2025-11-23 08:50:16.011011979 +0000 UTC m=+0.071242442 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:50:16 localhost podman[100319]: 2025-11-23 08:50:16.072410938 +0000 UTC m=+0.131768693 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z) Nov 23 03:50:16 localhost podman[100319]: 2025-11-23 08:50:16.080854787 +0000 UTC m=+0.140212522 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, 
com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, 
managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:50:16 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:50:16 localhost podman[100320]: 2025-11-23 08:50:16.101692964 +0000 UTC m=+0.161923447 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, container_name=iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:50:16 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:50:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:50:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:50:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:50:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:50:19 localhost podman[100364]: 2025-11-23 08:50:19.035273105 +0000 UTC m=+0.079248956 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, architecture=x86_64, config_id=tripleo_step5, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Nov 23 03:50:19 localhost systemd[1]: tmp-crun.4W3ekX.mount: Deactivated successfully. 
Nov 23 03:50:19 localhost podman[100358]: 2025-11-23 08:50:19.094168088 +0000 UTC m=+0.149240528 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044) Nov 23 03:50:19 localhost podman[100358]: 2025-11-23 08:50:19.105252067 +0000 UTC m=+0.160324517 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.) 
Nov 23 03:50:19 localhost podman[100364]: 2025-11-23 08:50:19.112224071 +0000 UTC m=+0.156199932 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step5, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:50:19 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:50:19 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:50:19 localhost podman[100359]: 2025-11-23 08:50:19.192966072 +0000 UTC m=+0.242938376 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:50:19 localhost podman[100360]: 2025-11-23 08:50:19.250115401 +0000 UTC m=+0.296307850 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1) Nov 23 03:50:19 localhost podman[100359]: 2025-11-23 08:50:19.253276147 +0000 UTC m=+0.303248431 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, build-date=2025-11-19T00:11:48Z, 
konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 23 03:50:19 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:50:19 localhost podman[100360]: 2025-11-23 08:50:19.280244233 +0000 UTC m=+0.326436632 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=) Nov 23 03:50:19 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:50:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:50:22 localhost systemd[1]: tmp-crun.u9rl1F.mount: Deactivated successfully. 
Nov 23 03:50:22 localhost podman[100532]: 2025-11-23 08:50:22.037690213 +0000 UTC m=+0.090475590 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:50:22 localhost podman[100532]: 2025-11-23 08:50:22.416232988 +0000 UTC m=+0.469018305 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:50:22 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:50:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:50:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:50:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:50:24 localhost podman[100555]: 2025-11-23 08:50:24.029421849 +0000 UTC m=+0.085107236 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1) Nov 23 03:50:24 localhost podman[100557]: 2025-11-23 08:50:24.093994025 +0000 UTC m=+0.141389058 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12) Nov 23 03:50:24 localhost podman[100557]: 2025-11-23 08:50:24.108606542 +0000 UTC m=+0.156001595 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container) Nov 23 03:50:24 localhost podman[100557]: unhealthy Nov 23 03:50:24 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:50:24 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:50:24 localhost podman[100556]: 2025-11-23 08:50:24.187201077 +0000 UTC m=+0.236791527 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vcs-type=git, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4) Nov 23 03:50:24 localhost podman[100556]: 2025-11-23 08:50:24.204276229 +0000 UTC m=+0.253866679 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:50:24 localhost podman[100556]: unhealthy Nov 23 03:50:24 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:50:24 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:50:24 localhost podman[100555]: 2025-11-23 08:50:24.265337748 +0000 UTC m=+0.321023175 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:50:24 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:50:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:50:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:50:47 localhost podman[100626]: 2025-11-23 08:50:47.006300143 +0000 UTC m=+0.059877043 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, 
summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=iscsid, version=17.1.12, maintainer=OpenStack TripleO Team) Nov 23 03:50:47 localhost podman[100626]: 2025-11-23 08:50:47.015350339 +0000 UTC m=+0.068927219 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 23 03:50:47 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:50:47 localhost systemd[1]: tmp-crun.xWFXUX.mount: Deactivated successfully. 
Nov 23 03:50:47 localhost podman[100625]: 2025-11-23 08:50:47.07484811 +0000 UTC m=+0.129973178 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, tcib_managed=true, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 23 03:50:47 localhost podman[100625]: 2025-11-23 08:50:47.088281221 +0000 UTC m=+0.143406289 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:51:28Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:50:47 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:50:48 localhost sshd[100665]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:50:49 localhost sshd[100667]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:50:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:50:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:50:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:50:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:50:49 localhost podman[100670]: 2025-11-23 08:50:49.825043709 +0000 UTC m=+0.065923418 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 23 03:50:49 localhost podman[100671]: 2025-11-23 08:50:49.897036673 +0000 UTC m=+0.135859300 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4) Nov 23 03:50:49 localhost podman[100672]: 2025-11-23 08:50:49.855490961 +0000 UTC m=+0.088115657 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step5, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:50:49 localhost podman[100670]: 2025-11-23 08:50:49.91199369 +0000 UTC m=+0.152873409 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, 
distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, version=17.1.12, config_id=tripleo_step4, vcs-type=git, 
tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:50:49 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:50:49 localhost podman[100671]: 2025-11-23 08:50:49.949688814 +0000 UTC m=+0.188511401 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, version=17.1.12, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git) Nov 23 03:50:49 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:50:49 localhost podman[100669]: 2025-11-23 08:50:49.989024498 +0000 UTC m=+0.229268088 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, tcib_managed=true, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:50:50 localhost podman[100669]: 2025-11-23 08:50:50.001160579 +0000 UTC m=+0.241404159 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, 
konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-cron-container, version=17.1.12) Nov 23 03:50:50 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:50:50 localhost podman[100672]: 2025-11-23 08:50:50.041041739 +0000 UTC m=+0.273666455 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute) Nov 23 03:50:50 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:50:50 localhost systemd[1]: tmp-crun.wDQCdP.mount: Deactivated successfully. Nov 23 03:50:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:50:53 localhost systemd[1]: tmp-crun.QezOms.mount: Deactivated successfully. 
Nov 23 03:50:53 localhost podman[100768]: 2025-11-23 08:50:53.039240728 +0000 UTC m=+0.095805534 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true) Nov 23 03:50:53 localhost podman[100768]: 2025-11-23 08:50:53.421399623 +0000 UTC m=+0.477964379 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:50:53 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:50:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:50:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:50:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:50:55 localhost systemd[1]: tmp-crun.d9QnHA.mount: Deactivated successfully. 
Nov 23 03:50:55 localhost podman[100793]: 2025-11-23 08:50:55.031425277 +0000 UTC m=+0.088122498 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, build-date=2025-11-19T00:14:25Z) Nov 23 03:50:55 localhost podman[100793]: 2025-11-23 08:50:55.077825087 +0000 UTC m=+0.134522358 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, version=17.1.12) Nov 23 03:50:55 localhost podman[100793]: unhealthy Nov 23 03:50:55 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:50:55 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:50:55 localhost podman[100794]: 2025-11-23 08:50:55.119056569 +0000 UTC m=+0.171706186 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4) Nov 23 03:50:55 localhost podman[100792]: 2025-11-23 08:50:55.079682144 +0000 UTC m=+0.136518739 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, tcib_managed=true, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git) Nov 23 03:50:55 localhost podman[100794]: 2025-11-23 08:50:55.136792231 +0000 UTC m=+0.189441888 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, maintainer=OpenStack TripleO 
Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 23 03:50:55 localhost podman[100794]: unhealthy Nov 23 03:50:55 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:50:55 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:50:55 localhost podman[100792]: 2025-11-23 08:50:55.247342615 +0000 UTC m=+0.304179220 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:50:55 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:51:11 localhost sshd[100861]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:51:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:51:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:51:18 localhost podman[100863]: 2025-11-23 08:51:18.024494016 +0000 UTC m=+0.081316739 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true) Nov 23 03:51:18 localhost systemd[1]: tmp-crun.Vmfrwa.mount: Deactivated successfully. Nov 23 03:51:18 localhost podman[100864]: 2025-11-23 08:51:18.069108082 +0000 UTC m=+0.123758469 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vendor=Red Hat, Inc.) 
Nov 23 03:51:18 localhost podman[100863]: 2025-11-23 08:51:18.091171877 +0000 UTC m=+0.147994610 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step3, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:51:18 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:51:18 localhost podman[100864]: 2025-11-23 08:51:18.103282898 +0000 UTC m=+0.157933225 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-18T23:44:13Z, tcib_managed=true) Nov 23 03:51:18 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:51:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:51:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. 
Nov 23 03:51:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:51:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:51:21 localhost systemd[1]: tmp-crun.ElOvjL.mount: Deactivated successfully. Nov 23 03:51:21 localhost podman[100903]: 2025-11-23 08:51:21.046815582 +0000 UTC m=+0.098873297 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:51:21 localhost podman[100909]: 2025-11-23 08:51:21.081281417 +0000 UTC m=+0.129420982 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:51:21 localhost podman[100903]: 2025-11-23 08:51:21.095778401 +0000 UTC m=+0.147836116 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git) Nov 23 03:51:21 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:51:21 localhost podman[100902]: 2025-11-23 08:51:21.137688163 +0000 UTC m=+0.192141831 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, version=17.1.12) Nov 23 03:51:21 localhost podman[100909]: 2025-11-23 08:51:21.159029816 +0000 UTC m=+0.207169381 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_compute, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, 
build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 
23 03:51:21 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:51:21 localhost podman[100902]: 2025-11-23 08:51:21.211008247 +0000 UTC m=+0.265461955 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:51:21 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:51:21 localhost podman[100901]: 2025-11-23 08:51:21.010615024 +0000 UTC m=+0.071094007 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, 
config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, batch=17.1_20251118.1) Nov 23 03:51:21 localhost podman[100901]: 2025-11-23 08:51:21.29541462 +0000 UTC m=+0.355893583 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, 
maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron) Nov 23 03:51:21 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:51:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:51:24 localhost systemd[1]: tmp-crun.PXajo2.mount: Deactivated successfully. Nov 23 03:51:24 localhost podman[101074]: 2025-11-23 08:51:24.040419699 +0000 UTC m=+0.094226495 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target) Nov 23 03:51:24 localhost podman[101074]: 2025-11-23 08:51:24.432519079 +0000 UTC m=+0.486325905 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, 
config_id=tripleo_step4, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:51:24 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 03:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:51:25 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:51:25 localhost recover_tripleo_nova_virtqemud[101117]: 61756 Nov 23 03:51:25 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:51:25 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:51:25 localhost podman[101099]: 2025-11-23 08:51:25.984871948 +0000 UTC m=+0.046825564 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, container_name=ovn_metadata_agent, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, build-date=2025-11-19T00:14:25Z, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:51:25 localhost podman[101099]: 2025-11-23 08:51:25.997162684 +0000 UTC m=+0.059116310 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, 
name=ovn_metadata_agent, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:51:26 localhost podman[101099]: unhealthy Nov 23 03:51:26 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:51:26 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:51:26 localhost podman[101098]: 2025-11-23 08:51:26.067971102 +0000 UTC m=+0.127928447 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=) Nov 23 03:51:26 localhost podman[101100]: 2025-11-23 08:51:26.040516711 +0000 UTC m=+0.097864846 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 23 03:51:26 localhost podman[101100]: 2025-11-23 08:51:26.125393479 +0000 UTC m=+0.182741644 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, container_name=ovn_controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ovn-controller-container) Nov 23 03:51:26 localhost podman[101100]: unhealthy Nov 23 03:51:26 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:51:26 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:51:26 localhost podman[101098]: 2025-11-23 08:51:26.226226665 +0000 UTC m=+0.286183960 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 23 03:51:26 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:51:26 localhost systemd[1]: tmp-crun.0sb7eT.mount: Deactivated successfully. Nov 23 03:51:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:51:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:51:49 localhost podman[101169]: 2025-11-23 08:51:49.03519269 +0000 UTC m=+0.092536303 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public) Nov 23 03:51:49 localhost podman[101169]: 2025-11-23 08:51:49.046775895 +0000 UTC m=+0.104119518 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Nov 23 03:51:49 localhost podman[101170]: 2025-11-23 08:51:49.086521661 +0000 UTC m=+0.141126200 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, 
build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.) Nov 23 03:51:49 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:51:49 localhost podman[101170]: 2025-11-23 08:51:49.124274227 +0000 UTC m=+0.178878756 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 
2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:51:49 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:51:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:51:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:51:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:51:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:51:52 localhost systemd[1]: tmp-crun.oIZQSo.mount: Deactivated successfully. 
Nov 23 03:51:52 localhost podman[101210]: 2025-11-23 08:51:52.023101574 +0000 UTC m=+0.073187950 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, container_name=ceilometer_agent_ipmi, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi) Nov 23 03:51:52 localhost podman[101212]: 2025-11-23 08:51:52.084378 +0000 UTC m=+0.129411442 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4) Nov 23 03:51:52 localhost podman[101212]: 2025-11-23 08:51:52.136263057 +0000 UTC m=+0.181296489 container 
exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:51:52 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:51:52 localhost podman[101209]: 2025-11-23 08:51:52.052009419 +0000 UTC m=+0.103939942 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:51:52 localhost podman[101210]: 2025-11-23 08:51:52.161632334 +0000 UTC m=+0.211718700 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044) Nov 23 03:51:52 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:51:52 localhost podman[101209]: 2025-11-23 08:51:52.186370311 +0000 UTC m=+0.238300814 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public) Nov 23 03:51:52 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:51:52 localhost podman[101208]: 2025-11-23 08:51:52.13762655 +0000 UTC m=+0.192127432 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-type=git, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4) Nov 23 03:51:52 localhost podman[101208]: 2025-11-23 08:51:52.267931787 +0000 UTC m=+0.322432679 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, summary=Red 
Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, architecture=x86_64, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.) 
Nov 23 03:51:52 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:51:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:51:55 localhost podman[101302]: 2025-11-23 08:51:55.019234768 +0000 UTC m=+0.074246233 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target) Nov 23 03:51:55 localhost podman[101302]: 2025-11-23 08:51:55.383982401 +0000 UTC m=+0.438993856 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, container_name=nova_migration_target, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container) Nov 23 03:51:55 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. 
Nov 23 03:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:51:57 localhost systemd[1]: tmp-crun.sEjoxg.mount: Deactivated successfully. Nov 23 03:51:57 localhost podman[101325]: 2025-11-23 08:51:57.022392274 +0000 UTC m=+0.082209707 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc.) Nov 23 03:51:57 localhost podman[101327]: 2025-11-23 08:51:57.03339368 +0000 UTC m=+0.086567630 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, container_name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4) Nov 23 03:51:57 localhost podman[101326]: 2025-11-23 08:51:57.072790136 +0000 UTC m=+0.129267607 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=ovn_metadata_agent, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:51:57 localhost podman[101326]: 2025-11-23 08:51:57.089227019 +0000 UTC m=+0.145704510 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, config_id=tripleo_step4, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, tcib_managed=true) Nov 23 03:51:57 localhost podman[101326]: unhealthy Nov 23 03:51:57 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:51:57 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:51:57 localhost podman[101327]: 2025-11-23 08:51:57.126415817 +0000 UTC m=+0.179589787 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc.) Nov 23 03:51:57 localhost podman[101327]: unhealthy Nov 23 03:51:57 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:51:57 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:51:57 localhost podman[101325]: 2025-11-23 08:51:57.219028582 +0000 UTC m=+0.278846015 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:51:57 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:52:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 03:52:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.1 total, 600.0 interval#012Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 03:52:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 03:52:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.1 total, 600.0 interval#012Cumulative writes: 5736 writes, 25K keys, 5736 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 03:52:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:52:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:52:20 localhost systemd[1]: tmp-crun.FpXuKl.mount: Deactivated successfully. 
Nov 23 03:52:20 localhost podman[101391]: 2025-11-23 08:52:20.027606165 +0000 UTC m=+0.081181016 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 23 03:52:20 localhost podman[101391]: 2025-11-23 08:52:20.060836932 +0000 UTC m=+0.114411773 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4) Nov 23 03:52:20 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:52:20 localhost systemd[1]: tmp-crun.po3ACC.mount: Deactivated successfully. 
Nov 23 03:52:20 localhost podman[101392]: 2025-11-23 08:52:20.141208612 +0000 UTC m=+0.191232133 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, 
managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 23 03:52:20 localhost podman[101392]: 2025-11-23 08:52:20.152271011 +0000 UTC m=+0.202294502 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid) Nov 23 03:52:20 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:52:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:52:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:52:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:52:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:52:23 localhost systemd[1]: tmp-crun.WoQQbs.mount: Deactivated successfully. 
Nov 23 03:52:23 localhost podman[101433]: 2025-11-23 08:52:23.031573701 +0000 UTC m=+0.083786985 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:52:23 localhost podman[101431]: 2025-11-23 08:52:23.014941712 +0000 UTC m=+0.073926074 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=ceilometer_agent_compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git) Nov 23 03:52:23 localhost podman[101429]: 2025-11-23 08:52:23.069047768 +0000 UTC m=+0.128645929 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T22:49:32Z, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, 
maintainer=OpenStack TripleO Team) Nov 23 03:52:23 localhost podman[101433]: 2025-11-23 08:52:23.081253551 +0000 UTC m=+0.133466845 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com) Nov 23 03:52:23 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:52:23 localhost podman[101431]: 2025-11-23 08:52:23.098100796 +0000 UTC m=+0.157085118 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4) Nov 23 03:52:23 localhost podman[101429]: 2025-11-23 08:52:23.104263276 +0000 UTC m=+0.163861397 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-cron-container) 
Nov 23 03:52:23 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:52:23 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:52:23 localhost podman[101435]: 2025-11-23 08:52:23.163013694 +0000 UTC m=+0.216567560 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.expose-services=, release=1761123044, architecture=x86_64) Nov 23 03:52:23 localhost podman[101435]: 2025-11-23 08:52:23.186682798 +0000 UTC m=+0.240236664 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 
17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team) Nov 23 03:52:23 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:52:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:52:26 localhost podman[101603]: 2025-11-23 08:52:26.028906463 +0000 UTC m=+0.083929600 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:52:26 localhost podman[101603]: 2025-11-23 08:52:26.404722834 +0000 UTC m=+0.459745931 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Nov 23 03:52:26 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:52:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:52:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:52:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:52:28 localhost podman[101627]: 2025-11-23 08:52:28.02788078 +0000 UTC m=+0.084563169 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, build-date=2025-11-18T22:49:46Z, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd) Nov 23 03:52:28 localhost systemd[1]: tmp-crun.AcLKWd.mount: Deactivated successfully. Nov 23 03:52:28 localhost podman[101628]: 2025-11-23 08:52:28.088080663 +0000 UTC m=+0.141675497 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, version=17.1.12, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 23 03:52:28 localhost podman[101628]: 2025-11-23 08:52:28.100781791 +0000 UTC m=+0.154376605 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Nov 23 03:52:28 localhost podman[101628]: unhealthy Nov 23 03:52:28 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:52:28 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:52:28 localhost systemd[1]: tmp-crun.12mAer.mount: Deactivated successfully. 
Nov 23 03:52:28 localhost podman[101629]: 2025-11-23 08:52:28.192924561 +0000 UTC m=+0.243326058 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, container_name=ovn_controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:52:28 localhost podman[101627]: 2025-11-23 08:52:28.234272477 +0000 UTC m=+0.290954876 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, release=1761123044, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 
17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:52:28 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:52:28 localhost podman[101629]: 2025-11-23 08:52:28.285303919 +0000 UTC m=+0.335705416 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller) Nov 23 03:52:28 localhost podman[101629]: unhealthy Nov 23 03:52:28 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:52:28 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:52:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:52:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:52:51 localhost podman[101695]: 2025-11-23 08:52:51.02709473 +0000 UTC m=+0.079425961 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 
iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12) Nov 23 03:52:51 localhost podman[101695]: 2025-11-23 08:52:51.061319027 +0000 UTC m=+0.113650268 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., architecture=x86_64, container_name=iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 23 03:52:51 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:52:51 localhost podman[101694]: 2025-11-23 08:52:51.068855488 +0000 UTC m=+0.123540231 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:52:51 localhost podman[101694]: 2025-11-23 08:52:51.148751623 +0000 UTC m=+0.203436396 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd) Nov 23 03:52:51 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:52:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:52:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:52:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:52:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:52:54 localhost systemd[1]: tmp-crun.7ipEdw.mount: Deactivated successfully. Nov 23 03:52:54 localhost podman[101734]: 2025-11-23 08:52:54.04404153 +0000 UTC m=+0.095810983 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute) Nov 23 03:52:54 localhost systemd[1]: tmp-crun.Cd8cve.mount: Deactivated successfully. 
Nov 23 03:52:54 localhost podman[101735]: 2025-11-23 08:52:54.090251995 +0000 UTC m=+0.138539422 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, version=17.1.12, 
build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:52:54 localhost podman[101734]: 2025-11-23 08:52:54.143513685 +0000 UTC m=+0.195283128 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, release=1761123044, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, build-date=2025-11-19T00:11:48Z) Nov 23 03:52:54 localhost podman[101733]: 2025-11-23 08:52:54.14401251 +0000 UTC m=+0.198271999 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, architecture=x86_64) Nov 23 03:52:54 localhost podman[101735]: 2025-11-23 08:52:54.148258269 +0000 UTC m=+0.196545346 container exec_died 
d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible) Nov 23 03:52:54 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:52:54 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:52:54 localhost podman[101733]: 2025-11-23 08:52:54.23026793 +0000 UTC m=+0.284527379 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:52:54 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:52:54 localhost podman[101740]: 2025-11-23 08:52:54.201308763 +0000 UTC m=+0.243973618 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, container_name=nova_compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:52:54 localhost podman[101740]: 2025-11-23 08:52:54.285242882 +0000 UTC m=+0.327907707 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, release=1761123044, container_name=nova_compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com) Nov 23 03:52:54 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:52:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:52:56 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:52:56 localhost recover_tripleo_nova_virtqemud[101833]: 61756 Nov 23 03:52:56 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:52:56 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 23 03:52:57 localhost podman[101826]: 2025-11-23 08:52:57.031261292 +0000 UTC m=+0.075128980 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044) Nov 23 03:52:57 localhost podman[101826]: 2025-11-23 08:52:57.370565046 +0000 UTC m=+0.414432744 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, 
com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:52:57 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:52:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:52:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:52:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:52:59 localhost podman[101851]: 2025-11-23 08:52:59.01131367 +0000 UTC m=+0.064772343 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, version=17.1.12) Nov 23 03:52:59 localhost podman[101853]: 2025-11-23 08:52:59.082111487 +0000 UTC m=+0.131003700 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
container_name=ovn_controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container) Nov 23 03:52:59 localhost podman[101852]: 2025-11-23 08:52:59.042097683 +0000 UTC m=+0.090161761 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, 
architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:52:59 localhost podman[101853]: 2025-11-23 08:52:59.09916289 +0000 UTC m=+0.148055113 container exec_died 
99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ovn_controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, vcs-type=git, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Nov 23 03:52:59 localhost podman[101853]: unhealthy Nov 23 03:52:59 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:52:59 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:52:59 localhost podman[101852]: 2025-11-23 08:52:59.12531351 +0000 UTC m=+0.173377628 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:52:59 localhost podman[101852]: unhealthy Nov 23 03:52:59 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:52:59 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:52:59 localhost podman[101851]: 2025-11-23 08:52:59.193041112 +0000 UTC m=+0.246499835 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, io.buildah.version=1.41.4, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T22:49:46Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:52:59 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:53:00 localhost systemd[1]: tmp-crun.jSEyR9.mount: Deactivated successfully. Nov 23 03:53:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:53:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:53:22 localhost podman[101915]: 2025-11-23 08:53:22.023438533 +0000 UTC m=+0.076451992 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-iscsid, container_name=iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) Nov 23 03:53:22 localhost podman[101915]: 2025-11-23 08:53:22.058189246 +0000 UTC m=+0.111202685 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:53:22 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:53:22 localhost podman[101914]: 2025-11-23 08:53:22.076716283 +0000 UTC m=+0.129458192 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd) Nov 23 03:53:22 localhost podman[101914]: 2025-11-23 08:53:22.089264397 +0000 UTC m=+0.142006296 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, container_name=collectd, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 23 03:53:22 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:53:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:53:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:53:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:53:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:53:25 localhost podman[101954]: 2025-11-23 08:53:25.031309097 +0000 UTC m=+0.086102385 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container) Nov 23 03:53:25 localhost podman[101954]: 2025-11-23 08:53:25.062369779 +0000 UTC m=+0.117163037 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:53:25 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated 
successfully. Nov 23 03:53:25 localhost podman[101955]: 2025-11-23 08:53:25.076926134 +0000 UTC m=+0.128626218 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 23 03:53:25 localhost podman[101953]: 2025-11-23 08:53:25.136159467 +0000 UTC m=+0.191849383 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 23 03:53:25 localhost podman[101953]: 2025-11-23 08:53:25.169282811 +0000 UTC m=+0.224972717 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=logrotate_crond, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z) Nov 23 03:53:25 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:53:25 localhost podman[101956]: 2025-11-23 08:53:25.181743642 +0000 UTC m=+0.229542307 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z) Nov 23 03:53:25 localhost podman[101956]: 2025-11-23 08:53:25.20521516 +0000 UTC m=+0.253013875 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:53:25 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:53:25 localhost podman[101955]: 2025-11-23 08:53:25.260132511 +0000 UTC m=+0.311832645 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 23 03:53:25 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:53:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:53:28 localhost systemd[1]: tmp-crun.2SBTFq.mount: Deactivated successfully. 
Nov 23 03:53:28 localhost podman[102114]: 2025-11-23 08:53:28.031817786 +0000 UTC m=+0.086839269 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container) Nov 23 03:53:28 localhost podman[102114]: 2025-11-23 08:53:28.412525837 +0000 UTC m=+0.467547330 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:53:28 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:53:30 localhost systemd[1]: tmp-crun.VOayAo.mount: Deactivated successfully. 
Nov 23 03:53:30 localhost podman[102153]: 2025-11-23 08:53:30.0276918 +0000 UTC m=+0.084837387 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64) Nov 23 03:53:30 localhost podman[102153]: 2025-11-23 08:53:30.045255068 +0000 UTC m=+0.102400645 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, version=17.1.12, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:53:30 localhost podman[102153]: unhealthy Nov 23 03:53:30 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:53:30 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:53:30 localhost systemd[1]: tmp-crun.zanDLi.mount: Deactivated successfully. Nov 23 03:53:30 localhost podman[102152]: 2025-11-23 08:53:30.137219282 +0000 UTC m=+0.195720461 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4) Nov 23 03:53:30 localhost podman[102154]: 2025-11-23 08:53:30.183349405 +0000 UTC m=+0.236895812 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public) Nov 23 03:53:30 localhost podman[102154]: 2025-11-23 08:53:30.200339355 +0000 UTC m=+0.253885762 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 23 03:53:30 localhost podman[102154]: unhealthy Nov 23 03:53:30 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:53:30 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:53:30 localhost podman[102152]: 2025-11-23 08:53:30.332422017 +0000 UTC m=+0.390923196 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 23 03:53:30 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:53:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:53:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:53:53 localhost podman[102221]: 2025-11-23 08:53:53.037247397 +0000 UTC m=+0.087784049 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 23 03:53:53 localhost systemd[1]: tmp-crun.z2TnJa.mount: Deactivated successfully. Nov 23 03:53:53 localhost podman[102222]: 2025-11-23 08:53:53.084965177 +0000 UTC m=+0.133288391 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:53:53 localhost podman[102222]: 2025-11-23 08:53:53.095306903 +0000 UTC m=+0.143630107 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible) Nov 23 03:53:53 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:53:53 localhost podman[102221]: 2025-11-23 08:53:53.147267403 +0000 UTC m=+0.197804005 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:53:53 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:53:54 localhost sshd[102261]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:53:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:53:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:53:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:53:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:53:56 localhost systemd[1]: tmp-crun.DgKjyx.mount: Deactivated successfully. 
Nov 23 03:53:56 localhost podman[102264]: 2025-11-23 08:53:56.033762363 +0000 UTC m=+0.085234919 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:53:56 localhost podman[102264]: 2025-11-23 08:53:56.088607802 +0000 UTC m=+0.140080318 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4) Nov 23 03:53:56 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:53:56 localhost podman[102265]: 2025-11-23 08:53:56.110286896 +0000 UTC m=+0.153475068 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.12, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:53:56 localhost podman[102273]: 2025-11-23 08:53:56.075080688 +0000 UTC m=+0.116368863 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, tcib_managed=true) Nov 23 03:53:56 localhost podman[102265]: 2025-11-23 08:53:56.140328815 +0000 UTC 
m=+0.183517007 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:53:56 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:53:56 localhost podman[102263]: 2025-11-23 08:53:56.18427063 +0000 UTC m=+0.234558380 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 23 03:53:56 localhost podman[102273]: 2025-11-23 08:53:56.211170283 +0000 UTC m=+0.252458458 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true) Nov 23 03:53:56 localhost podman[102263]: 2025-11-23 08:53:56.222372436 +0000 UTC m=+0.272660206 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:53:56 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:53:56 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:53:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:53:58 localhost podman[102363]: 2025-11-23 08:53:58.996447755 +0000 UTC m=+0.056072087 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, release=1761123044, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:53:59 localhost podman[102363]: 2025-11-23 08:53:59.36421343 +0000 UTC m=+0.423837762 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:53:59 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:54:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:54:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:54:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:54:00 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Nov 23 03:54:00 localhost recover_tripleo_nova_virtqemud[102400]: 61756 Nov 23 03:54:00 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:54:00 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:54:01 localhost podman[102387]: 2025-11-23 08:54:01.041521494 +0000 UTC m=+0.085866959 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, tcib_managed=true, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 23 03:54:01 localhost podman[102386]: 2025-11-23 08:54:01.09042015 +0000 UTC m=+0.135076835 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, distribution-scope=public, io.buildah.version=1.41.4, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:54:01 localhost podman[102387]: 2025-11-23 
08:54:01.108592117 +0000 UTC m=+0.152937642 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, architecture=x86_64, 
config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:54:01 localhost podman[102388]: 2025-11-23 08:54:01.147818757 +0000 UTC m=+0.185521518 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., version=17.1.12) Nov 23 03:54:01 localhost podman[102387]: unhealthy Nov 23 03:54:01 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:54:01 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:54:01 localhost podman[102388]: 2025-11-23 08:54:01.183684775 +0000 UTC m=+0.221387526 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:54:01 localhost podman[102388]: unhealthy Nov 23 03:54:01 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:54:01 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:54:01 localhost podman[102386]: 2025-11-23 08:54:01.301121898 +0000 UTC m=+0.345778623 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 23 03:54:01 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:54:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:54:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:54:24 localhost podman[102453]: 2025-11-23 08:54:24.027209087 +0000 UTC m=+0.082944600 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:54:24 localhost podman[102453]: 2025-11-23 08:54:24.066406557 +0000 UTC m=+0.122142070 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, architecture=x86_64, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, container_name=collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git) Nov 23 03:54:24 localhost podman[102454]: 2025-11-23 08:54:24.077932649 +0000 UTC m=+0.131862346 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, 
io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid) Nov 23 03:54:24 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:54:24 localhost podman[102454]: 2025-11-23 08:54:24.117284814 +0000 UTC m=+0.171214491 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, distribution-scope=public) Nov 23 03:54:24 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:54:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:54:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:54:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:54:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:54:27 localhost systemd[1]: tmp-crun.8KQFLh.mount: Deactivated successfully. 
Nov 23 03:54:27 localhost podman[102494]: 2025-11-23 08:54:27.025961743 +0000 UTC m=+0.078619927 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, version=17.1.12) Nov 23 03:54:27 localhost podman[102494]: 2025-11-23 08:54:27.051132853 +0000 UTC m=+0.103791057 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public) Nov 23 03:54:27 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:54:27 localhost podman[102495]: 2025-11-23 08:54:27.099995209 +0000 UTC m=+0.146362030 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Nov 23 03:54:27 localhost podman[102492]: 2025-11-23 08:54:27.182944467 +0000 UTC m=+0.237317263 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, container_name=logrotate_crond, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:32Z) Nov 23 03:54:27 localhost podman[102492]: 2025-11-23 08:54:27.189510869 +0000 UTC m=+0.243883685 container exec_died 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, container_name=logrotate_crond, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, 
description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:54:27 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:54:27 localhost podman[102493]: 2025-11-23 08:54:27.236145225 +0000 UTC m=+0.288156149 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ceilometer_agent_compute, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible) Nov 23 03:54:27 localhost podman[102495]: 2025-11-23 08:54:27.254277441 +0000 UTC m=+0.300644252 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, 
name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true) Nov 23 03:54:27 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:54:27 localhost podman[102493]: 2025-11-23 08:54:27.284257679 +0000 UTC m=+0.336268593 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, vcs-type=git, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64) Nov 23 03:54:27 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:54:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:54:30 localhost podman[102706]: 2025-11-23 08:54:30.036118698 +0000 UTC m=+0.090300295 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack 
Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z) Nov 23 03:54:30 localhost podman[102706]: 2025-11-23 08:54:30.452308044 +0000 UTC m=+0.506489651 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:54:30 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:54:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:54:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:54:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:54:32 localhost podman[102745]: 2025-11-23 08:54:32.045965636 +0000 UTC m=+0.098457124 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z) Nov 23 03:54:32 localhost podman[102746]: 2025-11-23 08:54:32.092927934 +0000 UTC m=+0.142813212 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, url=https://www.redhat.com, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z) Nov 23 03:54:32 localhost podman[102744]: 2025-11-23 08:54:32.006078846 +0000 UTC m=+0.066229048 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, container_name=metrics_qdr, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container) Nov 23 03:54:32 localhost podman[102745]: 2025-11-23 08:54:32.111456721 +0000 UTC 
m=+0.163948129 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Nov 23 03:54:32 localhost podman[102745]: unhealthy Nov 23 03:54:32 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:54:32 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:54:32 localhost podman[102746]: 2025-11-23 08:54:32.130207424 +0000 UTC m=+0.180092722 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_id=tripleo_step4, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1) Nov 23 03:54:32 localhost podman[102746]: unhealthy Nov 23 03:54:32 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:54:32 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:54:32 localhost podman[102744]: 2025-11-23 08:54:32.238919432 +0000 UTC m=+0.299069704 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, distribution-scope=public, 
container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.) Nov 23 03:54:32 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:54:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:54:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:54:55 localhost podman[102813]: 2025-11-23 08:54:55.028040828 +0000 UTC m=+0.080255958 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible) Nov 23 03:54:55 localhost podman[102813]: 2025-11-23 08:54:55.041337345 +0000 UTC m=+0.093552545 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:54:55 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:54:55 localhost systemd[1]: tmp-crun.Y6wc9B.mount: Deactivated successfully. 
Nov 23 03:54:55 localhost podman[102814]: 2025-11-23 08:54:55.136987582 +0000 UTC m=+0.185515299 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, container_name=iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:54:55 localhost podman[102814]: 2025-11-23 08:54:55.150166435 +0000 UTC m=+0.198694092 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z) Nov 23 03:54:55 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:54:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:54:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:54:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:54:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:54:58 localhost systemd[1]: tmp-crun.yY7JFg.mount: Deactivated successfully. 
Nov 23 03:54:58 localhost podman[102857]: 2025-11-23 08:54:58.017298442 +0000 UTC m=+0.068679442 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, vcs-type=git) Nov 23 03:54:58 localhost podman[102856]: 2025-11-23 08:54:58.040940466 +0000 UTC m=+0.092435130 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute) Nov 23 03:54:58 localhost podman[102856]: 2025-11-23 08:54:58.062741043 +0000 UTC m=+0.114235727 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.12) Nov 23 03:54:58 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:54:58 localhost podman[102857]: 2025-11-23 08:54:58.118912082 +0000 UTC m=+0.170293092 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z) Nov 23 03:54:58 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:54:58 localhost podman[102859]: 2025-11-23 08:54:58.120047867 +0000 UTC m=+0.166022762 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5) Nov 23 03:54:58 localhost podman[102855]: 2025-11-23 08:54:58.177503695 +0000 UTC m=+0.232475106 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 23 03:54:58 localhost podman[102855]: 2025-11-23 08:54:58.192308079 +0000 UTC m=+0.247279490 container 
exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12) Nov 23 03:54:58 localhost podman[102859]: 2025-11-23 08:54:58.200050825 +0000 UTC m=+0.246025750 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, container_name=nova_compute, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step5, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:54:58 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:54:58 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:54:59 localhost systemd[1]: tmp-crun.9pQOLV.mount: Deactivated successfully. Nov 23 03:55:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:55:01 localhost podman[102947]: 2025-11-23 08:55:01.01843852 +0000 UTC m=+0.080632398 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:55:01 localhost podman[102947]: 2025-11-23 08:55:01.374345853 +0000 UTC m=+0.436539661 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:55:01 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:55:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:55:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. 
Nov 23 03:55:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:55:03 localhost podman[102969]: 2025-11-23 08:55:03.040051331 +0000 UTC m=+0.094193884 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.41.4) Nov 23 03:55:03 localhost podman[102970]: 2025-11-23 08:55:03.010379963 +0000 UTC m=+0.065263219 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:55:03 localhost podman[102970]: 2025-11-23 08:55:03.094017022 +0000 UTC m=+0.148900238 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Nov 23 03:55:03 localhost podman[102970]: unhealthy Nov 23 03:55:03 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:55:03 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:55:03 localhost podman[102971]: 2025-11-23 08:55:03.190121904 +0000 UTC m=+0.237625933 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, 
com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, version=17.1.12, url=https://www.redhat.com) Nov 23 03:55:03 localhost podman[102971]: 2025-11-23 08:55:03.230324424 +0000 UTC m=+0.277828293 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:55:03 localhost podman[102971]: unhealthy Nov 23 03:55:03 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:55:03 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:55:03 localhost podman[102969]: 2025-11-23 08:55:03.270261706 +0000 UTC m=+0.324404239 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:55:03 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:55:04 localhost systemd[1]: tmp-crun.U3OxDM.mount: Deactivated successfully. Nov 23 03:55:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:55:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:55:26 localhost podman[103038]: 2025-11-23 08:55:26.023631197 +0000 UTC m=+0.080214905 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-collectd-container, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., container_name=collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z) Nov 23 03:55:26 localhost podman[103038]: 2025-11-23 08:55:26.06127649 +0000 UTC m=+0.117860208 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step3, release=1761123044, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, container_name=collectd, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=) Nov 23 03:55:26 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:55:26 localhost systemd[1]: tmp-crun.LjvwhE.mount: Deactivated successfully. 
Nov 23 03:55:26 localhost podman[103039]: 2025-11-23 08:55:26.102819652 +0000 UTC m=+0.155374587 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z) Nov 23 03:55:26 localhost podman[103039]: 2025-11-23 08:55:26.139411971 +0000 UTC m=+0.191966866 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:55:26 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:55:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:55:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:55:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:55:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:55:29 localhost systemd[1]: tmp-crun.dPOTkn.mount: Deactivated successfully. Nov 23 03:55:29 localhost podman[103078]: 2025-11-23 08:55:29.03797518 +0000 UTC m=+0.092545613 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, version=17.1.12) Nov 23 03:55:29 localhost podman[103082]: 2025-11-23 08:55:29.14478446 +0000 UTC m=+0.186600622 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:55:29 localhost podman[103079]: 2025-11-23 08:55:29.193250553 +0000 UTC m=+0.240861252 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:55:29 localhost podman[103077]: 2025-11-23 08:55:29.112415739 +0000 UTC m=+0.166647221 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1) Nov 23 03:55:29 localhost podman[103078]: 2025-11-23 08:55:29.218412553 +0000 UTC m=+0.272982986 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, release=1761123044, maintainer=OpenStack TripleO Team, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4) Nov 23 03:55:29 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:55:29 localhost podman[103077]: 2025-11-23 08:55:29.242207871 +0000 UTC m=+0.296439373 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, 
description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:55:29 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:55:29 localhost podman[103079]: 2025-11-23 08:55:29.27224684 +0000 UTC m=+0.319857499 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 23 03:55:29 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:55:29 localhost podman[103082]: 2025-11-23 08:55:29.324011085 +0000 UTC m=+0.365827237 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, container_name=nova_compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:55:29 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:55:30 localhost systemd[1]: tmp-crun.axlcBa.mount: Deactivated successfully. Nov 23 03:55:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:55:32 localhost podman[103236]: 2025-11-23 08:55:32.023238573 +0000 UTC m=+0.081163175 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64) Nov 23 03:55:32 localhost podman[103236]: 2025-11-23 08:55:32.383801498 +0000 UTC m=+0.441726100 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:55:32 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:55:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:55:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:55:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:55:34 localhost podman[103273]: 2025-11-23 08:55:34.02806588 +0000 UTC m=+0.082867468 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=) Nov 23 03:55:34 localhost systemd[1]: tmp-crun.UQJa3z.mount: Deactivated successfully. Nov 23 03:55:34 localhost podman[103274]: 2025-11-23 08:55:34.135779436 +0000 UTC m=+0.188031726 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4) Nov 23 03:55:34 localhost podman[103274]: 2025-11-23 08:55:34.147719581 +0000 UTC m=+0.199971871 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, tcib_managed=true, vcs-type=git, distribution-scope=public, version=17.1.12, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com) Nov 23 03:55:34 localhost podman[103274]: unhealthy Nov 23 03:55:34 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:55:34 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:55:34 localhost podman[103273]: 2025-11-23 08:55:34.198385492 +0000 UTC m=+0.253187130 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, 
vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, maintainer=OpenStack TripleO Team) Nov 23 03:55:34 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:55:34 localhost podman[103275]: 2025-11-23 08:55:34.148029361 +0000 UTC m=+0.197365611 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T23:34:05Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller) Nov 23 03:55:34 localhost podman[103275]: 2025-11-23 08:55:34.278502284 +0000 UTC m=+0.327838524 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible) Nov 23 03:55:34 localhost podman[103275]: unhealthy Nov 23 03:55:34 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:55:34 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:55:36 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:55:36 localhost recover_tripleo_nova_virtqemud[103340]: 61756 Nov 23 03:55:36 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:55:36 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 23 03:55:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:55:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:55:57 localhost systemd[1]: tmp-crun.7YzCQR.mount: Deactivated successfully. Nov 23 03:55:57 localhost podman[103341]: 2025-11-23 08:55:57.021072288 +0000 UTC m=+0.074820250 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4) Nov 23 03:55:57 localhost podman[103342]: 2025-11-23 08:55:57.058679879 +0000 UTC m=+0.109805461 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 
iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, container_name=iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container) Nov 23 03:55:57 localhost podman[103341]: 2025-11-23 08:55:57.087575003 +0000 UTC m=+0.141323005 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, release=1761123044, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64) Nov 23 03:55:57 localhost podman[103342]: 2025-11-23 08:55:57.094395663 +0000 UTC m=+0.145521325 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1761123044, vendor=Red Hat, Inc.) Nov 23 03:55:57 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:55:57 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:55:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:55:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:55:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. 
Nov 23 03:55:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:56:00 localhost podman[103379]: 2025-11-23 08:56:00.002951968 +0000 UTC m=+0.064331739 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, distribution-scope=public, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute) Nov 23 03:56:00 localhost podman[103378]: 2025-11-23 08:56:00.028304534 +0000 UTC m=+0.089200491 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 
17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com) Nov 23 03:56:00 localhost systemd[1]: tmp-crun.IA4DEL.mount: Deactivated successfully. 
Nov 23 03:56:00 localhost podman[103380]: 2025-11-23 08:56:00.074717364 +0000 UTC m=+0.131731622 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 23 03:56:00 localhost podman[103379]: 2025-11-23 08:56:00.080965656 +0000 UTC m=+0.142345327 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 23 03:56:00 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. 
Nov 23 03:56:00 localhost podman[103381]: 2025-11-23 08:56:00.133100901 +0000 UTC m=+0.187905122 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:56:00 localhost podman[103381]: 2025-11-23 08:56:00.15593781 +0000 UTC m=+0.210742031 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step5, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 23 03:56:00 localhost podman[103378]: 2025-11-23 08:56:00.163535583 +0000 UTC m=+0.224431610 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-cron, release=1761123044, io.buildah.version=1.41.4) Nov 23 03:56:00 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. 
Nov 23 03:56:00 localhost podman[103380]: 2025-11-23 08:56:00.174093036 +0000 UTC m=+0.231107344 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1) Nov 23 03:56:00 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:56:00 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:56:00 localhost systemd[1]: tmp-crun.J0U24S.mount: Deactivated successfully. Nov 23 03:56:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:56:03 localhost podman[103476]: 2025-11-23 08:56:03.019348525 +0000 UTC m=+0.079590267 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 
nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, managed_by=tripleo_ansible) Nov 23 03:56:03 localhost podman[103476]: 2025-11-23 08:56:03.359796704 +0000 UTC m=+0.420038406 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, 
build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:56:03 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:56:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:56:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:56:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:56:05 localhost podman[103498]: 2025-11-23 08:56:05.016856988 +0000 UTC m=+0.075223003 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 23 03:56:05 localhost podman[103499]: 2025-11-23 08:56:05.071678445 +0000 UTC m=+0.128560036 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:56:05 localhost systemd[1]: tmp-crun.PtKX6d.mount: Deactivated successfully. 
Nov 23 03:56:05 localhost podman[103500]: 2025-11-23 08:56:05.134382365 +0000 UTC m=+0.184608821 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller) Nov 23 03:56:05 localhost podman[103499]: 2025-11-23 08:56:05.157803431 +0000 UTC m=+0.214685052 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4) Nov 23 03:56:05 localhost podman[103499]: unhealthy Nov 23 03:56:05 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:56:05 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:56:05 localhost podman[103500]: 2025-11-23 08:56:05.17248529 +0000 UTC m=+0.222711766 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO 
Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64) Nov 23 03:56:05 localhost podman[103500]: unhealthy Nov 23 03:56:05 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:56:05 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:56:05 localhost podman[103498]: 2025-11-23 08:56:05.230945339 +0000 UTC m=+0.289311284 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.12) Nov 23 03:56:05 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:56:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:56:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:56:28 localhost systemd[1]: tmp-crun.f4WIRj.mount: Deactivated successfully. 
Nov 23 03:56:28 localhost podman[103567]: 2025-11-23 08:56:28.03035922 +0000 UTC m=+0.082336671 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, architecture=x86_64, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd) Nov 23 03:56:28 localhost podman[103567]: 2025-11-23 08:56:28.037074885 +0000 UTC m=+0.089052336 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, batch=17.1_20251118.1, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:56:28 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:56:28 localhost podman[103568]: 2025-11-23 08:56:28.014618088 +0000 UTC m=+0.067304421 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, release=1761123044, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:56:28 localhost podman[103568]: 2025-11-23 08:56:28.093205393 +0000 UTC m=+0.145891756 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, architecture=x86_64, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, container_name=iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:56:28 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:56:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:56:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:56:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:56:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:56:31 localhost systemd[1]: tmp-crun.sIrOHK.mount: Deactivated successfully. 
Nov 23 03:56:31 localhost podman[103603]: 2025-11-23 08:56:31.016969903 +0000 UTC m=+0.070974053 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 23 03:56:31 localhost podman[103602]: 2025-11-23 08:56:31.042412122 +0000 UTC m=+0.099976131 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat 
OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, release=1761123044, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true) Nov 23 03:56:31 localhost podman[103602]: 2025-11-23 08:56:31.078690782 +0000 UTC m=+0.136254801 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, release=1761123044, name=rhosp17/openstack-cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.) 
Nov 23 03:56:31 localhost podman[103603]: 2025-11-23 08:56:31.079080014 +0000 UTC m=+0.133084204 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4) Nov 23 03:56:31 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:56:31 localhost podman[103610]: 2025-11-23 08:56:31.080488947 +0000 UTC m=+0.127836713 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, container_name=nova_compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.4) Nov 23 03:56:31 localhost podman[103604]: 2025-11-23 08:56:31.147629551 +0000 UTC m=+0.198915798 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64) Nov 23 03:56:31 localhost podman[103610]: 2025-11-23 08:56:31.159689221 +0000 UTC m=+0.207037017 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=nova_compute, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack 
TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z) Nov 23 03:56:31 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:56:31 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:56:31 localhost podman[103604]: 2025-11-23 08:56:31.216874621 +0000 UTC m=+0.268160848 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 23 03:56:31 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:56:32 localhost systemd[1]: tmp-crun.YQ9Xtb.mount: Deactivated successfully. 
Nov 23 03:56:33 localhost podman[103801]: 2025-11-23 08:56:33.135443208 +0000 UTC m=+0.092759059 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , ceph=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, name=rhceph, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.buildah.version=1.33.12, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 03:56:33 localhost podman[103801]: 2025-11-23 08:56:33.240203554 +0000 UTC m=+0.197519365 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, description=Red Hat Ceph Storage 7, architecture=x86_64, release=553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, RELEASE=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, ceph=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=) Nov 23 03:56:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:56:33 localhost podman[103851]: 2025-11-23 08:56:33.525374422 +0000 UTC m=+0.101247040 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:56:33 localhost podman[103851]: 2025-11-23 08:56:33.89813638 +0000 UTC m=+0.474008988 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com) Nov 23 03:56:33 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:56:36 localhost systemd[1]: tmp-crun.q1E3e5.mount: Deactivated successfully. Nov 23 03:56:36 localhost podman[103968]: 2025-11-23 08:56:36.047583853 +0000 UTC m=+0.096992109 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 23 03:56:36 localhost podman[103969]: 2025-11-23 08:56:36.093682124 +0000 UTC m=+0.143863154 container health_status 
99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 
17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:56:36 localhost podman[103969]: 2025-11-23 08:56:36.110077206 +0000 UTC m=+0.160258216 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2025-11-18T23:34:05Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.) Nov 23 03:56:36 localhost podman[103969]: unhealthy Nov 23 03:56:36 localhost podman[103968]: 2025-11-23 08:56:36.118333728 +0000 UTC m=+0.167741974 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, config_id=tripleo_step4, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=) Nov 23 03:56:36 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:56:36 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:56:36 localhost podman[103968]: unhealthy Nov 23 03:56:36 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:56:36 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:56:36 localhost podman[103967]: 2025-11-23 08:56:36.188760444 +0000 UTC m=+0.239730869 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr) Nov 23 03:56:36 localhost podman[103967]: 2025-11-23 08:56:36.391547429 +0000 UTC m=+0.442517824 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 23 03:56:36 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:56:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. 
Nov 23 03:56:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:56:59 localhost podman[104036]: 2025-11-23 08:56:59.009631763 +0000 UTC m=+0.069401728 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:56:59 localhost podman[104036]: 2025-11-23 08:56:59.024099752 +0000 UTC m=+0.083869727 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 23 03:56:59 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:56:59 localhost systemd[1]: tmp-crun.EcwyoW.mount: Deactivated successfully. 
Nov 23 03:56:59 localhost podman[104035]: 2025-11-23 08:56:59.088212134 +0000 UTC m=+0.148600028 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Nov 23 03:56:59 localhost podman[104035]: 2025-11-23 08:56:59.126279988 +0000 UTC m=+0.186667842 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, container_name=collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true) Nov 23 03:56:59 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:57:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:57:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:57:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:57:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:57:02 localhost podman[104074]: 2025-11-23 08:57:02.012582826 +0000 UTC m=+0.068262792 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 23 03:57:02 localhost podman[104083]: 2025-11-23 08:57:02.065702787 +0000 UTC m=+0.110959930 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, version=17.1.12, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute) Nov 23 03:57:02 localhost podman[104074]: 2025-11-23 08:57:02.099668252 +0000 UTC m=+0.155348248 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4) Nov 23 03:57:02 localhost podman[104083]: 2025-11-23 08:57:02.118439716 +0000 UTC m=+0.163696899 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, architecture=x86_64, container_name=nova_compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:57:02 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Deactivated successfully. Nov 23 03:57:02 localhost podman[104081]: 2025-11-23 08:57:02.134071002 +0000 UTC m=+0.178065365 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1) Nov 23 03:57:02 localhost podman[104081]: 2025-11-23 08:57:02.161194174 +0000 UTC m=+0.205188527 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 
17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 23 03:57:02 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:57:02 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:57:02 localhost systemd[1]: tmp-crun.MGlDjV.mount: Deactivated successfully. 
Nov 23 03:57:02 localhost sshd[104155]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:57:02 localhost podman[104075]: 2025-11-23 08:57:02.25311592 +0000 UTC m=+0.301485689 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute) Nov 23 03:57:02 localhost podman[104075]: 2025-11-23 08:57:02.283373471 +0000 UTC m=+0.331743230 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, tcib_managed=true, vcs-type=git, release=1761123044, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:57:02 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:57:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:57:04 localhost podman[104173]: 2025-11-23 08:57:04.021342206 +0000 UTC m=+0.081275667 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:57:04 localhost podman[104173]: 2025-11-23 08:57:04.357216322 +0000 UTC m=+0.417149733 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team) Nov 23 03:57:04 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:57:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:57:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:57:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:57:06 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Nov 23 03:57:06 localhost recover_tripleo_nova_virtqemud[104208]: 61756 Nov 23 03:57:06 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:57:06 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 23 03:57:07 localhost systemd[1]: tmp-crun.pSJQai.mount: Deactivated successfully. Nov 23 03:57:07 localhost podman[104195]: 2025-11-23 08:57:07.058119197 +0000 UTC m=+0.103819437 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:57:07 localhost podman[104195]: 2025-11-23 08:57:07.095333893 +0000 UTC m=+0.141034173 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, release=1761123044, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:57:07 localhost podman[104195]: unhealthy Nov 23 03:57:07 localhost podman[104194]: 2025-11-23 08:57:07.102604409 +0000 UTC m=+0.152057715 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:57:07 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:57:07 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:57:07 localhost podman[104196]: 2025-11-23 08:57:07.145927046 +0000 UTC m=+0.187690823 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team) Nov 23 03:57:07 localhost podman[104196]: 2025-11-23 08:57:07.165581657 +0000 UTC m=+0.207345444 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, 
maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:57:07 localhost podman[104196]: unhealthy Nov 23 03:57:07 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:57:07 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:57:07 localhost podman[104194]: 2025-11-23 08:57:07.291784238 +0000 UTC m=+0.341237474 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:57:07 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:57:16 localhost sshd[104262]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:57:21 localhost sshd[104266]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:57:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:57:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:57:30 localhost podman[104268]: 2025-11-23 08:57:30.02771943 +0000 UTC m=+0.082631779 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:57:30 localhost podman[104268]: 2025-11-23 08:57:30.06730626 +0000 UTC m=+0.122218589 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd) Nov 23 03:57:30 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:57:30 localhost podman[104269]: 2025-11-23 08:57:30.077140765 +0000 UTC m=+0.130675671 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:57:30 localhost podman[104269]: 2025-11-23 08:57:30.161109134 +0000 UTC m=+0.214644050 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, container_name=iscsid, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64) Nov 23 03:57:30 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:57:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:57:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:57:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:57:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:57:33 localhost podman[104310]: 2025-11-23 08:57:33.028994449 +0000 UTC m=+0.076433395 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Nov 23 03:57:33 localhost podman[104310]: 2025-11-23 08:57:33.088535649 +0000 UTC m=+0.135974655 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:57:33 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. 
Nov 23 03:57:33 localhost podman[104308]: 2025-11-23 08:57:33.13587818 +0000 UTC m=+0.188929521 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4) Nov 23 03:57:33 localhost podman[104311]: 2025-11-23 08:57:33.090441999 +0000 UTC m=+0.135202512 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible) Nov 23 03:57:33 localhost podman[104308]: 2025-11-23 08:57:33.145168299 +0000 UTC m=+0.198219580 container exec_died 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:57:33 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:57:33 localhost podman[104311]: 2025-11-23 08:57:33.170325501 +0000 UTC m=+0.215085954 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-19T00:36:58Z, release=1761123044, config_id=tripleo_step5, tcib_managed=true, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 03:57:33 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: 
Deactivated successfully. Nov 23 03:57:33 localhost podman[104309]: 2025-11-23 08:57:33.194915875 +0000 UTC m=+0.244852029 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1) Nov 23 03:57:33 localhost podman[104309]: 2025-11-23 08:57:33.226334801 +0000 UTC m=+0.276270985 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, vcs-type=git, url=https://www.redhat.com, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:57:33 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:57:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:57:35 localhost systemd[1]: tmp-crun.uv9ltS.mount: Deactivated successfully. 
Nov 23 03:57:35 localhost podman[104403]: 2025-11-23 08:57:35.005677582 +0000 UTC m=+0.065566539 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 23 03:57:35 localhost podman[104403]: 2025-11-23 08:57:35.391242012 +0000 UTC m=+0.451130959 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:57:35 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:57:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:57:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:57:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 03:57:38 localhost podman[104502]: 2025-11-23 08:57:38.034853699 +0000 UTC m=+0.081744710 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:57:38 localhost systemd[1]: tmp-crun.vYvEjI.mount: Deactivated successfully. Nov 23 03:57:38 localhost podman[104504]: 2025-11-23 08:57:38.102732218 +0000 UTC m=+0.146049539 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, container_name=ovn_controller, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:57:38 localhost podman[104503]: 2025-11-23 08:57:38.144823926 +0000 UTC m=+0.191573903 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, tcib_managed=true, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team) Nov 23 03:57:38 localhost 
podman[104503]: 2025-11-23 08:57:38.161181145 +0000 UTC m=+0.207931072 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Nov 23 03:57:38 localhost podman[104503]: unhealthy Nov 23 03:57:38 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:57:38 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:57:38 localhost podman[104504]: 2025-11-23 08:57:38.173316812 +0000 UTC m=+0.216634113 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team) Nov 23 03:57:38 localhost podman[104504]: unhealthy Nov 23 03:57:38 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:57:38 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:57:38 localhost podman[104502]: 2025-11-23 08:57:38.30232175 +0000 UTC m=+0.349212711 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, version=17.1.12, tcib_managed=true, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 23 03:57:38 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:58:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:58:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:58:01 localhost podman[104567]: 2025-11-23 08:58:01.024949498 +0000 UTC m=+0.082589667 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid) Nov 23 03:58:01 localhost podman[104566]: 2025-11-23 08:58:01.062483285 +0000 UTC m=+0.122340903 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 23 03:58:01 localhost podman[104566]: 2025-11-23 08:58:01.071634239 +0000 UTC m=+0.131491877 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-collectd, 
release=1761123044, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-type=git, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 23 03:58:01 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:58:01 localhost podman[104567]: 2025-11-23 08:58:01.11640226 +0000 UTC m=+0.174042409 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, container_name=iscsid, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public) Nov 23 03:58:01 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:58:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:58:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:58:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:58:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:58:04 localhost podman[104605]: 2025-11-23 08:58:04.037064996 +0000 UTC m=+0.091387592 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp17/openstack-cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044) Nov 23 03:58:04 localhost podman[104613]: 2025-11-23 08:58:04.09676464 +0000 UTC m=+0.137581276 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container) Nov 23 03:58:04 localhost podman[104613]: 2025-11-23 08:58:04.14049216 +0000 UTC m=+0.181308776 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step5, container_name=nova_compute, name=rhosp17/openstack-nova-compute) Nov 23 03:58:04 localhost podman[104613]: unhealthy Nov 23 03:58:04 localhost systemd[1]: tmp-crun.agXxkw.mount: Deactivated successfully. 
Nov 23 03:58:04 localhost podman[104606]: 2025-11-23 08:58:04.158059395 +0000 UTC m=+0.207496688 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:58:04 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:58:04 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'. 
Nov 23 03:58:04 localhost podman[104607]: 2025-11-23 08:58:04.207704288 +0000 UTC m=+0.254563961 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 23 03:58:04 localhost podman[104606]: 2025-11-23 08:58:04.214326013 +0000 UTC m=+0.263763316 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:58:04 localhost podman[104605]: 2025-11-23 08:58:04.222691693 +0000 UTC m=+0.277014279 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, tcib_managed=true, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044) Nov 23 03:58:04 localhost systemd[1]: 
6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:58:04 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:58:04 localhost podman[104607]: 2025-11-23 08:58:04.267364142 +0000 UTC m=+0.314223795 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:58:04 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:58:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:58:06 localhost podman[104700]: 2025-11-23 08:58:06.026633628 +0000 UTC m=+0.077482779 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:58:06 localhost podman[104700]: 2025-11-23 08:58:06.369730659 +0000 UTC m=+0.420579780 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Nov 23 03:58:06 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:58:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:58:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:58:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:58:09 localhost systemd[1]: tmp-crun.2HryfH.mount: Deactivated successfully. 
Nov 23 03:58:09 localhost podman[104723]: 2025-11-23 08:58:09.037083854 +0000 UTC m=+0.092543228 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12) Nov 23 03:58:09 localhost podman[104724]: 2025-11-23 08:58:09.087688826 +0000 UTC m=+0.140873179 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com) Nov 23 03:58:09 localhost podman[104724]: 2025-11-23 08:58:09.100856274 +0000 UTC m=+0.154040637 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, 
tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, batch=17.1_20251118.1) Nov 23 03:58:09 localhost podman[104724]: unhealthy Nov 23 03:58:09 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:58:09 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 03:58:09 localhost podman[104725]: 2025-11-23 08:58:09.188600042 +0000 UTC m=+0.239268177 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller) Nov 23 03:58:09 localhost podman[104725]: 2025-11-23 08:58:09.205169937 +0000 UTC m=+0.255838042 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, container_name=ovn_controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 23 03:58:09 localhost podman[104725]: unhealthy Nov 23 03:58:09 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:58:09 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:58:09 localhost podman[104723]: 2025-11-23 08:58:09.252325631 +0000 UTC m=+0.307784985 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.buildah.version=1.41.4, container_name=metrics_qdr, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd) Nov 23 03:58:09 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:58:10 localhost systemd[1]: tmp-crun.3W44de.mount: Deactivated successfully. Nov 23 03:58:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:58:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. 
Nov 23 03:58:32 localhost podman[104792]: 2025-11-23 08:58:32.026279561 +0000 UTC m=+0.080176873 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:58:32 localhost podman[104792]: 2025-11-23 08:58:32.037238502 +0000 UTC m=+0.091135814 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:58:32 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:58:32 localhost podman[104791]: 2025-11-23 08:58:32.122307255 +0000 UTC m=+0.178381474 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1) Nov 23 03:58:32 localhost podman[104791]: 2025-11-23 08:58:32.157283072 +0000 UTC m=+0.213357261 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': 
'512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd) Nov 23 03:58:32 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:58:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. 
Nov 23 03:58:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:58:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:58:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:58:35 localhost systemd[1]: tmp-crun.fvgcfK.mount: Deactivated successfully. Nov 23 03:58:35 localhost podman[104831]: 2025-11-23 08:58:35.05258702 +0000 UTC m=+0.101791544 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 03:58:35 localhost systemd[1]: tmp-crun.QeDz2n.mount: Deactivated successfully. 
Nov 23 03:58:35 localhost podman[104833]: 2025-11-23 08:58:35.100087416 +0000 UTC m=+0.142415297 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Nov 23 03:58:35 localhost podman[104831]: 2025-11-23 08:58:35.133342569 +0000 UTC m=+0.182547033 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, 
container_name=ceilometer_agent_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 23 
03:58:35 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:58:35 localhost podman[104830]: 2025-11-23 08:58:35.138414617 +0000 UTC m=+0.188319542 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.4) Nov 23 03:58:35 localhost podman[104832]: 2025-11-23 08:58:35.205282435 +0000 UTC m=+0.250213917 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, version=17.1.12) Nov 23 03:58:35 localhost podman[104830]: 2025-11-23 08:58:35.219193347 +0000 UTC m=+0.269098292 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64) Nov 23 03:58:35 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 03:58:35 localhost podman[104832]: 2025-11-23 08:58:35.258320173 +0000 UTC m=+0.303251685 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 03:58:35 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:58:35 localhost podman[104833]: 2025-11-23 08:58:35.271326887 +0000 UTC m=+0.313654768 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, 
managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1) Nov 23 03:58:35 localhost podman[104833]: unhealthy Nov 23 03:58:35 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:58:35 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'. Nov 23 03:58:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:58:36 localhost systemd[1]: tmp-crun.DGio3p.mount: Deactivated successfully. Nov 23 03:58:36 localhost podman[104940]: 2025-11-23 08:58:36.902287597 +0000 UTC m=+0.102595159 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:58:37 localhost podman[104940]: 2025-11-23 08:58:37.302325057 +0000 UTC m=+0.502632620 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Nov 23 03:58:37 localhost systemd[1]: 
e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:58:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:58:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:58:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:58:40 localhost podman[105027]: 2025-11-23 08:58:40.030085899 +0000 UTC m=+0.078360606 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 23 03:58:40 localhost podman[105028]: 2025-11-23 08:58:40.092013793 +0000 UTC m=+0.140544168 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, 
batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc.) Nov 23 03:58:40 localhost podman[105028]: 2025-11-23 08:58:40.106277026 +0000 UTC m=+0.154807421 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public) Nov 23 03:58:40 localhost podman[105028]: unhealthy Nov 23 03:58:40 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:58:40 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:58:40 localhost systemd[1]: tmp-crun.v9VJvr.mount: Deactivated successfully. 
Nov 23 03:58:40 localhost podman[105029]: 2025-11-23 08:58:40.23898589 +0000 UTC m=+0.285214263 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, distribution-scope=public, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, 
io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 23 03:58:40 localhost podman[105029]: 2025-11-23 08:58:40.253642716 +0000 UTC m=+0.299871029 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., container_name=ovn_controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:58:40 localhost podman[105029]: unhealthy Nov 23 03:58:40 localhost podman[105027]: 2025-11-23 08:58:40.260680164 +0000 UTC m=+0.308954861 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:58:40 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:58:40 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:58:40 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:58:56 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 03:58:56 localhost recover_tripleo_nova_virtqemud[105098]: 61756 Nov 23 03:58:56 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 03:58:56 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 23 03:59:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:59:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:59:03 localhost podman[105100]: 2025-11-23 08:59:03.035689108 +0000 UTC m=+0.083453914 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 03:59:03 localhost podman[105100]: 2025-11-23 08:59:03.076169587 +0000 UTC m=+0.123934343 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:59:03 localhost systemd[1]: tmp-crun.OzLK2F.mount: Deactivated successfully. Nov 23 03:59:03 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. 
Nov 23 03:59:03 localhost podman[105099]: 2025-11-23 08:59:03.095740615 +0000 UTC m=+0.145943507 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-collectd-container, distribution-scope=public, config_id=tripleo_step3, release=1761123044, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 03:59:03 localhost podman[105099]: 2025-11-23 08:59:03.179673703 +0000 UTC m=+0.229876625 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, container_name=collectd, 
com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:59:03 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 03:59:03 localhost sshd[105138]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:59:03 localhost systemd-logind[761]: New session 36 of user zuul. Nov 23 03:59:03 localhost systemd[1]: Started Session 36 of User zuul. 
Nov 23 03:59:04 localhost python3.9[105233]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 03:59:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:59:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:59:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:59:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 03:59:05 localhost podman[105331]: 2025-11-23 08:59:05.522544613 +0000 UTC m=+0.067037574 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com) Nov 23 03:59:05 localhost podman[105329]: 2025-11-23 08:59:05.588972668 +0000 UTC m=+0.134125469 container health_status 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true) Nov 23 03:59:05 localhost python3.9[105327]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 03:59:05 localhost podman[105328]: 2025-11-23 08:59:05.638878607 +0000 UTC m=+0.180505559 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 
'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, container_name=logrotate_crond, io.buildah.version=1.41.4, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T22:49:32Z) Nov 23 03:59:05 localhost podman[105328]: 2025-11-23 
08:59:05.646383871 +0000 UTC m=+0.188010863 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, version=17.1.12, container_name=logrotate_crond, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4) Nov 23 03:59:05 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:59:05 localhost podman[105329]: 2025-11-23 08:59:05.667025492 +0000 UTC m=+0.212178293 container exec_died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:59:05 localhost podman[105331]: 2025-11-23 08:59:05.669735596 +0000 UTC m=+0.214228637 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Nov 23 03:59:05 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Deactivated successfully. Nov 23 03:59:05 localhost podman[105330]: 2025-11-23 08:59:05.567420618 +0000 UTC m=+0.112957081 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4) Nov 23 03:59:05 localhost podman[105330]: 2025-11-23 08:59:05.74805559 +0000 UTC m=+0.293592033 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, architecture=x86_64) Nov 23 03:59:05 localhost systemd[1]: 
d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:59:05 localhost podman[105331]: unhealthy Nov 23 03:59:05 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:59:05 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'. Nov 23 03:59:06 localhost python3.9[105515]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 03:59:07 localhost python3.9[105609]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 03:59:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 03:59:07 localhost systemd[1]: tmp-crun.xJGjog.mount: Deactivated successfully. 
Nov 23 03:59:07 localhost podman[105703]: 2025-11-23 08:59:07.731795772 +0000 UTC m=+0.067741276 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:59:07 localhost python3.9[105702]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 03:59:08 localhost podman[105703]: 2025-11-23 08:59:08.086345109 +0000 UTC m=+0.422290643 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 03:59:08 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. 
Nov 23 03:59:08 localhost python3.9[105816]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Nov 23 03:59:10 localhost python3.9[105906]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 03:59:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:59:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:59:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:59:11 localhost podman[105938]: 2025-11-23 08:59:11.049773954 +0000 UTC m=+0.094108896 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git) Nov 23 03:59:11 localhost podman[105937]: 2025-11-23 08:59:11.011065301 +0000 UTC m=+0.060930655 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vendor=Red Hat, Inc., 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12) Nov 23 03:59:11 localhost podman[105938]: 2025-11-23 08:59:11.094464952 +0000 UTC m=+0.138799874 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-type=git, version=17.1.12, container_name=ovn_controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 03:59:11 localhost podman[105938]: unhealthy Nov 23 03:59:11 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:59:11 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 03:59:11 localhost podman[105936]: 2025-11-23 08:59:11.112020168 +0000 UTC m=+0.161317514 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, container_name=metrics_qdr) Nov 23 03:59:11 localhost podman[105937]: 2025-11-23 08:59:11.1449325 +0000 UTC m=+0.194797864 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ovn_metadata_agent, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Nov 23 03:59:11 localhost podman[105937]: unhealthy Nov 23 03:59:11 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:59:11 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:59:11 localhost podman[105936]: 2025-11-23 08:59:11.31936071 +0000 UTC m=+0.368658036 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 23 03:59:11 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 03:59:11 localhost python3.9[106065]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile Nov 23 03:59:12 localhost python3.9[106155]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 03:59:13 localhost python3.9[106203]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 23 03:59:13 localhost systemd[1]: session-36.scope: Deactivated successfully. Nov 23 03:59:13 localhost systemd[1]: session-36.scope: Consumed 4.717s CPU time. Nov 23 03:59:13 localhost systemd-logind[761]: Session 36 logged out. Waiting for processes to exit. Nov 23 03:59:13 localhost systemd-logind[761]: Removed session 36. Nov 23 03:59:20 localhost sshd[106219]: main: sshd: ssh-rsa algorithm is disabled Nov 23 03:59:20 localhost systemd-logind[761]: New session 37 of user zuul. Nov 23 03:59:20 localhost systemd[1]: Started Session 37 of User zuul. Nov 23 03:59:22 localhost python3.9[106314]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 03:59:22 localhost systemd[1]: Reloading. Nov 23 03:59:22 localhost systemd-rc-local-generator[106337]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:59:22 localhost systemd-sysv-generator[106343]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:59:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:59:23 localhost python3.9[106440]: ansible-ansible.builtin.service_facts Invoked Nov 23 03:59:23 localhost network[106457]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 23 03:59:23 localhost network[106458]: 'network-scripts' will be removed from distribution in near future. Nov 23 03:59:23 localhost network[106459]: It is advised to switch to 'NetworkManager' instead for network management. Nov 23 03:59:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:59:27 localhost python3.9[106656]: ansible-ansible.builtin.service_facts Invoked Nov 23 03:59:27 localhost network[106673]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 23 03:59:27 localhost network[106674]: 'network-scripts' will be removed from distribution in near future. Nov 23 03:59:27 localhost network[106675]: It is advised to switch to 'NetworkManager' instead for network management. Nov 23 03:59:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:59:31 localhost python3.9[106874]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 03:59:31 localhost systemd[1]: Reloading. 
Nov 23 03:59:31 localhost systemd-rc-local-generator[106898]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 03:59:31 localhost systemd-sysv-generator[106906]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 03:59:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 03:59:31 localhost systemd[1]: Stopping ceilometer_agent_compute container... Nov 23 03:59:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 03:59:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 03:59:34 localhost podman[106929]: 2025-11-23 08:59:34.028065327 +0000 UTC m=+0.079665516 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, distribution-scope=public, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, version=17.1.12, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1) Nov 23 03:59:34 localhost podman[106928]: 2025-11-23 08:59:34.077734551 +0000 UTC m=+0.132364574 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4) Nov 23 03:59:34 localhost podman[106928]: 2025-11-23 08:59:34.0893113 +0000 UTC m=+0.143941323 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z) Nov 23 03:59:34 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. 
Nov 23 03:59:34 localhost podman[106929]: 2025-11-23 08:59:34.142765141 +0000 UTC m=+0.194365280 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 23 03:59:34 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 03:59:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 03:59:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. Nov 23 03:59:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 03:59:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 03:59:36 localhost podman[106969]: 2025-11-23 08:59:36.013238851 +0000 UTC m=+0.062514653 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git) Nov 23 03:59:36 localhost podman[106968]: Error: container 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 is not running Nov 23 03:59:36 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Main process exited, code=exited, status=125/n/a Nov 23 03:59:36 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Failed with result 'exit-code'. Nov 23 03:59:36 localhost systemd[1]: tmp-crun.wzhc5v.mount: Deactivated successfully. 
Nov 23 03:59:36 localhost podman[106979]: 2025-11-23 08:59:36.078976854 +0000 UTC m=+0.120163814 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 03:59:36 localhost podman[106969]: 2025-11-23 08:59:36.085726194 +0000 UTC m=+0.135002066 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 23 03:59:36 localhost podman[106979]: 2025-11-23 08:59:36.095182637 +0000 UTC 
m=+0.136369637 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vcs-type=git, container_name=nova_compute, config_id=tripleo_step5, batch=17.1_20251118.1) Nov 23 03:59:36 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 03:59:36 localhost podman[106979]: unhealthy Nov 23 03:59:36 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:59:36 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'. 
Nov 23 03:59:36 localhost podman[106967]: 2025-11-23 08:59:36.164652886 +0000 UTC m=+0.216844328 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, name=rhosp17/openstack-cron, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-cron-container, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=logrotate_crond, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 23 03:59:36 localhost podman[106967]: 2025-11-23 08:59:36.172376466 +0000 UTC m=+0.224567998 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:59:36 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 03:59:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. 
Nov 23 03:59:38 localhost podman[107048]: 2025-11-23 08:59:38.253173134 +0000 UTC m=+0.064758674 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 23 03:59:38 localhost podman[107048]: 2025-11-23 08:59:38.671486492 +0000 UTC m=+0.483072072 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=) Nov 23 03:59:38 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 03:59:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 03:59:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 03:59:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 03:59:41 localhost systemd[1]: tmp-crun.0y2C73.mount: Deactivated successfully. 
Nov 23 03:59:41 localhost podman[107148]: 2025-11-23 08:59:41.545508559 +0000 UTC m=+0.099032039 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, version=17.1.12, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 03:59:41 localhost podman[107149]: 2025-11-23 08:59:41.642586895 +0000 UTC m=+0.193864655 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 03:59:41 localhost podman[107150]: 2025-11-23 08:59:41.614024167 +0000 UTC m=+0.161473168 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, 
name=ovn_controller, health_status=unhealthy, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 03:59:41 localhost podman[107149]: 2025-11-23 
08:59:41.686432108 +0000 UTC m=+0.237709858 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com) Nov 23 03:59:41 localhost podman[107149]: unhealthy Nov 23 03:59:41 localhost podman[107150]: 2025-11-23 08:59:41.697337266 +0000 UTC m=+0.244786327 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., container_name=ovn_controller, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller) Nov 23 03:59:41 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:59:41 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 03:59:41 localhost podman[107150]: unhealthy Nov 23 03:59:41 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 03:59:41 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 03:59:41 localhost podman[107148]: 2025-11-23 08:59:41.74541025 +0000 UTC m=+0.298933720 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1761123044, config_id=tripleo_step1, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public) Nov 23 03:59:41 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 03:59:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2621 DF PROTO=TCP SPT=37116 DPT=9882 SEQ=3632425475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B423260000000001030307) Nov 23 03:59:42 localhost systemd[1]: tmp-crun.eT6Kbg.mount: Deactivated successfully. 
Nov 23 03:59:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2622 DF PROTO=TCP SPT=37116 DPT=9882 SEQ=3632425475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B427200000000001030307) Nov 23 03:59:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2623 DF PROTO=TCP SPT=37116 DPT=9882 SEQ=3632425475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B42F200000000001030307) Nov 23 03:59:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10032 DF PROTO=TCP SPT=44840 DPT=9102 SEQ=2443414713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4346B0000000001030307) Nov 23 03:59:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10033 DF PROTO=TCP SPT=44840 DPT=9102 SEQ=2443414713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B438600000000001030307) Nov 23 03:59:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2624 DF PROTO=TCP SPT=37116 DPT=9882 SEQ=3632425475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B43EE00000000001030307) Nov 23 03:59:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10034 DF PROTO=TCP SPT=44840 DPT=9102 SEQ=2443414713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2B440600000000001030307) Nov 23 03:59:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10035 DF PROTO=TCP SPT=44840 DPT=9102 SEQ=2443414713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B450200000000001030307) Nov 23 03:59:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2625 DF PROTO=TCP SPT=37116 DPT=9882 SEQ=3632425475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B460210000000001030307) Nov 23 03:59:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20543 DF PROTO=TCP SPT=54086 DPT=9101 SEQ=1420382078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4692A0000000001030307) Nov 23 03:59:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26845 DF PROTO=TCP SPT=47682 DPT=9105 SEQ=4017651394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B469DA0000000001030307) Nov 23 04:00:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20544 DF PROTO=TCP SPT=54086 DPT=9101 SEQ=1420382078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B46D200000000001030307) Nov 23 04:00:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26846 DF PROTO=TCP SPT=47682 DPT=9105 SEQ=4017651394 ACK=0 WINDOW=32640 RES=0x00 
SYN URGP=0 OPT (020405500402080AD2B46DE10000000001030307) Nov 23 04:00:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10036 DF PROTO=TCP SPT=44840 DPT=9102 SEQ=2443414713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B470210000000001030307) Nov 23 04:00:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25363 DF PROTO=TCP SPT=39460 DPT=9100 SEQ=2134616091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B472500000000001030307) Nov 23 04:00:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20545 DF PROTO=TCP SPT=54086 DPT=9101 SEQ=1420382078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B475210000000001030307) Nov 23 04:00:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26847 DF PROTO=TCP SPT=47682 DPT=9105 SEQ=4017651394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B475E00000000001030307) Nov 23 04:00:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25364 DF PROTO=TCP SPT=39460 DPT=9100 SEQ=2134616091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B476610000000001030307) Nov 23 04:00:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. 
Nov 23 04:00:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 04:00:04 localhost podman[107221]: 2025-11-23 09:00:04.295199577 +0000 UTC m=+0.082536717 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, vcs-type=git) Nov 23 04:00:04 localhost podman[107221]: 2025-11-23 09:00:04.329610646 +0000 UTC m=+0.116947796 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1761123044) Nov 23 04:00:04 localhost podman[107220]: 2025-11-23 09:00:04.344389064 +0000 UTC m=+0.134420267 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, config_id=tripleo_step3, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1) Nov 23 04:00:04 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 04:00:04 localhost podman[107220]: 2025-11-23 09:00:04.349939987 +0000 UTC m=+0.139971220 container exec_died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12) Nov 23 04:00:04 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 04:00:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25365 DF PROTO=TCP SPT=39460 DPT=9100 SEQ=2134616091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B47E600000000001030307) Nov 23 04:00:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 04:00:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. 
Nov 23 04:00:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 04:00:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 04:00:06 localhost podman[107261]: 2025-11-23 09:00:06.522445773 +0000 UTC m=+0.071593096 container health_status d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:12:45Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible) Nov 23 04:00:06 localhost podman[107261]: 2025-11-23 09:00:06.578290329 +0000 UTC m=+0.127437652 container exec_died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, 
build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com) Nov 23 04:00:06 localhost systemd[1]: tmp-crun.YLWXYc.mount: Deactivated successfully. 
Nov 23 04:00:06 localhost podman[107262]: 2025-11-23 09:00:06.591089396 +0000 UTC m=+0.140892099 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.buildah.version=1.41.4) Nov 23 04:00:06 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Deactivated successfully. Nov 23 04:00:06 localhost podman[107260]: Error: container 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 is not running Nov 23 04:00:06 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Main process exited, code=exited, status=125/n/a Nov 23 04:00:06 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Failed with result 'exit-code'. 
Nov 23 04:00:06 localhost podman[107259]: 2025-11-23 09:00:06.683177397 +0000 UTC m=+0.239603645 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, container_name=logrotate_crond) Nov 23 04:00:06 localhost podman[107262]: 2025-11-23 09:00:06.709165575 +0000 UTC m=+0.258968328 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=nova_compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 04:00:06 localhost podman[107262]: unhealthy Nov 23 04:00:06 localhost podman[107259]: 2025-11-23 09:00:06.717593237 +0000 UTC m=+0.274019435 container exec_died 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, release=1761123044, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron) Nov 23 04:00:06 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:00:06 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'. Nov 23 04:00:06 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. Nov 23 04:00:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20546 DF PROTO=TCP SPT=54086 DPT=9101 SEQ=1420382078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B484E00000000001030307) Nov 23 04:00:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26848 DF PROTO=TCP SPT=47682 DPT=9105 SEQ=4017651394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B485A00000000001030307) Nov 23 04:00:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 04:00:09 localhost systemd[1]: tmp-crun.sbcHqp.mount: Deactivated successfully. 
Nov 23 04:00:09 localhost podman[107338]: 2025-11-23 09:00:09.016143121 +0000 UTC m=+0.073111623 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, 
name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1) Nov 23 04:00:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25366 DF PROTO=TCP SPT=39460 DPT=9100 SEQ=2134616091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B48E210000000001030307) Nov 23 04:00:09 localhost podman[107338]: 2025-11-23 09:00:09.397425968 +0000 UTC m=+0.454394450 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12) Nov 23 04:00:09 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. 
Nov 23 04:00:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33852 DF PROTO=TCP SPT=33872 DPT=9882 SEQ=413915981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B498560000000001030307) Nov 23 04:00:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 04:00:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 04:00:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 04:00:12 localhost podman[107361]: 2025-11-23 09:00:12.029484804 +0000 UTC m=+0.079823431 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=) Nov 23 04:00:12 localhost systemd[1]: tmp-crun.vtmvH2.mount: Deactivated successfully. 
Nov 23 04:00:12 localhost podman[107363]: 2025-11-23 09:00:12.103453383 +0000 UTC m=+0.146630237 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 
ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 04:00:12 localhost podman[107362]: 2025-11-23 09:00:12.140504534 +0000 UTC m=+0.187093865 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, container_name=ovn_metadata_agent, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 23 04:00:12 localhost podman[107363]: 2025-11-23 09:00:12.187286178 +0000 UTC m=+0.230463002 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, container_name=ovn_controller, vendor=Red Hat, Inc., version=17.1.12) Nov 23 04:00:12 localhost podman[107363]: unhealthy Nov 23 04:00:12 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:00:12 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 04:00:12 localhost podman[107362]: 2025-11-23 09:00:12.20696985 +0000 UTC m=+0.253559141 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, 
container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 23 04:00:12 localhost podman[107362]: unhealthy Nov 23 04:00:12 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:00:12 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 04:00:12 localhost podman[107361]: 2025-11-23 09:00:12.219638033 +0000 UTC m=+0.269976620 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr, architecture=x86_64, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc.) Nov 23 04:00:12 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. Nov 23 04:00:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33853 DF PROTO=TCP SPT=33872 DPT=9882 SEQ=413915981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B49C600000000001030307) Nov 23 04:00:13 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2626 DF PROTO=TCP SPT=37116 DPT=9882 SEQ=3632425475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4A0200000000001030307) Nov 23 04:00:13 localhost podman[106914]: time="2025-11-23T09:00:13Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL" Nov 23 04:00:13 localhost systemd[1]: libpod-6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.scope: Deactivated successfully. Nov 23 04:00:13 localhost systemd[1]: libpod-6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.scope: Consumed 6.235s CPU time. 
Nov 23 04:00:13 localhost podman[106914]: 2025-11-23 09:00:13.995664431 +0000 UTC m=+42.091552544 container stop 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 04:00:14 localhost podman[106914]: 2025-11-23 09:00:14.03328767 +0000 UTC m=+42.129175763 container died 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, version=17.1.12, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=) Nov 23 04:00:14 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.timer: Deactivated successfully. Nov 23 04:00:14 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9. 
Nov 23 04:00:14 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Failed to open /run/systemd/transient/6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: No such file or directory Nov 23 04:00:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9-userdata-shm.mount: Deactivated successfully. Nov 23 04:00:14 localhost systemd[1]: var-lib-containers-storage-overlay-43834aabac3051c95b0bd48b6a3d7296604e45656eac8be0b6aa4803a8bc68b2-merged.mount: Deactivated successfully. Nov 23 04:00:14 localhost podman[106914]: 2025-11-23 09:00:14.138839779 +0000 UTC m=+42.234727802 container cleanup 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, tcib_managed=true, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 23 04:00:14 localhost podman[106914]: ceilometer_agent_compute Nov 23 04:00:14 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.timer: Failed to open /run/systemd/transient/6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.timer: No such file or directory Nov 23 04:00:14 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Failed to open /run/systemd/transient/6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: No such file or directory Nov 23 04:00:14 localhost podman[107431]: 2025-11-23 09:00:14.153701502 +0000 UTC m=+0.143861592 
container cleanup 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, container_name=ceilometer_agent_compute, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z) Nov 23 04:00:14 localhost systemd[1]: libpod-conmon-6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.scope: Deactivated successfully. Nov 23 04:00:14 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.timer: Failed to open /run/systemd/transient/6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.timer: No such file or directory Nov 23 04:00:14 localhost systemd[1]: 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: Failed to open /run/systemd/transient/6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9.service: No such file or directory Nov 23 04:00:14 localhost podman[107445]: 2025-11-23 09:00:14.262501293 +0000 UTC m=+0.079147201 container cleanup 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4) Nov 23 04:00:14 localhost podman[107445]: ceilometer_agent_compute Nov 23 04:00:14 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully. Nov 23 04:00:14 localhost systemd[1]: Stopped ceilometer_agent_compute container. Nov 23 04:00:14 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.048s CPU time, no IO. Nov 23 04:00:14 localhost python3.9[107549]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:00:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33854 DF PROTO=TCP SPT=33872 DPT=9882 SEQ=413915981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4A4600000000001030307) Nov 23 04:00:14 localhost systemd[1]: Reloading. Nov 23 04:00:15 localhost systemd-sysv-generator[107579]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:00:15 localhost systemd-rc-local-generator[107576]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:00:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:00:15 localhost systemd[1]: Stopping ceilometer_agent_ipmi container... 
Nov 23 04:00:16 localhost sshd[107603]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:00:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10037 DF PROTO=TCP SPT=44840 DPT=9102 SEQ=2443414713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4B0210000000001030307) Nov 23 04:00:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19456 DF PROTO=TCP SPT=35234 DPT=9102 SEQ=3403046721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4C5610000000001030307) Nov 23 04:00:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:5b:6f:0e MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.107 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=41938 SEQ=2615091442 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Nov 23 04:00:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33856 DF PROTO=TCP SPT=33872 DPT=9882 SEQ=413915981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4D4210000000001030307) Nov 23 04:00:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32818 DF PROTO=TCP SPT=41184 DPT=9105 SEQ=2384712410 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4DF0B0000000001030307) Nov 23 04:00:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14926 DF PROTO=TCP SPT=45750 DPT=9101 SEQ=1924788322 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2B4EA610000000001030307) Nov 23 04:00:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 04:00:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 04:00:34 localhost podman[107605]: 2025-11-23 09:00:34.528804116 +0000 UTC m=+0.084424794 container health_status 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, architecture=x86_64, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 23 04:00:34 localhost podman[107606]: 2025-11-23 09:00:34.57399742 +0000 UTC m=+0.128990909 container health_status 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1) Nov 23 04:00:34 localhost podman[107605]: 2025-11-23 09:00:34.595012033 +0000 UTC m=+0.150632751 container exec_died 
6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, vcs-type=git, version=17.1.12, tcib_managed=true, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1) Nov 23 04:00:34 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Deactivated successfully. Nov 23 04:00:34 localhost podman[107606]: 2025-11-23 09:00:34.610304478 +0000 UTC m=+0.165297947 container exec_died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, distribution-scope=public, container_name=iscsid, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 23 04:00:34 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Deactivated successfully. Nov 23 04:00:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. 
Nov 23 04:00:36 localhost podman[107644]: Error: container d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 is not running Nov 23 04:00:36 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Main process exited, code=exited, status=125/n/a Nov 23 04:00:36 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Failed with result 'exit-code'. Nov 23 04:00:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 04:00:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 04:00:36 localhost podman[107655]: 2025-11-23 09:00:36.81924929 +0000 UTC m=+0.050560122 container health_status 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-cron, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 
cron, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond) Nov 23 04:00:36 localhost podman[107655]: 2025-11-23 09:00:36.857331142 +0000 UTC m=+0.088641994 container exec_died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true) Nov 23 04:00:36 localhost systemd[1]: tmp-crun.3EnnxA.mount: Deactivated successfully. Nov 23 04:00:36 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Deactivated successfully. 
Nov 23 04:00:36 localhost podman[107656]: 2025-11-23 09:00:36.874611549 +0000 UTC m=+0.098320226 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container) Nov 23 04:00:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14927 DF PROTO=TCP SPT=45750 DPT=9101 SEQ=1924788322 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B4FA200000000001030307) Nov 23 04:00:36 localhost podman[107656]: 2025-11-23 09:00:36.914313154 +0000 UTC m=+0.138021821 container exec_died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=nova_compute, io.buildah.version=1.41.4) Nov 23 04:00:36 localhost podman[107656]: unhealthy Nov 23 04:00:36 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:00:36 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'. Nov 23 04:00:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3890 DF PROTO=TCP SPT=48150 DPT=9100 SEQ=3852951719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B503610000000001030307) Nov 23 04:00:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 04:00:39 localhost systemd[1]: tmp-crun.0GqwL5.mount: Deactivated successfully. 
Nov 23 04:00:39 localhost podman[107695]: 2025-11-23 09:00:39.531789778 +0000 UTC m=+0.085060925 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, container_name=nova_migration_target, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container) Nov 23 04:00:39 localhost podman[107695]: 2025-11-23 09:00:39.90258957 +0000 UTC m=+0.455860697 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=nova_migration_target, 
url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute) Nov 23 04:00:39 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 04:00:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. Nov 23 04:00:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 04:00:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 04:00:42 localhost podman[107796]: 2025-11-23 09:00:42.796251915 +0000 UTC m=+0.099550394 container health_status 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
container_name=metrics_qdr, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-type=git, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64) Nov 23 04:00:42 localhost podman[107798]: 2025-11-23 09:00:42.828456476 +0000 UTC m=+0.119275068 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 04:00:42 localhost systemd[1]: tmp-crun.o2kKJ4.mount: Deactivated successfully. Nov 23 04:00:42 localhost podman[107797]: 2025-11-23 09:00:42.861315896 +0000 UTC m=+0.161089006 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, vcs-type=git, container_name=ovn_metadata_agent, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044) Nov 23 04:00:42 localhost 
podman[107798]: 2025-11-23 09:00:42.872315428 +0000 UTC m=+0.163133990 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 04:00:42 localhost podman[107798]: unhealthy Nov 23 04:00:42 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:00:42 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 04:00:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54848 DF PROTO=TCP SPT=33826 DPT=9882 SEQ=955523070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B511A00000000001030307) Nov 23 04:00:42 localhost podman[107797]: 2025-11-23 09:00:42.925555543 +0000 UTC m=+0.225328643 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent) Nov 23 04:00:42 localhost podman[107797]: unhealthy Nov 23 04:00:42 
localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:00:42 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 04:00:43 localhost podman[107796]: 2025-11-23 09:00:43.005242959 +0000 UTC m=+0.308541438 container exec_died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr) Nov 23 04:00:43 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Deactivated successfully. 
Nov 23 04:00:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54849 DF PROTO=TCP SPT=33826 DPT=9882 SEQ=955523070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B519A00000000001030307) Nov 23 04:00:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19458 DF PROTO=TCP SPT=35234 DPT=9102 SEQ=3403046721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B526200000000001030307) Nov 23 04:00:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44633 DF PROTO=TCP SPT=51672 DPT=9102 SEQ=4211352798 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B53AA00000000001030307) Nov 23 04:00:54 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 04:00:54 localhost recover_tripleo_nova_virtqemud[107865]: 61756 Nov 23 04:00:54 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 04:00:54 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 23 04:00:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54851 DF PROTO=TCP SPT=33826 DPT=9882 SEQ=955523070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B54A200000000001030307) Nov 23 04:00:57 localhost podman[107589]: time="2025-11-23T09:00:57Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL" Nov 23 04:00:57 localhost systemd[1]: libpod-d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.scope: Deactivated successfully. Nov 23 04:00:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:5b:6f:0e MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.107 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=41944 SEQ=2074507670 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Nov 23 04:00:57 localhost systemd[1]: libpod-d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.scope: Consumed 6.186s CPU time. 
Nov 23 04:00:57 localhost podman[107589]: 2025-11-23 09:00:57.44946856 +0000 UTC m=+42.092499030 container stop d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4) Nov 23 04:00:57 localhost podman[107589]: 2025-11-23 09:00:57.483054263 +0000 UTC m=+42.126084743 container died d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 23 04:00:57 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.timer: Deactivated successfully. Nov 23 04:00:57 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8. Nov 23 04:00:57 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Failed to open /run/systemd/transient/d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: No such file or directory Nov 23 04:00:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8-userdata-shm.mount: Deactivated successfully. 
Nov 23 04:00:57 localhost podman[107589]: 2025-11-23 09:00:57.591285536 +0000 UTC m=+42.234315976 container cleanup d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git) Nov 23 04:00:57 localhost podman[107589]: ceilometer_agent_ipmi Nov 23 04:00:57 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.timer: Failed to open /run/systemd/transient/d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.timer: No such file or directory Nov 23 04:00:57 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Failed to open /run/systemd/transient/d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: No such file or directory Nov 23 04:00:57 localhost podman[107867]: 2025-11-23 09:00:57.60877318 +0000 UTC m=+0.145951396 container cleanup d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Nov 23 04:00:57 localhost systemd[1]: libpod-conmon-d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.scope: Deactivated successfully. 
Nov 23 04:00:57 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.timer: Failed to open /run/systemd/transient/d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.timer: No such file or directory Nov 23 04:00:57 localhost systemd[1]: d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: Failed to open /run/systemd/transient/d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8.service: No such file or directory Nov 23 04:00:57 localhost podman[107882]: 2025-11-23 09:00:57.713145913 +0000 UTC m=+0.069578664 container cleanup d0f474ab61bd1bdc917ea71c15791f862eb67c8a5b6bd75b8ab69d98e7fe7bc8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=) Nov 23 04:00:57 localhost podman[107882]: ceilometer_agent_ipmi Nov 23 04:00:57 localhost systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully. Nov 23 04:00:57 localhost systemd[1]: Stopped ceilometer_agent_ipmi container. Nov 23 04:00:58 localhost systemd[1]: var-lib-containers-storage-overlay-a8785be8dea5fa0361315af1fc74fe453e62e737d2ff3d773f6811b45d15cd9a-merged.mount: Deactivated successfully. Nov 23 04:00:58 localhost python3.9[107984]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:00:58 localhost systemd[1]: Reloading. Nov 23 04:00:58 localhost systemd-sysv-generator[108009]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:00:58 localhost systemd-rc-local-generator[108006]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:00:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:00:58 localhost systemd[1]: Stopping collectd container... Nov 23 04:00:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48175 DF PROTO=TCP SPT=38494 DPT=9105 SEQ=3740944446 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5543B0000000001030307) Nov 23 04:01:02 localhost systemd[1]: libpod-6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.scope: Deactivated successfully. Nov 23 04:01:02 localhost systemd[1]: libpod-6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.scope: Consumed 2.080s CPU time. 
Nov 23 04:01:02 localhost podman[108024]: 2025-11-23 09:01:02.641099993 +0000 UTC m=+3.758452891 container died 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, container_name=collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 04:01:02 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.timer: Deactivated successfully. Nov 23 04:01:02 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb. Nov 23 04:01:02 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Failed to open /run/systemd/transient/6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: No such file or directory Nov 23 04:01:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb-userdata-shm.mount: Deactivated successfully. 
Nov 23 04:01:02 localhost podman[108024]: 2025-11-23 09:01:02.759037167 +0000 UTC m=+3.876390065 container cleanup 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, distribution-scope=public, version=17.1.12, container_name=collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 04:01:02 localhost podman[108024]: collectd Nov 23 04:01:02 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.timer: Failed to open /run/systemd/transient/6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.timer: No such file or directory Nov 23 04:01:02 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Failed to open /run/systemd/transient/6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: No such file or directory Nov 23 04:01:02 localhost podman[108063]: 2025-11-23 09:01:02.783932131 +0000 UTC m=+0.131741955 container cleanup 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, 
description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 04:01:02 localhost systemd[1]: tripleo_collectd.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:01:02 localhost systemd[1]: libpod-conmon-6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.scope: Deactivated successfully. Nov 23 04:01:02 localhost podman[108096]: error opening file `/run/crun/6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb/status`: No such file or directory Nov 23 04:01:02 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.timer: Failed to open /run/systemd/transient/6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.timer: No such file or directory Nov 23 04:01:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51060 DF PROTO=TCP SPT=50522 DPT=9101 SEQ=2870176415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B55FA00000000001030307) Nov 23 04:01:02 localhost systemd[1]: 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: Failed to open /run/systemd/transient/6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb.service: No such file or directory Nov 23 04:01:02 localhost podman[108085]: 2025-11-23 09:01:02.897157279 +0000 UTC m=+0.085632022 container cleanup 6ac7fb077afda0c58848027d81bec201797abfa8662f98a3624f4d5dcf83a7eb (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., container_name=collectd, 
build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Nov 23 04:01:02 localhost podman[108085]: collectd Nov 23 04:01:02 localhost systemd[1]: tripleo_collectd.service: Failed with result 'exit-code'. Nov 23 04:01:02 localhost systemd[1]: Stopped collectd container. Nov 23 04:01:03 localhost python3.9[108192]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:01:03 localhost systemd[1]: var-lib-containers-storage-overlay-c10c17546222fa25908fa407b62ea6cd65af6052be72cbf7a83d74560c0146ad-merged.mount: Deactivated successfully. Nov 23 04:01:03 localhost systemd[1]: Reloading. Nov 23 04:01:03 localhost systemd-rc-local-generator[108216]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:01:03 localhost systemd-sysv-generator[108220]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:01:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:01:03 localhost systemd[1]: Stopping iscsid container... Nov 23 04:01:04 localhost systemd[1]: libpod-85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.scope: Deactivated successfully. Nov 23 04:01:04 localhost systemd[1]: libpod-85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.scope: Consumed 1.095s CPU time. 
Nov 23 04:01:04 localhost podman[108232]: 2025-11-23 09:01:04.063393518 +0000 UTC m=+0.079843613 container died 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, container_name=iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 04:01:04 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.timer: Deactivated successfully. Nov 23 04:01:04 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a. Nov 23 04:01:04 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Failed to open /run/systemd/transient/85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: No such file or directory Nov 23 04:01:04 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a-userdata-shm.mount: Deactivated successfully. 
Nov 23 04:01:04 localhost podman[108232]: 2025-11-23 09:01:04.097878779 +0000 UTC m=+0.114328884 container cleanup 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, 
architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git) Nov 23 04:01:04 localhost podman[108232]: iscsid Nov 23 04:01:04 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.timer: Failed to open /run/systemd/transient/85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.timer: No such file or directory Nov 23 04:01:04 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Failed to open /run/systemd/transient/85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: No such file or directory Nov 23 04:01:04 localhost podman[108244]: 2025-11-23 09:01:04.130779752 +0000 UTC m=+0.058277202 container cleanup 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12) Nov 23 04:01:04 localhost systemd[1]: libpod-conmon-85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.scope: Deactivated successfully. 
Nov 23 04:01:04 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.timer: Failed to open /run/systemd/transient/85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.timer: No such file or directory Nov 23 04:01:04 localhost systemd[1]: 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: Failed to open /run/systemd/transient/85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a.service: No such file or directory Nov 23 04:01:04 localhost podman[108258]: 2025-11-23 09:01:04.215197545 +0000 UTC m=+0.052545624 container cleanup 85db100d0b918beb23dddd6c709432c18fcaba25213c1fd5cb17ce190843167a (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 04:01:04 localhost podman[108258]: iscsid Nov 23 04:01:04 localhost systemd[1]: tripleo_iscsid.service: Deactivated successfully. Nov 23 04:01:04 localhost systemd[1]: Stopped iscsid container. Nov 23 04:01:04 localhost systemd[1]: var-lib-containers-storage-overlay-b27f47826dfdd9f94283f3471bc6c8f7a332741b941e529e6c99a436b7305250-merged.mount: Deactivated successfully. Nov 23 04:01:04 localhost python3.9[108362]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:01:06 localhost systemd[1]: Reloading. 
Nov 23 04:01:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25369 DF PROTO=TCP SPT=39460 DPT=9100 SEQ=2134616091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B56C200000000001030307) Nov 23 04:01:06 localhost systemd-sysv-generator[108396]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:01:06 localhost systemd-rc-local-generator[108393]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:01:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:01:06 localhost systemd[1]: Stopping logrotate_crond container... Nov 23 04:01:06 localhost systemd[1]: libpod-53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.scope: Deactivated successfully. 
Nov 23 04:01:06 localhost podman[108404]: 2025-11-23 09:01:06.473119066 +0000 UTC m=+0.067590741 container died 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 23 04:01:06 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.timer: Deactivated successfully. Nov 23 04:01:06 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c. Nov 23 04:01:06 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Failed to open /run/systemd/transient/53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: No such file or directory Nov 23 04:01:06 localhost podman[108404]: 2025-11-23 09:01:06.528646971 +0000 UTC m=+0.123118556 container cleanup 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, name=rhosp17/openstack-cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Nov 23 04:01:06 localhost podman[108404]: logrotate_crond Nov 23 04:01:06 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.timer: Failed to open /run/systemd/transient/53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.timer: No such file or directory Nov 23 04:01:06 localhost systemd[1]: 
53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Failed to open /run/systemd/transient/53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: No such file or directory Nov 23 04:01:06 localhost podman[108416]: 2025-11-23 09:01:06.566067014 +0000 UTC m=+0.086687375 container cleanup 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, release=1761123044, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron) Nov 23 04:01:06 localhost systemd[1]: libpod-conmon-53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.scope: Deactivated successfully. Nov 23 04:01:06 localhost podman[108444]: error opening file `/run/crun/53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c/status`: No such file or directory Nov 23 04:01:06 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.timer: Failed to open /run/systemd/transient/53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.timer: No such file or directory Nov 23 04:01:06 localhost systemd[1]: 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: Failed to open /run/systemd/transient/53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c.service: No such file or directory Nov 23 04:01:06 localhost podman[108433]: 2025-11-23 09:01:06.673290845 +0000 UTC m=+0.075641871 container cleanup 53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z) Nov 23 04:01:06 localhost podman[108433]: logrotate_crond Nov 23 04:01:06 localhost systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully. Nov 23 04:01:06 localhost systemd[1]: Stopped logrotate_crond container. Nov 23 04:01:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 04:01:07 localhost podman[108539]: 2025-11-23 09:01:07.261018708 +0000 UTC m=+0.060039256 container health_status e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 04:01:07 localhost podman[108539]: 2025-11-23 09:01:07.288281336 +0000 UTC m=+0.087301954 container exec_died 
e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 23 04:01:07 localhost podman[108539]: unhealthy Nov 23 04:01:07 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:01:07 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'. Nov 23 04:01:07 localhost systemd[1]: var-lib-containers-storage-overlay-503e987379308b9e6b9946670c4ac6382bcf032235dd53dd51b9045ac75aedc7-merged.mount: Deactivated successfully. Nov 23 04:01:07 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53fb16e3865883b67951ac3af2ba91e7825740021ac0eb878ee07b861a3d8a6c-userdata-shm.mount: Deactivated successfully. 
Nov 23 04:01:07 localhost python3.9[108538]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:01:07 localhost systemd[1]: Reloading. Nov 23 04:01:07 localhost systemd-sysv-generator[108588]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:01:07 localhost systemd-rc-local-generator[108584]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:01:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:01:07 localhost systemd[1]: Stopping metrics_qdr container... Nov 23 04:01:07 localhost kernel: qdrouterd[54470]: segfault at 0 ip 00007fe453f4d7cb sp 00007fff8ee61160 error 4 in libc.so.6[7fe453eea000+175000] Nov 23 04:01:07 localhost kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9 Nov 23 04:01:07 localhost systemd[1]: Created slice Slice /system/systemd-coredump. Nov 23 04:01:07 localhost systemd[1]: Started Process Core Dump (PID 108610/UID 0). Nov 23 04:01:08 localhost systemd-coredump[108611]: Resource limits disable core dumping for process 54470 (qdrouterd). Nov 23 04:01:08 localhost systemd-coredump[108611]: Process 54470 (qdrouterd) of user 42465 dumped core. Nov 23 04:01:08 localhost systemd[1]: systemd-coredump@0-108610-0.service: Deactivated successfully. 
Nov 23 04:01:08 localhost podman[108598]: 2025-11-23 09:01:08.090883645 +0000 UTC m=+0.240404651 container died 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, 
batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 23 04:01:08 localhost systemd[1]: libpod-019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.scope: Deactivated successfully. Nov 23 04:01:08 localhost systemd[1]: libpod-019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.scope: Consumed 27.405s CPU time. Nov 23 04:01:08 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.timer: Deactivated successfully. Nov 23 04:01:08 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2. 
Nov 23 04:01:08 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Failed to open /run/systemd/transient/019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: No such file or directory Nov 23 04:01:08 localhost podman[108598]: 2025-11-23 09:01:08.141765497 +0000 UTC m=+0.291286473 container cleanup 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public) Nov 23 04:01:08 localhost podman[108598]: metrics_qdr Nov 23 04:01:08 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.timer: Failed to open /run/systemd/transient/019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.timer: No such file or directory Nov 23 04:01:08 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Failed to open /run/systemd/transient/019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: No such file or directory Nov 23 04:01:08 localhost podman[108615]: 2025-11-23 09:01:08.163111829 +0000 UTC m=+0.061783390 container cleanup 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 
'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 
04:01:08 localhost systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a Nov 23 04:01:08 localhost systemd[1]: libpod-conmon-019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.scope: Deactivated successfully. Nov 23 04:01:08 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.timer: Failed to open /run/systemd/transient/019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.timer: No such file or directory Nov 23 04:01:08 localhost systemd[1]: 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: Failed to open /run/systemd/transient/019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2.service: No such file or directory Nov 23 04:01:08 localhost podman[108627]: 2025-11-23 09:01:08.265304955 +0000 UTC m=+0.067927302 container cleanup 019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, container_name=metrics_qdr, release=1761123044, vcs-type=git, architecture=x86_64, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '64da22351939caf7431a331d2f0c888a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vendor=Red Hat, Inc., io.openshift.expose-services=) Nov 23 04:01:08 localhost podman[108627]: metrics_qdr Nov 23 04:01:08 localhost systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'. Nov 23 04:01:08 localhost systemd[1]: Stopped metrics_qdr container. Nov 23 04:01:08 localhost systemd[1]: var-lib-containers-storage-overlay-e71eeb53ae033c058204425a46efa30ecb751cf5037dd51b11ece79b90149ba3-merged.mount: Deactivated successfully. Nov 23 04:01:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-019c27ab88838363c8468b3bf86ff69cfb0712fe62564f51dcb6c611915a66e2-userdata-shm.mount: Deactivated successfully. 
Nov 23 04:01:09 localhost python3.9[108731]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:01:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47292 DF PROTO=TCP SPT=53018 DPT=9100 SEQ=1986790061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B578600000000001030307) Nov 23 04:01:09 localhost python3.9[108824]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:01:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 04:01:10 localhost podman[108872]: 2025-11-23 09:01:10.027150242 +0000 UTC m=+0.070472152 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:36:58Z) Nov 23 04:01:10 localhost podman[108872]: 2025-11-23 09:01:10.424190879 +0000 UTC m=+0.467512739 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, release=1761123044, vcs-type=git, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, 
config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container) Nov 23 04:01:10 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 04:01:10 localhost python3.9[108938]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:01:11 localhost python3.9[109032]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:01:11 localhost systemd[1]: Reloading. Nov 23 04:01:11 localhost systemd-rc-local-generator[109060]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:01:11 localhost systemd-sysv-generator[109064]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:01:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:01:11 localhost systemd[1]: Stopping nova_compute container... 
Nov 23 04:01:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63102 DF PROTO=TCP SPT=40282 DPT=9882 SEQ=787887118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B586A10000000001030307) Nov 23 04:01:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 04:01:13 localhost systemd[1]: tmp-crun.eHVT1m.mount: Deactivated successfully. Nov 23 04:01:13 localhost podman[109085]: 2025-11-23 09:01:13.026143631 +0000 UTC m=+0.081967478 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Nov 23 04:01:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 04:01:13 localhost podman[109085]: 2025-11-23 09:01:13.05249083 +0000 UTC m=+0.108314647 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1761123044, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 04:01:13 localhost podman[109085]: unhealthy Nov 23 04:01:13 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:01:13 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 04:01:13 localhost podman[109105]: 2025-11-23 09:01:13.139013628 +0000 UTC m=+0.084947870 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, version=17.1.12, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 04:01:13 localhost podman[109105]: 2025-11-23 09:01:13.161518657 +0000 UTC m=+0.107452949 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 04:01:13 localhost podman[109105]: unhealthy Nov 23 04:01:13 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:01:13 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 04:01:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63103 DF PROTO=TCP SPT=40282 DPT=9882 SEQ=787887118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B58EA10000000001030307) Nov 23 04:01:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63104 DF PROTO=TCP SPT=40282 DPT=9882 SEQ=787887118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B59E600000000001030307) Nov 23 04:01:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1500 DF PROTO=TCP SPT=36602 DPT=9102 SEQ=2053157803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5AFE10000000001030307) Nov 23 04:01:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63105 DF PROTO=TCP SPT=40282 DPT=9882 SEQ=787887118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5BE200000000001030307) Nov 23 04:01:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 
PREC=0x00 TTL=62 ID=26952 DF PROTO=TCP SPT=37260 DPT=9101 SEQ=742382203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5C8BA0000000001030307) Nov 23 04:01:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37594 DF PROTO=TCP SPT=52720 DPT=9105 SEQ=4006679888 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5C96B0000000001030307) Nov 23 04:01:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26954 DF PROTO=TCP SPT=37260 DPT=9101 SEQ=742382203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5D4E10000000001030307) Nov 23 04:01:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3893 DF PROTO=TCP SPT=48150 DPT=9100 SEQ=3852951719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5E2200000000001030307) Nov 23 04:01:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. Nov 23 04:01:37 localhost podman[109127]: Error: container e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce is not running Nov 23 04:01:37 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Main process exited, code=exited, status=125/n/a Nov 23 04:01:37 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed with result 'exit-code'. 
Nov 23 04:01:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5753 DF PROTO=TCP SPT=39194 DPT=9100 SEQ=359370144 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5EDA00000000001030307) Nov 23 04:01:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 04:01:41 localhost systemd[1]: tmp-crun.y0IJaQ.mount: Deactivated successfully. Nov 23 04:01:41 localhost podman[109140]: 2025-11-23 09:01:41.020411024 +0000 UTC m=+0.078132389 container health_status e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 04:01:41 localhost podman[109140]: 2025-11-23 09:01:41.379232734 +0000 UTC m=+0.436954039 container exec_died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1761123044) Nov 23 04:01:41 localhost systemd[1]: 
e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Deactivated successfully. Nov 23 04:01:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8215 DF PROTO=TCP SPT=55658 DPT=9882 SEQ=874165632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B5FBE00000000001030307) Nov 23 04:01:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 04:01:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 04:01:43 localhost podman[109240]: 2025-11-23 09:01:43.283785155 +0000 UTC m=+0.081536474 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, url=https://www.redhat.com, architecture=x86_64) Nov 23 04:01:43 localhost podman[109240]: 2025-11-23 09:01:43.298049598 +0000 UTC m=+0.095800927 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 04:01:43 localhost podman[109240]: unhealthy Nov 23 04:01:43 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:01:43 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 04:01:43 localhost podman[109239]: 2025-11-23 09:01:43.388913583 +0000 UTC m=+0.189693466 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., 
batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible) Nov 23 04:01:43 localhost podman[109239]: 2025-11-23 09:01:43.405168217 +0000 UTC m=+0.205948130 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 04:01:43 localhost podman[109239]: unhealthy Nov 23 04:01:43 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:01:43 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 04:01:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8216 DF PROTO=TCP SPT=55658 DPT=9882 SEQ=874165632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B603E00000000001030307) Nov 23 04:01:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1502 DF PROTO=TCP SPT=36602 DPT=9102 SEQ=2053157803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B610200000000001030307) Nov 23 04:01:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38711 DF PROTO=TCP SPT=52154 DPT=9102 SEQ=4065124393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B624E00000000001030307) Nov 23 04:01:53 localhost podman[109072]: time="2025-11-23T09:01:53Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL" Nov 23 04:01:53 localhost systemd[1]: tmp-crun.CnmSQH.mount: Deactivated 
successfully. Nov 23 04:01:53 localhost systemd[1]: session-c11.scope: Deactivated successfully. Nov 23 04:01:53 localhost systemd[1]: libpod-e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.scope: Deactivated successfully. Nov 23 04:01:53 localhost systemd[1]: libpod-e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.scope: Consumed 35.074s CPU time. Nov 23 04:01:53 localhost podman[109072]: 2025-11-23 09:01:53.617773977 +0000 UTC m=+42.108239073 container died e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public) Nov 23 04:01:53 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.timer: Deactivated successfully. Nov 23 04:01:53 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce. 
Nov 23 04:01:53 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed to open /run/systemd/transient/e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: No such file or directory Nov 23 04:01:53 localhost systemd[1]: tmp-crun.6IpaAq.mount: Deactivated successfully. Nov 23 04:01:53 localhost podman[109072]: 2025-11-23 09:01:53.71634472 +0000 UTC m=+42.206809816 container cleanup e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, architecture=x86_64, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 23 04:01:53 localhost podman[109072]: nova_compute Nov 23 04:01:53 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.timer: Failed to open /run/systemd/transient/e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.timer: No such file or directory Nov 23 04:01:53 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed to 
open /run/systemd/transient/e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: No such file or directory Nov 23 04:01:53 localhost podman[109280]: 2025-11-23 09:01:53.756958662 +0000 UTC m=+0.123005343 container cleanup e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-nova-compute, container_name=nova_compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 
3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Nov 23 04:01:53 localhost systemd[1]: libpod-conmon-e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.scope: Deactivated successfully. 
Nov 23 04:01:53 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.timer: Failed to open /run/systemd/transient/e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.timer: No such file or directory Nov 23 04:01:53 localhost systemd[1]: e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: Failed to open /run/systemd/transient/e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce.service: No such file or directory Nov 23 04:01:53 localhost podman[109295]: 2025-11-23 09:01:53.833293454 +0000 UTC m=+0.048444636 container cleanup e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, config_id=tripleo_step5, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-type=git) Nov 23 04:01:53 localhost podman[109295]: nova_compute Nov 23 04:01:53 localhost systemd[1]: tripleo_nova_compute.service: Deactivated successfully. Nov 23 04:01:53 localhost systemd[1]: Stopped nova_compute container. 
Nov 23 04:01:53 localhost systemd[1]: tripleo_nova_compute.service: Consumed 1.088s CPU time, no IO. Nov 23 04:01:54 localhost systemd[1]: var-lib-containers-storage-overlay-e8ecc5a7d0819463bef271e161d4bdd609525e6c41d5f8ce9ddcd57558c51829-merged.mount: Deactivated successfully. Nov 23 04:01:54 localhost python3.9[109398]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:01:54 localhost systemd[1]: Reloading. Nov 23 04:01:54 localhost systemd-sysv-generator[109429]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:01:54 localhost systemd-rc-local-generator[109425]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:01:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:01:55 localhost systemd[1]: Stopping nova_migration_target container... Nov 23 04:01:55 localhost systemd[1]: libpod-e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.scope: Deactivated successfully. Nov 23 04:01:55 localhost systemd[1]: libpod-e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.scope: Consumed 34.442s CPU time. 
Nov 23 04:01:55 localhost podman[109439]: 2025-11-23 09:01:55.089667284 +0000 UTC m=+0.078474550 container died e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target) Nov 23 04:01:55 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.timer: Deactivated successfully. Nov 23 04:01:55 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7. Nov 23 04:01:55 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Failed to open /run/systemd/transient/e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: No such file or directory Nov 23 04:01:55 localhost systemd[1]: tmp-crun.Td4eE0.mount: Deactivated successfully. 
Nov 23 04:01:55 localhost podman[109439]: 2025-11-23 09:01:55.146634434 +0000 UTC m=+0.135441680 container cleanup e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, container_name=nova_migration_target) Nov 23 04:01:55 localhost podman[109439]: nova_migration_target Nov 23 04:01:55 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.timer: Failed to open /run/systemd/transient/e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.timer: No such file or directory Nov 23 04:01:55 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Failed to open /run/systemd/transient/e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: No such file or directory Nov 23 04:01:55 localhost podman[109451]: 2025-11-23 09:01:55.172590861 +0000 UTC m=+0.072076691 container cleanup e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, vcs-type=git, container_name=nova_migration_target, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 
nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 04:01:55 localhost systemd[1]: libpod-conmon-e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.scope: Deactivated successfully. 
Nov 23 04:01:55 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.timer: Failed to open /run/systemd/transient/e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.timer: No such file or directory Nov 23 04:01:55 localhost systemd[1]: e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: Failed to open /run/systemd/transient/e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7.service: No such file or directory Nov 23 04:01:55 localhost podman[109463]: 2025-11-23 09:01:55.264356032 +0000 UTC m=+0.065752484 container cleanup e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z) Nov 23 04:01:55 localhost podman[109463]: nova_migration_target Nov 23 04:01:55 localhost systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully. Nov 23 04:01:55 localhost systemd[1]: Stopped nova_migration_target container. Nov 23 04:01:55 localhost systemd[1]: var-lib-containers-storage-overlay-5ec9c7891f4ca72bdb5effe2aebfd354dedf1d89abf9e4437e67303ad6ef96e5-merged.mount: Deactivated successfully. Nov 23 04:01:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8c7572c79b0c7ac28c8b872c53b8daf0e19788ce1b6afbd2047787a520b03a7-userdata-shm.mount: Deactivated successfully. 
Nov 23 04:01:56 localhost python3.9[109566]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:01:56 localhost systemd[1]: Reloading. Nov 23 04:01:56 localhost systemd-rc-local-generator[109594]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:01:56 localhost systemd-sysv-generator[109598]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:01:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:01:56 localhost systemd[1]: Stopping nova_virtlogd_wrapper container... Nov 23 04:01:56 localhost systemd[1]: libpod-11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11.scope: Deactivated successfully. 
Nov 23 04:01:56 localhost podman[109606]: 2025-11-23 09:01:56.487454748 +0000 UTC m=+0.062802162 container died 11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, url=https://www.redhat.com, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, container_name=nova_virtlogd_wrapper, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 04:01:56 localhost podman[109606]: 2025-11-23 09:01:56.528087351 +0000 UTC m=+0.103434745 container cleanup 11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, config_id=tripleo_step3, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_virtlogd_wrapper, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Nov 23 04:01:56 localhost podman[109606]: nova_virtlogd_wrapper Nov 23 04:01:56 localhost podman[109620]: 2025-11-23 09:01:56.550952131 +0000 UTC m=+0.052471502 container cleanup 11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.openshift.expose-services=, container_name=nova_virtlogd_wrapper, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, architecture=x86_64, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 23 04:01:56 localhost systemd[1]: tmp-crun.A4xUYe.mount: Deactivated successfully. Nov 23 04:01:56 localhost systemd[1]: var-lib-containers-storage-overlay-43fdb0bbe8f4b890331649f0e6375660713274fad2f53aa9281f6430b3f23b11-merged.mount: Deactivated successfully. Nov 23 04:01:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11-userdata-shm.mount: Deactivated successfully. Nov 23 04:01:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8218 DF PROTO=TCP SPT=55658 DPT=9882 SEQ=874165632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B634200000000001030307) Nov 23 04:01:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14724 DF PROTO=TCP SPT=53910 DPT=9101 SEQ=3315328708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B63DEA0000000001030307) Nov 23 04:01:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19195 DF PROTO=TCP SPT=34468 DPT=9105 SEQ=346067449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B63E9B0000000001030307) Nov 23 04:02:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 04:02:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 
600.0 interval#012Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 04:02:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19197 DF PROTO=TCP SPT=34468 DPT=9105 SEQ=346067449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B64AA10000000001030307) Nov 23 04:02:03 localhost systemd[1]: Stopping User Manager for UID 0... Nov 23 04:02:03 localhost systemd[84232]: Activating special unit Exit the Session... Nov 23 04:02:03 localhost systemd[84232]: Removed slice User Background Tasks Slice. Nov 23 04:02:03 localhost systemd[84232]: Stopped target Main User Target. Nov 23 04:02:03 localhost systemd[84232]: Stopped target Basic System. Nov 23 04:02:03 localhost systemd[84232]: Stopped target Paths. Nov 23 04:02:03 localhost systemd[84232]: Stopped target Sockets. Nov 23 04:02:03 localhost systemd[84232]: Stopped target Timers. Nov 23 04:02:03 localhost systemd[84232]: Stopped Daily Cleanup of User's Temporary Directories. Nov 23 04:02:03 localhost systemd[84232]: Closed D-Bus User Message Bus Socket. Nov 23 04:02:03 localhost systemd[84232]: Stopped Create User's Volatile Files and Directories. Nov 23 04:02:03 localhost systemd[84232]: Removed slice User Application Slice. Nov 23 04:02:03 localhost systemd[84232]: Reached target Shutdown. Nov 23 04:02:03 localhost systemd[84232]: Finished Exit the Session. 
Nov 23 04:02:03 localhost systemd[84232]: Reached target Exit the Session. Nov 23 04:02:03 localhost systemd[1]: user@0.service: Deactivated successfully. Nov 23 04:02:03 localhost systemd[1]: Stopped User Manager for UID 0. Nov 23 04:02:03 localhost systemd[1]: user@0.service: Consumed 4.298s CPU time, no IO. Nov 23 04:02:03 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Nov 23 04:02:03 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Nov 23 04:02:03 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Nov 23 04:02:03 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Nov 23 04:02:03 localhost systemd[1]: Removed slice User Slice of UID 0. Nov 23 04:02:03 localhost systemd[1]: user-0.slice: Consumed 5.280s CPU time. Nov 23 04:02:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 04:02:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 5736 writes, 25K keys, 5736 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 04:02:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47295 DF PROTO=TCP SPT=53018 DPT=9100 SEQ=1986790061 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B656200000000001030307) Nov 23 04:02:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 
MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=840 DF PROTO=TCP SPT=58022 DPT=9100 SEQ=3884394366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B662E00000000001030307) Nov 23 04:02:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35584 DF PROTO=TCP SPT=48126 DPT=9882 SEQ=854778408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B671210000000001030307) Nov 23 04:02:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 04:02:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 04:02:13 localhost podman[109640]: 2025-11-23 09:02:13.533529915 +0000 UTC m=+0.085813056 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 
17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4) Nov 23 04:02:13 localhost podman[109640]: 2025-11-23 09:02:13.574970453 +0000 UTC m=+0.127253564 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container) Nov 23 04:02:13 localhost podman[109640]: unhealthy Nov 23 04:02:13 localhost systemd[1]: tmp-crun.BZ5Zpd.mount: Deactivated successfully. 
Nov 23 04:02:13 localhost podman[109639]: 2025-11-23 09:02:13.596347978 +0000 UTC m=+0.145820012 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044) Nov 23 04:02:13 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:02:13 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 04:02:13 localhost podman[109639]: 2025-11-23 09:02:13.619437625 +0000 UTC m=+0.168909649 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent) Nov 23 04:02:13 localhost podman[109639]: unhealthy Nov 23 04:02:13 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:02:13 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 04:02:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35585 DF PROTO=TCP SPT=48126 DPT=9882 SEQ=854778408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B679200000000001030307) Nov 23 04:02:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38713 DF PROTO=TCP SPT=52154 DPT=9102 SEQ=4065124393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B686200000000001030307) Nov 23 04:02:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58598 DF PROTO=TCP SPT=51150 DPT=9102 SEQ=102310858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B69A200000000001030307) Nov 23 04:02:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35587 DF PROTO=TCP SPT=48126 DPT=9882 SEQ=854778408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6AA200000000001030307) Nov 23 04:02:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26604 DF PROTO=TCP SPT=56220 DPT=9101 SEQ=3970731790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6B31A0000000001030307) Nov 23 04:02:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45399 DF PROTO=TCP SPT=38422 DPT=9105 SEQ=592499428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2B6B3CC0000000001030307) Nov 23 04:02:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26606 DF PROTO=TCP SPT=56220 DPT=9101 SEQ=3970731790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6BF200000000001030307) Nov 23 04:02:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5756 DF PROTO=TCP SPT=39194 DPT=9100 SEQ=359370144 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6CC200000000001030307) Nov 23 04:02:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49384 DF PROTO=TCP SPT=46730 DPT=9100 SEQ=505397214 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6D8200000000001030307) Nov 23 04:02:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53432 DF PROTO=TCP SPT=60278 DPT=9882 SEQ=2084944691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6E6600000000001030307) Nov 23 04:02:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 04:02:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 04:02:43 localhost podman[109715]: 2025-11-23 09:02:43.757111666 +0000 UTC m=+0.060453920 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T23:34:05Z, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 23 04:02:43 localhost podman[109715]: 2025-11-23 09:02:43.799800773 +0000 UTC m=+0.103143037 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044) Nov 23 04:02:43 localhost podman[109715]: unhealthy Nov 23 04:02:43 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:02:43 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 04:02:43 localhost podman[109709]: 2025-11-23 09:02:43.81097223 +0000 UTC m=+0.114223031 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc.) 
Nov 23 04:02:43 localhost podman[109709]: 2025-11-23 09:02:43.843101219 +0000 UTC m=+0.146352000 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 23 04:02:43 localhost podman[109709]: unhealthy Nov 23 04:02:43 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:02:43 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 04:02:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53433 DF PROTO=TCP SPT=60278 DPT=9882 SEQ=2084944691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6EE600000000001030307) Nov 23 04:02:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58600 DF PROTO=TCP SPT=51150 DPT=9102 SEQ=102310858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B6FA210000000001030307) Nov 23 04:02:51 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 23 04:02:51 localhost recover_tripleo_nova_virtqemud[109795]: 61756 Nov 23 04:02:51 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 23 04:02:51 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 23 04:02:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22741 DF PROTO=TCP SPT=59924 DPT=9102 SEQ=3480946583 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B70F600000000001030307) Nov 23 04:02:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53435 DF PROTO=TCP SPT=60278 DPT=9882 SEQ=2084944691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B71E200000000001030307) Nov 23 04:02:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24647 DF PROTO=TCP SPT=33646 DPT=9101 SEQ=3931588753 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7284A0000000001030307) Nov 23 04:02:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2008 DF PROTO=TCP SPT=50106 DPT=9105 SEQ=2647242858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B728FB0000000001030307) Nov 23 04:03:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24649 DF PROTO=TCP SPT=33646 DPT=9101 SEQ=3931588753 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B734610000000001030307) Nov 23 04:03:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=843 DF PROTO=TCP SPT=58022 DPT=9100 SEQ=3884394366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2B742200000000001030307) Nov 23 04:03:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23464 DF PROTO=TCP SPT=57104 DPT=9100 SEQ=3587821903 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B74D210000000001030307) Nov 23 04:03:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58452 DF PROTO=TCP SPT=47578 DPT=9882 SEQ=4215290881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B75B600000000001030307) Nov 23 04:03:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 04:03:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 04:03:14 localhost podman[109797]: 2025-11-23 09:03:14.031824784 +0000 UTC m=+0.085527040 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, 
tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com) Nov 23 04:03:14 localhost podman[109796]: 2025-11-23 09:03:14.083012374 +0000 UTC m=+0.137732841 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 04:03:14 localhost podman[109797]: 2025-11-23 09:03:14.099957151 +0000 UTC m=+0.153659437 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 04:03:14 localhost podman[109797]: unhealthy Nov 23 04:03:14 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:03:14 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 04:03:14 localhost podman[109796]: 2025-11-23 09:03:14.119810727 +0000 UTC m=+0.174531154 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 04:03:14 localhost podman[109796]: unhealthy Nov 23 04:03:14 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:03:14 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 04:03:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58453 DF PROTO=TCP SPT=47578 DPT=9882 SEQ=4215290881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B763600000000001030307) Nov 23 04:03:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22743 DF PROTO=TCP SPT=59924 DPT=9102 SEQ=3480946583 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B770200000000001030307) Nov 23 04:03:20 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing. Nov 23 04:03:20 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 60965 (conmon) with signal SIGKILL. Nov 23 04:03:20 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL Nov 23 04:03:20 localhost systemd[1]: libpod-conmon-11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11.scope: Deactivated successfully. Nov 23 04:03:20 localhost systemd[1]: tmp-crun.pORbxB.mount: Deactivated successfully. 
Nov 23 04:03:20 localhost podman[109850]: error opening file `/run/crun/11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11/status`: No such file or directory Nov 23 04:03:20 localhost podman[109838]: 2025-11-23 09:03:20.76639882 +0000 UTC m=+0.075678812 container cleanup 11dfd1bba91fa3da488de5a1540b1d56b53c4f0cc2c05ba55b9518a273f93f11 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtlogd_wrapper, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 23 04:03:20 localhost podman[109838]: nova_virtlogd_wrapper Nov 23 04:03:20 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'. Nov 23 04:03:20 localhost systemd[1]: Stopped nova_virtlogd_wrapper container. 
Nov 23 04:03:21 localhost python3.9[109943]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:03:21 localhost systemd[1]: Reloading. Nov 23 04:03:21 localhost systemd-rc-local-generator[109972]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:03:21 localhost systemd-sysv-generator[109977]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:03:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:03:21 localhost systemd[1]: Stopping nova_virtnodedevd container... Nov 23 04:03:21 localhost systemd[1]: libpod-aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd.scope: Deactivated successfully. Nov 23 04:03:21 localhost systemd[1]: libpod-aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd.scope: Consumed 1.471s CPU time. 
Nov 23 04:03:22 localhost podman[109985]: 2025-11-23 09:03:22.000997184 +0000 UTC m=+0.059594904 container died aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtnodedevd, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:35:22Z, version=17.1.12, architecture=x86_64) Nov 23 04:03:22 localhost systemd[1]: tmp-crun.n2Pgn8.mount: Deactivated successfully. Nov 23 04:03:22 localhost systemd[1]: tmp-crun.RuIsjj.mount: Deactivated successfully. 
Nov 23 04:03:22 localhost podman[109985]: 2025-11-23 09:03:22.055176557 +0000 UTC m=+0.113774197 container cleanup aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, config_id=tripleo_step3, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, container_name=nova_virtnodedevd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git) Nov 23 04:03:22 localhost podman[109985]: nova_virtnodedevd Nov 23 04:03:22 localhost podman[109999]: 2025-11-23 09:03:22.120257059 +0000 UTC m=+0.109399201 container cleanup aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, 
build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., container_name=nova_virtnodedevd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible) Nov 23 04:03:22 localhost systemd[1]: libpod-conmon-aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd.scope: Deactivated successfully. Nov 23 04:03:22 localhost podman[110026]: error opening file `/run/crun/aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd/status`: No such file or directory Nov 23 04:03:22 localhost podman[110014]: 2025-11-23 09:03:22.235381076 +0000 UTC m=+0.073035760 container cleanup aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, 
vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, container_name=nova_virtnodedevd) Nov 23 04:03:22 localhost podman[110014]: nova_virtnodedevd Nov 23 04:03:22 localhost systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully. Nov 23 04:03:22 localhost systemd[1]: Stopped nova_virtnodedevd container. Nov 23 04:03:22 localhost sshd[110112]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:03:22 localhost python3.9[110121]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:03:22 localhost systemd[1]: var-lib-containers-storage-overlay-012c48980a9a49a2a6f751e4c00244a56c89b8a830a1bd4128fb5ac4dbbd88af-merged.mount: Deactivated successfully. Nov 23 04:03:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa514075339ead5eb9649ab739e3d07bb2d7eaa7d251ca83591cab330efd15cd-userdata-shm.mount: Deactivated successfully. Nov 23 04:03:23 localhost systemd[1]: Reloading. Nov 23 04:03:23 localhost systemd-rc-local-generator[110149]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:03:23 localhost systemd-sysv-generator[110152]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:03:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:03:23 localhost systemd[1]: Stopping nova_virtproxyd container... Nov 23 04:03:23 localhost systemd[1]: libpod-108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08.scope: Deactivated successfully. Nov 23 04:03:23 localhost podman[110162]: 2025-11-23 09:03:23.426884281 +0000 UTC m=+0.070668848 container died 108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtproxyd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64) Nov 23 04:03:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50832 DF PROTO=TCP SPT=37980 DPT=9102 SEQ=2341202835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B784A00000000001030307) Nov 23 04:03:23 localhost podman[110162]: 2025-11-23 09:03:23.470077532 +0000 UTC m=+0.113862049 container cleanup 
108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, version=17.1.12, distribution-scope=public, container_name=nova_virtproxyd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt) Nov 23 04:03:23 localhost podman[110162]: nova_virtproxyd Nov 23 04:03:23 localhost podman[110176]: 2025-11-23 09:03:23.499658812 +0000 UTC m=+0.063745542 container cleanup 108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': 
['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., 
config_id=tripleo_step3, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, container_name=nova_virtproxyd, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Nov 23 04:03:23 localhost systemd[1]: libpod-conmon-108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08.scope: Deactivated successfully. Nov 23 04:03:23 localhost podman[110206]: error opening file `/run/crun/108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08/status`: No such file or directory Nov 23 04:03:23 localhost podman[110193]: 2025-11-23 09:03:23.582393512 +0000 UTC m=+0.057194187 container cleanup 108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, distribution-scope=public, container_name=nova_virtproxyd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 04:03:23 localhost podman[110193]: nova_virtproxyd Nov 23 04:03:23 localhost systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully. Nov 23 04:03:23 localhost systemd[1]: Stopped nova_virtproxyd container. Nov 23 04:03:23 localhost systemd[1]: var-lib-containers-storage-overlay-86f0ce13f8e1144b10a5187270100b5d709a634f0265cf8a09ce87bbb9ba0c0f-merged.mount: Deactivated successfully. Nov 23 04:03:23 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-108cc117ae846fff269db18426563547d516af8df2127d01b2007e7755cc0c08-userdata-shm.mount: Deactivated successfully. Nov 23 04:03:24 localhost python3.9[110299]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:03:24 localhost systemd[1]: Reloading. Nov 23 04:03:24 localhost systemd-rc-local-generator[110322]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:03:24 localhost systemd-sysv-generator[110327]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:03:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:03:24 localhost systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully. Nov 23 04:03:24 localhost systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m. 
Nov 23 04:03:24 localhost systemd[1]: Stopping nova_virtqemud container... Nov 23 04:03:24 localhost systemd[1]: libpod-80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8.scope: Deactivated successfully. Nov 23 04:03:24 localhost systemd[1]: libpod-80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8.scope: Consumed 2.641s CPU time. Nov 23 04:03:24 localhost podman[110339]: 2025-11-23 09:03:24.765347061 +0000 UTC m=+0.080623216 container died 80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=nova_virtqemud, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 23 04:03:24 localhost podman[110339]: 2025-11-23 09:03:24.805995234 +0000 UTC m=+0.121271349 container cleanup 80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, container_name=nova_virtqemud, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 04:03:24 localhost podman[110339]: nova_virtqemud Nov 23 04:03:24 localhost podman[110354]: 2025-11-23 09:03:24.85156151 +0000 UTC m=+0.072049320 container cleanup 80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, 
config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, build-date=2025-11-19T00:35:22Z, 
io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, container_name=nova_virtqemud, release=1761123044, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64) Nov 23 04:03:24 localhost systemd[1]: tmp-crun.AsmWbu.mount: Deactivated successfully. Nov 23 04:03:24 localhost systemd[1]: var-lib-containers-storage-overlay-b0cd03eaeeede57d51f2132c267210dfc5ad126abc25a071fbf14be0df873e9b-merged.mount: Deactivated successfully. Nov 23 04:03:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8-userdata-shm.mount: Deactivated successfully. 
Nov 23 04:03:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58455 DF PROTO=TCP SPT=47578 DPT=9882 SEQ=4215290881 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B794210000000001030307) Nov 23 04:03:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51931 DF PROTO=TCP SPT=36278 DPT=9101 SEQ=2790590587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B79D7B0000000001030307) Nov 23 04:03:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33943 DF PROTO=TCP SPT=58484 DPT=9105 SEQ=1586031484 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B79E2B0000000001030307) Nov 23 04:03:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51933 DF PROTO=TCP SPT=36278 DPT=9101 SEQ=2790590587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7A9A10000000001030307) Nov 23 04:03:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49387 DF PROTO=TCP SPT=46730 DPT=9100 SEQ=505397214 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7B6210000000001030307) Nov 23 04:03:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59506 DF PROTO=TCP SPT=49440 DPT=9100 SEQ=3861658067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2B7C2600000000001030307) Nov 23 04:03:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27534 DF PROTO=TCP SPT=58332 DPT=9882 SEQ=4196375563 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7D0A00000000001030307) Nov 23 04:03:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 04:03:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 04:03:44 localhost podman[110371]: 2025-11-23 09:03:44.537001994 +0000 UTC m=+0.091159133 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 04:03:44 localhost systemd[1]: tmp-crun.bwRefP.mount: Deactivated successfully. 
Nov 23 04:03:44 localhost podman[110372]: 2025-11-23 09:03:44.600066134 +0000 UTC m=+0.152228222 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 04:03:44 localhost podman[110371]: 2025-11-23 09:03:44.603954684 +0000 UTC m=+0.158111863 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Nov 23 04:03:44 localhost podman[110371]: unhealthy Nov 23 04:03:44 localhost podman[110372]: 2025-11-23 09:03:44.612786489 +0000 UTC m=+0.164948587 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team) Nov 23 04:03:44 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:03:44 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. 
Nov 23 04:03:44 localhost podman[110372]: unhealthy Nov 23 04:03:44 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:03:44 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 04:03:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27535 DF PROTO=TCP SPT=58332 DPT=9882 SEQ=4196375563 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7D8A00000000001030307) Nov 23 04:03:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50834 DF PROTO=TCP SPT=37980 DPT=9102 SEQ=2341202835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7E4210000000001030307) Nov 23 04:03:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37698 DF PROTO=TCP SPT=43254 DPT=9102 SEQ=114999264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B7F9A00000000001030307) Nov 23 04:03:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27537 DF PROTO=TCP SPT=58332 DPT=9882 SEQ=4196375563 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B808200000000001030307) Nov 23 04:03:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52983 DF PROTO=TCP SPT=33798 DPT=9101 SEQ=4110551325 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2B812A90000000001030307) Nov 23 04:03:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16327 DF PROTO=TCP SPT=36902 DPT=9105 SEQ=1191719041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8135B0000000001030307) Nov 23 04:04:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16329 DF PROTO=TCP SPT=36902 DPT=9105 SEQ=1191719041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B81F610000000001030307) Nov 23 04:04:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23467 DF PROTO=TCP SPT=57104 DPT=9100 SEQ=3587821903 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B82C200000000001030307) Nov 23 04:04:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5977 DF PROTO=TCP SPT=51382 DPT=9100 SEQ=1610855476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B837A10000000001030307) Nov 23 04:04:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32097 DF PROTO=TCP SPT=48962 DPT=9882 SEQ=3463408620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B845E10000000001030307) Nov 23 04:04:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. 
Nov 23 04:04:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 04:04:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32098 DF PROTO=TCP SPT=48962 DPT=9882 SEQ=3463408620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B84DE00000000001030307) Nov 23 04:04:15 localhost systemd[1]: tmp-crun.Nh8WFz.mount: Deactivated successfully. Nov 23 04:04:15 localhost podman[110487]: 2025-11-23 09:04:15.036717606 +0000 UTC m=+0.092474075 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public) Nov 23 04:04:15 localhost podman[110487]: 2025-11-23 09:04:15.051246517 +0000 UTC m=+0.107002946 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, description=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 23 04:04:15 localhost podman[110487]: unhealthy Nov 23 04:04:15 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:04:15 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 04:04:15 localhost podman[110488]: 2025-11-23 09:04:15.132859343 +0000 UTC m=+0.187507587 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, build-date=2025-11-18T23:34:05Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, container_name=ovn_controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4) Nov 23 04:04:15 localhost podman[110488]: 2025-11-23 09:04:15.176288302 +0000 UTC m=+0.230936536 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, version=17.1.12, container_name=ovn_controller) Nov 23 04:04:15 localhost podman[110488]: unhealthy Nov 23 04:04:15 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:04:15 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. 
Nov 23 04:04:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37700 DF PROTO=TCP SPT=43254 DPT=9102 SEQ=114999264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B85A210000000001030307) Nov 23 04:04:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12188 DF PROTO=TCP SPT=56828 DPT=9102 SEQ=235745335 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B86EE00000000001030307) Nov 23 04:04:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32100 DF PROTO=TCP SPT=48962 DPT=9882 SEQ=3463408620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B87E200000000001030307) Nov 23 04:04:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62713 DF PROTO=TCP SPT=57822 DPT=9101 SEQ=3255914638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B887DA0000000001030307) Nov 23 04:04:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45979 DF PROTO=TCP SPT=33364 DPT=9105 SEQ=1400042007 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8888E0000000001030307) Nov 23 04:04:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62715 DF PROTO=TCP SPT=57822 DPT=9101 SEQ=3255914638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2B893E00000000001030307) Nov 23 04:04:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59509 DF PROTO=TCP SPT=49440 DPT=9100 SEQ=3861658067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8A0200000000001030307) Nov 23 04:04:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39162 DF PROTO=TCP SPT=60912 DPT=9100 SEQ=4066548656 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8ACE10000000001030307) Nov 23 04:04:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18765 DF PROTO=TCP SPT=51360 DPT=9882 SEQ=4264641768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8BB200000000001030307) Nov 23 04:04:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18766 DF PROTO=TCP SPT=51360 DPT=9882 SEQ=4264641768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8C3200000000001030307) Nov 23 04:04:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 04:04:45 localhost systemd[1]: tmp-crun.AF0ZrZ.mount: Deactivated successfully. 
Nov 23 04:04:45 localhost podman[110526]: 2025-11-23 09:04:45.267648882 +0000 UTC m=+0.077004124 container health_status 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:14:25Z, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 04:04:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. 
Nov 23 04:04:45 localhost podman[110526]: 2025-11-23 09:04:45.306980994 +0000 UTC m=+0.116336236 container exec_died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public) Nov 23 04:04:45 localhost podman[110526]: unhealthy Nov 23 04:04:45 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:04:45 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed with result 'exit-code'. Nov 23 04:04:45 localhost systemd[1]: tmp-crun.Gf8B7r.mount: Deactivated successfully. 
Nov 23 04:04:45 localhost podman[110545]: 2025-11-23 09:04:45.380974553 +0000 UTC m=+0.087839880 container health_status 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 23 04:04:45 localhost podman[110545]: 2025-11-23 09:04:45.396163976 +0000 UTC m=+0.103029263 container exec_died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, distribution-scope=public, vcs-type=git) Nov 23 04:04:45 localhost podman[110545]: unhealthy Nov 23 04:04:45 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:04:45 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed with result 'exit-code'. Nov 23 04:04:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12190 DF PROTO=TCP SPT=56828 DPT=9102 SEQ=235745335 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8D0200000000001030307) Nov 23 04:04:48 localhost systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing. Nov 23 04:04:48 localhost systemd[1]: tripleo_nova_virtqemud.service: Killing process 61752 (conmon) with signal SIGKILL. Nov 23 04:04:48 localhost systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL Nov 23 04:04:48 localhost systemd[1]: libpod-conmon-80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8.scope: Deactivated successfully. Nov 23 04:04:48 localhost podman[110624]: error opening file `/run/crun/80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8/status`: No such file or directory Nov 23 04:04:48 localhost systemd[1]: tmp-crun.cgSfKs.mount: Deactivated successfully. 
Nov 23 04:04:48 localhost podman[110611]: 2025-11-23 09:04:48.915718349 +0000 UTC m=+0.066325862 container cleanup 80e6b661c7c3dce8a0f643cfae7f0c6bad238eaad40a6059397b00c72f7835b8 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, container_name=nova_virtqemud, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, io.openshift.expose-services=) Nov 23 04:04:48 localhost podman[110611]: nova_virtqemud Nov 23 04:04:48 localhost systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'. Nov 23 04:04:48 localhost systemd[1]: Stopped nova_virtqemud container. Nov 23 04:04:49 localhost python3.9[110747]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:04:49 localhost systemd[1]: Reloading. 
Nov 23 04:04:49 localhost systemd-rc-local-generator[110806]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:04:49 localhost systemd-sysv-generator[110809]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:04:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:04:50 localhost python3.9[110924]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:04:50 localhost systemd[1]: Reloading. Nov 23 04:04:50 localhost systemd-sysv-generator[110956]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:04:50 localhost systemd-rc-local-generator[110951]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:04:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:04:51 localhost systemd[1]: Stopping nova_virtsecretd container... Nov 23 04:04:51 localhost systemd[1]: tmp-crun.ECVk4b.mount: Deactivated successfully. Nov 23 04:04:51 localhost systemd[1]: libpod-a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03.scope: Deactivated successfully. 
Nov 23 04:04:51 localhost podman[110965]: 2025-11-23 09:04:51.29303389 +0000 UTC m=+0.091205125 container died a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, container_name=nova_virtsecretd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 04:04:51 localhost podman[110965]: 2025-11-23 09:04:51.330417011 +0000 UTC m=+0.128588196 container cleanup a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtsecretd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team) Nov 23 04:04:51 localhost podman[110965]: nova_virtsecretd Nov 23 04:04:51 localhost podman[110978]: 2025-11-23 09:04:51.369562338 +0000 UTC m=+0.066429226 container cleanup a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, container_name=nova_virtsecretd, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 23 04:04:51 localhost systemd[1]: libpod-conmon-a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03.scope: Deactivated successfully. Nov 23 04:04:51 localhost podman[111009]: error opening file `/run/crun/a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03/status`: No such file or directory Nov 23 04:04:51 localhost podman[110997]: 2025-11-23 09:04:51.456401136 +0000 UTC m=+0.056760914 container cleanup a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, container_name=nova_virtsecretd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=) Nov 23 04:04:51 localhost podman[110997]: nova_virtsecretd Nov 23 04:04:51 localhost systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully. Nov 23 04:04:51 localhost systemd[1]: Stopped nova_virtsecretd container. 
Nov 23 04:04:52 localhost python3.9[111102]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:04:52 localhost systemd[1]: var-lib-containers-storage-overlay-1c731908b6b1b73a8dce8f968a95e48085bbe8c6a8e98c6914fe1adabda124bb-merged.mount: Deactivated successfully. Nov 23 04:04:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a38c0f668815c18de1f5d1022132d34515ca6abeef2e4fb424d46b6093fc9d03-userdata-shm.mount: Deactivated successfully. Nov 23 04:04:52 localhost systemd[1]: Reloading. Nov 23 04:04:52 localhost systemd-rc-local-generator[111125]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:04:52 localhost systemd-sysv-generator[111129]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:04:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:04:52 localhost systemd[1]: Stopping nova_virtstoraged container... Nov 23 04:04:52 localhost systemd[1]: libpod-33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a.scope: Deactivated successfully. 
Nov 23 04:04:52 localhost podman[111143]: 2025-11-23 09:04:52.687439039 +0000 UTC m=+0.075707794 container died 33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=nova_virtstoraged, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git) Nov 23 04:04:52 localhost podman[111143]: 2025-11-23 09:04:52.72094689 +0000 UTC m=+0.109215605 container cleanup 33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, 
config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, io.buildah.version=1.41.4, container_name=nova_virtstoraged, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 04:04:52 localhost podman[111143]: nova_virtstoraged Nov 23 04:04:52 localhost podman[111158]: 2025-11-23 09:04:52.766588838 +0000 UTC m=+0.068536831 container cleanup 33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 
'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Nov 23 04:04:52 localhost systemd[1]: libpod-conmon-33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a.scope: Deactivated successfully. Nov 23 04:04:52 localhost podman[111186]: error opening file `/run/crun/33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a/status`: No such file or directory Nov 23 04:04:52 localhost podman[111175]: 2025-11-23 09:04:52.846413028 +0000 UTC m=+0.048025253 container cleanup 33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtstoraged, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '39370c45b6a27bfda1ebe1fb9d328c43'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': 
['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container) Nov 23 04:04:52 localhost podman[111175]: nova_virtstoraged Nov 23 04:04:52 localhost systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully. Nov 23 04:04:52 localhost systemd[1]: Stopped nova_virtstoraged container. 
Nov 23 04:04:53 localhost systemd[1]: var-lib-containers-storage-overlay-28adb83c43b938ee318ecfcf4c823434bcec89ec4cf241474165a3b8eaf18e89-merged.mount: Deactivated successfully. Nov 23 04:04:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-33c8f3ea1abcb3098a5b2509a196bff840d175f289416b71adbf39bf3ee9b67a-userdata-shm.mount: Deactivated successfully. Nov 23 04:04:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46878 DF PROTO=TCP SPT=48896 DPT=9102 SEQ=188186518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8E4200000000001030307) Nov 23 04:04:53 localhost python3.9[111280]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:04:53 localhost systemd[1]: Reloading. Nov 23 04:04:53 localhost systemd-rc-local-generator[111306]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:04:53 localhost systemd-sysv-generator[111312]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:04:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:04:53 localhost systemd[1]: Stopping ovn_controller container... Nov 23 04:04:54 localhost systemd[1]: libpod-99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.scope: Deactivated successfully. Nov 23 04:04:54 localhost systemd[1]: libpod-99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.scope: Consumed 2.507s CPU time. 
Nov 23 04:04:54 localhost podman[111321]: 2025-11-23 09:04:54.02305407 +0000 UTC m=+0.076316572 container died 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller) Nov 23 04:04:54 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.timer: Deactivated successfully. Nov 23 04:04:54 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23. Nov 23 04:04:54 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed to open /run/systemd/transient/99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: No such file or directory Nov 23 04:04:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23-userdata-shm.mount: Deactivated successfully. Nov 23 04:04:54 localhost podman[111321]: 2025-11-23 09:04:54.062346571 +0000 UTC m=+0.115609033 container cleanup 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, container_name=ovn_controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc.) 
Nov 23 04:04:54 localhost podman[111321]: ovn_controller Nov 23 04:04:54 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.timer: Failed to open /run/systemd/transient/99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.timer: No such file or directory Nov 23 04:04:54 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed to open /run/systemd/transient/99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: No such file or directory Nov 23 04:04:54 localhost podman[111335]: 2025-11-23 09:04:54.099563108 +0000 UTC m=+0.065057923 container cleanup 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 
6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true) Nov 23 04:04:54 localhost systemd[1]: libpod-conmon-99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.scope: Deactivated successfully. Nov 23 04:04:54 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.timer: Failed to open /run/systemd/transient/99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.timer: No such file or directory Nov 23 04:04:54 localhost systemd[1]: 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: Failed to open /run/systemd/transient/99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23.service: No such file or directory Nov 23 04:04:54 localhost podman[111349]: 2025-11-23 09:04:54.188849092 +0000 UTC m=+0.049625592 container cleanup 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Nov 23 04:04:54 localhost podman[111349]: ovn_controller Nov 23 04:04:54 localhost systemd[1]: tripleo_ovn_controller.service: Deactivated successfully. Nov 23 04:04:54 localhost systemd[1]: Stopped ovn_controller container. Nov 23 04:04:54 localhost systemd[1]: var-lib-containers-storage-overlay-c0462c731844bb59d1ec529f77837ece08511a3108ad760cbddff4f0512d4199-merged.mount: Deactivated successfully. 
Nov 23 04:04:54 localhost python3.9[111453]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:04:54 localhost systemd[1]: Reloading. Nov 23 04:04:55 localhost systemd-rc-local-generator[111476]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:04:55 localhost systemd-sysv-generator[111479]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:04:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:04:55 localhost systemd[1]: Stopping ovn_metadata_agent container... Nov 23 04:04:56 localhost systemd[1]: libpod-5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.scope: Deactivated successfully. Nov 23 04:04:56 localhost systemd[1]: libpod-5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.scope: Consumed 10.886s CPU time. 
Nov 23 04:04:56 localhost podman[111494]: 2025-11-23 09:04:56.018530037 +0000 UTC m=+0.732985618 container died 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 04:04:56 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.timer: Deactivated successfully. Nov 23 04:04:56 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f. Nov 23 04:04:56 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed to open /run/systemd/transient/5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: No such file or directory Nov 23 04:04:56 localhost systemd[1]: tmp-crun.OfKjgQ.mount: Deactivated successfully. 
Nov 23 04:04:56 localhost podman[111494]: 2025-11-23 09:04:56.09332991 +0000 UTC m=+0.807785441 container cleanup 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 23 04:04:56 localhost podman[111494]: ovn_metadata_agent Nov 23 04:04:56 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.timer: Failed to open /run/systemd/transient/5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.timer: No such file or directory Nov 23 04:04:56 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed to open /run/systemd/transient/5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: No such file or directory Nov 23 04:04:56 localhost podman[111506]: 2025-11-23 09:04:56.152775508 +0000 UTC m=+0.124184730 container cleanup 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, architecture=x86_64, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, config_id=tripleo_step4) Nov 23 04:04:56 localhost systemd[1]: var-lib-containers-storage-overlay-5531cd55c1eaaae58642503ef766c6bc4c165d2df9c8d9a0b2f16cdd36d9c0e9-merged.mount: Deactivated successfully. Nov 23 04:04:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f-userdata-shm.mount: Deactivated successfully. 
Nov 23 04:04:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18768 DF PROTO=TCP SPT=51360 DPT=9882 SEQ=4264641768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8F4200000000001030307) Nov 23 04:04:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45133 DF PROTO=TCP SPT=59240 DPT=9101 SEQ=2782383458 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8FD0B0000000001030307) Nov 23 04:04:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46737 DF PROTO=TCP SPT=60328 DPT=9105 SEQ=1388543733 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B8FDBB0000000001030307) Nov 23 04:05:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45135 DF PROTO=TCP SPT=59240 DPT=9101 SEQ=2782383458 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B909210000000001030307) Nov 23 04:05:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5980 DF PROTO=TCP SPT=51382 DPT=9100 SEQ=1610855476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B916210000000001030307) Nov 23 04:05:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8883 DF PROTO=TCP SPT=44354 DPT=9100 SEQ=438120719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2B921E10000000001030307) Nov 23 04:05:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49268 DF PROTO=TCP SPT=43182 DPT=9882 SEQ=4264056996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B930200000000001030307) Nov 23 04:05:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49269 DF PROTO=TCP SPT=43182 DPT=9882 SEQ=4264056996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B938210000000001030307) Nov 23 04:05:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46880 DF PROTO=TCP SPT=48896 DPT=9102 SEQ=188186518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B944200000000001030307) Nov 23 04:05:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36866 DF PROTO=TCP SPT=40230 DPT=9102 SEQ=3651957829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B959600000000001030307) Nov 23 04:05:24 localhost sshd[111524]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:05:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49271 DF PROTO=TCP SPT=43182 DPT=9882 SEQ=4264056996 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B968200000000001030307) Nov 23 04:05:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=19635 DF PROTO=TCP SPT=43478 DPT=9101 SEQ=3238666304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9723A0000000001030307) Nov 23 04:05:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31075 DF PROTO=TCP SPT=50952 DPT=9105 SEQ=3111224148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B972EB0000000001030307) Nov 23 04:05:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19637 DF PROTO=TCP SPT=43478 DPT=9101 SEQ=3238666304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B97E600000000001030307) Nov 23 04:05:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39165 DF PROTO=TCP SPT=60912 DPT=9100 SEQ=4066548656 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B98C200000000001030307) Nov 23 04:05:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42724 DF PROTO=TCP SPT=51646 DPT=9100 SEQ=1282188036 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B997200000000001030307) Nov 23 04:05:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31170 DF PROTO=TCP SPT=35550 DPT=9882 SEQ=811106714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9A5600000000001030307) Nov 23 04:05:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 
PREC=0x00 TTL=62 ID=31171 DF PROTO=TCP SPT=35550 DPT=9882 SEQ=811106714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9AD600000000001030307) Nov 23 04:05:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36868 DF PROTO=TCP SPT=40230 DPT=9102 SEQ=3651957829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9BA210000000001030307) Nov 23 04:05:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56007 DF PROTO=TCP SPT=46236 DPT=9102 SEQ=420763077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9CE600000000001030307) Nov 23 04:05:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31173 DF PROTO=TCP SPT=35550 DPT=9882 SEQ=811106714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9DE200000000001030307) Nov 23 04:05:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47725 DF PROTO=TCP SPT=35522 DPT=9101 SEQ=2861703325 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9E76A0000000001030307) Nov 23 04:05:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8950 DF PROTO=TCP SPT=50774 DPT=9105 SEQ=3687115265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9E81B0000000001030307) Nov 23 04:06:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 
LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8952 DF PROTO=TCP SPT=50774 DPT=9105 SEQ=3687115265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2B9F4200000000001030307) Nov 23 04:06:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8886 DF PROTO=TCP SPT=44354 DPT=9100 SEQ=438120719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA00210000000001030307) Nov 23 04:06:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14469 DF PROTO=TCP SPT=56816 DPT=9100 SEQ=2930898796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA0C600000000001030307) Nov 23 04:06:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32143 DF PROTO=TCP SPT=60372 DPT=9882 SEQ=1472291617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA1AA10000000001030307) Nov 23 04:06:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32144 DF PROTO=TCP SPT=60372 DPT=9882 SEQ=1472291617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA22A10000000001030307) Nov 23 04:06:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56009 DF PROTO=TCP SPT=46236 DPT=9102 SEQ=420763077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA2E200000000001030307) Nov 23 04:06:20 localhost systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing. 
Nov 23 04:06:20 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 69516 (conmon) with signal SIGKILL.
Nov 23 04:06:20 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL
Nov 23 04:06:20 localhost systemd[1]: libpod-conmon-5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.scope: Deactivated successfully.
Nov 23 04:06:20 localhost podman[111615]: error opening file `/run/crun/5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f/status`: No such file or directory
Nov 23 04:06:20 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.timer: Failed to open /run/systemd/transient/5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.timer: No such file or directory
Nov 23 04:06:20 localhost systemd[1]: 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: Failed to open /run/systemd/transient/5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f.service: No such file or directory
Nov 23 04:06:20 localhost podman[111602]: 2025-11-23 09:06:20.279561946 +0000 UTC m=+0.085577426 container cleanup 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Nov 23 04:06:20 localhost podman[111602]: ovn_metadata_agent
Nov 23 04:06:20 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'.
Nov 23 04:06:20 localhost systemd[1]: Stopped ovn_metadata_agent container.
Nov 23 04:06:20 localhost python3.9[111710]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 04:06:21 localhost systemd[1]: Reloading.
Nov 23 04:06:21 localhost systemd-sysv-generator[111740]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 04:06:21 localhost systemd-rc-local-generator[111736]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 04:06:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 04:06:22 localhost python3.9[111840]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4660 DF PROTO=TCP SPT=60044 DPT=9102 SEQ=3991955285 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA43A00000000001030307)
Nov 23 04:06:23 localhost python3.9[111932]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:24 localhost python3.9[112024]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:24 localhost python3.9[112116]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:25 localhost python3.9[112208]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:25 localhost python3.9[112300]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:26 localhost python3.9[112392]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32146 DF PROTO=TCP SPT=60372 DPT=9882 SEQ=1472291617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA52210000000001030307)
Nov 23 04:06:27 localhost python3.9[112484]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:27 localhost python3.9[112576]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:28 localhost python3.9[112668]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:28 localhost sshd[112728]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 04:06:28 localhost python3.9[112762]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:29 localhost python3.9[112854]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27570 DF PROTO=TCP SPT=58632 DPT=9101 SEQ=1385897793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA5C9B0000000001030307)
Nov 23 04:06:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45440 DF PROTO=TCP SPT=58254 DPT=9105 SEQ=3375843354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA5D4B0000000001030307)
Nov 23 04:06:30 localhost python3.9[112946]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:30 localhost python3.9[113038]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:31 localhost python3.9[113130]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:32 localhost python3.9[113222]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:32 localhost python3.9[113314]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27572 DF PROTO=TCP SPT=58632 DPT=9101 SEQ=1385897793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA68A10000000001030307)
Nov 23 04:06:33 localhost python3.9[113406]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:33 localhost python3.9[113498]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:34 localhost python3.9[113590]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:34 localhost python3.9[113682]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:36 localhost python3.9[113774]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42727 DF PROTO=TCP SPT=51646 DPT=9100 SEQ=1282188036 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA76200000000001030307)
Nov 23 04:06:36 localhost python3.9[113866]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:37 localhost python3.9[113958]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:38 localhost python3.9[114050]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:38 localhost python3.9[114142]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:39 localhost python3.9[114234]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42173 DF PROTO=TCP SPT=36532 DPT=9100 SEQ=1382933724 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA81A00000000001030307)
Nov 23 04:06:39 localhost python3.9[114326]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:40 localhost python3.9[114418]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:40 localhost python3.9[114510]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:41 localhost python3.9[114602]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:42 localhost python3.9[114694]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2509 DF PROTO=TCP SPT=44010 DPT=9882 SEQ=4072214415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA8FE00000000001030307)
Nov 23 04:06:42 localhost python3.9[114786]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:43 localhost python3.9[114878]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:44 localhost python3.9[114970]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:44 localhost python3.9[115062]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2510 DF PROTO=TCP SPT=44010 DPT=9882 SEQ=4072214415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BA97E10000000001030307)
Nov 23 04:06:45 localhost python3.9[115154]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:45 localhost python3.9[115246]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:46 localhost python3.9[115338]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:46 localhost python3.9[115430]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:47 localhost python3.9[115522]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4662 DF PROTO=TCP SPT=60044 DPT=9102 SEQ=3991955285 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BAA4210000000001030307)
Nov 23 04:06:48 localhost python3.9[115614]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:06:50 localhost python3.9[115706]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:06:51 localhost python3.9[115798]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 04:06:52 localhost python3.9[115890]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 04:06:52 localhost systemd[1]: Reloading.
Nov 23 04:06:52 localhost systemd-rc-local-generator[115942]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 04:06:52 localhost systemd-sysv-generator[115946]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 04:06:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 04:06:52 localhost systemd[1]: tmp-crun.Ptevch.mount: Deactivated successfully.
Nov 23 04:06:52 localhost podman[116066]: 2025-11-23 09:06:52.885326377 +0000 UTC m=+0.104343827 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, distribution-scope=public, release=553, ceph=True, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, maintainer=Guillaume Abrioux , name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main)
Nov 23 04:06:53 localhost podman[116066]: 2025-11-23 09:06:53.00630687 +0000 UTC m=+0.225324380 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, architecture=x86_64, RELEASE=main)
Nov 23 04:06:53 localhost python3.9[116154]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:06:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54890 DF PROTO=TCP SPT=54038 DPT=9102 SEQ=2193620370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BAB8E00000000001030307)
Nov 23 04:06:53 localhost python3.9[116308]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:06:54 localhost python3.9[116434]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:06:55 localhost python3.9[116542]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:06:55 localhost python3.9[116635]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:06:56 localhost python3.9[116728]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:06:56 localhost python3.9[116821]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:06:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2512 DF PROTO=TCP SPT=44010 DPT=9882 SEQ=4072214415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BAC8200000000001030307)
Nov 23 04:06:57 localhost python3.9[116914]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:06:57 localhost python3.9[117007]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:06:58 localhost python3.9[117100]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:06:59 localhost python3.9[117193]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:06:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60681 DF PROTO=TCP SPT=46284 DPT=9101 SEQ=4194018284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BAD1C90000000001030307)
Nov 23 04:06:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8316 DF PROTO=TCP SPT=39240 DPT=9105 SEQ=2249610514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BAD27B0000000001030307)
Nov 23 04:07:00 localhost python3.9[117286]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None
creates=None removes=None stdin=None Nov 23 04:07:01 localhost python3.9[117379]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:07:02 localhost python3.9[117472]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:07:02 localhost python3.9[117565]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:07:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60683 DF PROTO=TCP SPT=46284 DPT=9101 SEQ=4194018284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BADDE10000000001030307) Nov 23 04:07:03 localhost python3.9[117658]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:07:05 localhost python3.9[117751]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None 
argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:07:05 localhost python3.9[117844]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:07:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14472 DF PROTO=TCP SPT=56816 DPT=9100 SEQ=2930898796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BAEA200000000001030307) Nov 23 04:07:06 localhost python3.9[117937]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:07:06 localhost python3.9[118030]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:07:07 localhost python3.9[118123]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:07:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38609 DF PROTO=TCP SPT=59828 DPT=9100 
SEQ=3921597297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BAF6A10000000001030307) Nov 23 04:07:11 localhost systemd[1]: session-37.scope: Deactivated successfully. Nov 23 04:07:11 localhost systemd[1]: session-37.scope: Consumed 46.941s CPU time. Nov 23 04:07:11 localhost systemd-logind[761]: Session 37 logged out. Waiting for processes to exit. Nov 23 04:07:11 localhost systemd-logind[761]: Removed session 37. Nov 23 04:07:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19102 DF PROTO=TCP SPT=39486 DPT=9882 SEQ=1997658962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB04E00000000001030307) Nov 23 04:07:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19103 DF PROTO=TCP SPT=39486 DPT=9882 SEQ=1997658962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB0CE10000000001030307) Nov 23 04:07:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54892 DF PROTO=TCP SPT=54038 DPT=9102 SEQ=2193620370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB1A210000000001030307) Nov 23 04:07:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63455 DF PROTO=TCP SPT=56026 DPT=9102 SEQ=2178988037 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB2E200000000001030307) Nov 23 04:07:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19105 DF PROTO=TCP 
SPT=39486 DPT=9882 SEQ=1997658962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB3C200000000001030307) Nov 23 04:07:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63821 DF PROTO=TCP SPT=59042 DPT=9101 SEQ=1271651755 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB46F90000000001030307) Nov 23 04:07:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27142 DF PROTO=TCP SPT=59458 DPT=9105 SEQ=1292689378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB47AB0000000001030307) Nov 23 04:07:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63823 DF PROTO=TCP SPT=59042 DPT=9101 SEQ=1271651755 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB53200000000001030307) Nov 23 04:07:36 localhost sshd[118140]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:07:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42176 DF PROTO=TCP SPT=36532 DPT=9100 SEQ=1382933724 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB60200000000001030307) Nov 23 04:07:36 localhost systemd-logind[761]: New session 38 of user zuul. Nov 23 04:07:36 localhost systemd[1]: Started Session 38 of User zuul. 
Nov 23 04:07:37 localhost python3.9[118233]: ansible-ansible.legacy.ping Invoked with data=pong Nov 23 04:07:38 localhost python3.9[118337]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:07:39 localhost python3.9[118429]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:07:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54385 DF PROTO=TCP SPT=48914 DPT=9100 SEQ=988562431 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB6BE10000000001030307) Nov 23 04:07:40 localhost python3.9[118522]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:07:40 localhost python3.9[118614]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:07:41 localhost python3.9[118706]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:07:42 localhost python3.9[118779]: ansible-ansible.legacy.copy Invoked with 
dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763888861.1344411-178-164186714208497/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:07:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15774 DF PROTO=TCP SPT=48214 DPT=9882 SEQ=851862344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB7A200000000001030307) Nov 23 04:07:43 localhost python3.9[118871]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:07:43 localhost python3.9[118967]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:07:44 localhost python3.9[119059]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:07:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 
MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15775 DF PROTO=TCP SPT=48214 DPT=9882 SEQ=851862344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB82200000000001030307) Nov 23 04:07:45 localhost python3.9[119149]: ansible-ansible.builtin.service_facts Invoked Nov 23 04:07:45 localhost network[119166]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 23 04:07:45 localhost network[119167]: 'network-scripts' will be removed from distribution in near future. Nov 23 04:07:45 localhost network[119168]: It is advised to switch to 'NetworkManager' instead for network management. Nov 23 04:07:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:07:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63457 DF PROTO=TCP SPT=56026 DPT=9102 SEQ=2178988037 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BB8E200000000001030307) Nov 23 04:07:49 localhost python3.9[119365]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:07:50 localhost python3.9[119455]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:07:51 localhost python3.9[119551]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as 
if it were Centos 9 Stream#012set -euxo pipefail#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main#012# This is required for FIPS enabled until trunk.rdoproject.org#012# is not being served from a centos7 host, tracked by#012# https://issues.redhat.com/browse/RHOSZUUL-1517#012dnf -y install crypto-policies#012update-crypto-policies --set FIPS:NO-ENFORCE-EMS#012./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream#012#012# Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible#012# with rhel 9.2 openssh#012dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save#012# FIXME: perform dnf upgrade for other packages in EDPM ansible#012# here we only ensuring that decontainerized libvirt can start#012dnf -y upgrade openstack-selinux#012rm -f /run/virtlogd.pid#012#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:07:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16845 DF PROTO=TCP SPT=45158 DPT=9102 SEQ=3372202955 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBA3200000000001030307) Nov 23 04:07:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15777 DF PROTO=TCP SPT=48214 DPT=9882 SEQ=851862344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBB2210000000001030307) Nov 23 04:07:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2105 DF PROTO=TCP SPT=33312 DPT=9101 SEQ=340619041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBBC2A0000000001030307) Nov 23 04:07:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32588 DF PROTO=TCP SPT=51366 DPT=9105 SEQ=1368777817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBBCDB0000000001030307) Nov 23 04:08:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32590 DF PROTO=TCP SPT=51366 DPT=9105 SEQ=1368777817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBC8E00000000001030307) Nov 23 04:08:05 localhost systemd[1]: Stopping OpenSSH server daemon... Nov 23 04:08:05 localhost systemd[1]: sshd.service: Deactivated successfully. Nov 23 04:08:05 localhost systemd[1]: Stopped OpenSSH server daemon. Nov 23 04:08:05 localhost systemd[1]: sshd.service: Consumed 1.614s CPU time. Nov 23 04:08:05 localhost systemd[1]: Stopped target sshd-keygen.target. Nov 23 04:08:05 localhost systemd[1]: Stopping sshd-keygen.target... Nov 23 04:08:05 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 23 04:08:05 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 23 04:08:05 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). 
Nov 23 04:08:05 localhost systemd[1]: Reached target sshd-keygen.target. Nov 23 04:08:05 localhost systemd[1]: Starting OpenSSH server daemon... Nov 23 04:08:05 localhost sshd[119670]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:08:05 localhost systemd[1]: Started OpenSSH server daemon. Nov 23 04:08:05 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 23 04:08:05 localhost systemd[1]: Starting man-db-cache-update.service... Nov 23 04:08:05 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 23 04:08:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38612 DF PROTO=TCP SPT=59828 DPT=9100 SEQ=3921597297 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBD4200000000001030307) Nov 23 04:08:06 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 23 04:08:06 localhost systemd[1]: Finished man-db-cache-update.service. Nov 23 04:08:06 localhost systemd[1]: run-r4658003025a440c5865624621cf2eda4.service: Deactivated successfully. Nov 23 04:08:06 localhost systemd[1]: run-r3d6d1e89840d494eac6a4fe4b9820ff2.service: Deactivated successfully. Nov 23 04:08:06 localhost systemd[1]: Stopping OpenSSH server daemon... Nov 23 04:08:06 localhost systemd[1]: sshd.service: Deactivated successfully. Nov 23 04:08:06 localhost systemd[1]: Stopped OpenSSH server daemon. Nov 23 04:08:06 localhost systemd[1]: Stopped target sshd-keygen.target. Nov 23 04:08:06 localhost systemd[1]: Stopping sshd-keygen.target... Nov 23 04:08:06 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). 
Nov 23 04:08:06 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 23 04:08:06 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 23 04:08:06 localhost systemd[1]: Reached target sshd-keygen.target. Nov 23 04:08:06 localhost systemd[1]: Starting OpenSSH server daemon... Nov 23 04:08:06 localhost sshd[120095]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:08:06 localhost systemd[1]: Started OpenSSH server daemon. Nov 23 04:08:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59732 DF PROTO=TCP SPT=51994 DPT=9100 SEQ=4145928038 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBE1200000000001030307) Nov 23 04:08:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42355 DF PROTO=TCP SPT=41954 DPT=9882 SEQ=1328253083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBEB570000000001030307) Nov 23 04:08:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42357 DF PROTO=TCP SPT=41954 DPT=9882 SEQ=1328253083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BBF7600000000001030307) Nov 23 04:08:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16847 DF PROTO=TCP SPT=45158 DPT=9102 SEQ=3372202955 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC04200000000001030307) Nov 23 04:08:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35865 DF PROTO=TCP SPT=43132 DPT=9102 SEQ=3585187217 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC18600000000001030307) Nov 23 04:08:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42359 DF PROTO=TCP SPT=41954 DPT=9882 SEQ=1328253083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC28200000000001030307) Nov 23 04:08:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8146 DF PROTO=TCP SPT=45462 DPT=9101 SEQ=3146966238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC315A0000000001030307) Nov 23 04:08:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25335 DF PROTO=TCP SPT=50892 DPT=9105 SEQ=2536592468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC320B0000000001030307) Nov 23 04:08:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8148 DF PROTO=TCP SPT=45462 DPT=9101 SEQ=3146966238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC3D600000000001030307) Nov 23 04:08:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54388 DF PROTO=TCP SPT=48914 DPT=9100 
SEQ=988562431 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC4A200000000001030307) Nov 23 04:08:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52397 DF PROTO=TCP SPT=37434 DPT=9100 SEQ=1444354613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC56600000000001030307) Nov 23 04:08:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19836 DF PROTO=TCP SPT=38490 DPT=9882 SEQ=3546743551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC60880000000001030307) Nov 23 04:08:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19838 DF PROTO=TCP SPT=38490 DPT=9882 SEQ=3546743551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC6CA00000000001030307) Nov 23 04:08:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35867 DF PROTO=TCP SPT=43132 DPT=9102 SEQ=3585187217 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC78200000000001030307) Nov 23 04:08:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45553 DF PROTO=TCP SPT=44470 DPT=9102 SEQ=2349228216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC8DA00000000001030307) Nov 23 04:08:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19840 DF PROTO=TCP 
SPT=38490 DPT=9882 SEQ=3546743551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BC9C210000000001030307) Nov 23 04:08:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64365 DF PROTO=TCP SPT=53466 DPT=9101 SEQ=3268082378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCA68A0000000001030307) Nov 23 04:08:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40909 DF PROTO=TCP SPT=33646 DPT=9105 SEQ=3732032219 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCA73B0000000001030307) Nov 23 04:09:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64367 DF PROTO=TCP SPT=53466 DPT=9101 SEQ=3268082378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCB2A10000000001030307) Nov 23 04:09:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59735 DF PROTO=TCP SPT=51994 DPT=9100 SEQ=4145928038 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCC0210000000001030307) Nov 23 04:09:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55183 DF PROTO=TCP SPT=40146 DPT=9100 SEQ=3954503623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCCB610000000001030307) Nov 23 04:09:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=18869 DF PROTO=TCP SPT=43202 DPT=9882 SEQ=4203088774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCD5B70000000001030307) Nov 23 04:09:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18871 DF PROTO=TCP SPT=43202 DPT=9882 SEQ=4203088774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCE1A10000000001030307) Nov 23 04:09:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45555 DF PROTO=TCP SPT=44470 DPT=9102 SEQ=2349228216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BCEE210000000001030307) Nov 23 04:09:18 localhost kernel: SELinux: Converting 2753 SID table entries... Nov 23 04:09:18 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 23 04:09:18 localhost kernel: SELinux: policy capability open_perms=1 Nov 23 04:09:18 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 23 04:09:18 localhost kernel: SELinux: policy capability always_check_network=0 Nov 23 04:09:18 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 23 04:09:18 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 23 04:09:18 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 23 04:09:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9231 DF PROTO=TCP SPT=47990 DPT=9102 SEQ=725927483 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD02E00000000001030307) Nov 23 04:09:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=18873 DF PROTO=TCP SPT=43202 DPT=9882 SEQ=4203088774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD12200000000001030307) Nov 23 04:09:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25452 DF PROTO=TCP SPT=58466 DPT=9101 SEQ=3728500903 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD1BCC0000000001030307) Nov 23 04:09:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54929 DF PROTO=TCP SPT=45208 DPT=9105 SEQ=1747848640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD1C6B0000000001030307) Nov 23 04:09:30 localhost sshd[120661]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:09:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25454 DF PROTO=TCP SPT=58466 DPT=9101 SEQ=3728500903 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD27E00000000001030307) Nov 23 04:09:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52400 DF PROTO=TCP SPT=37434 DPT=9100 SEQ=1444354613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD34210000000001030307) Nov 23 04:09:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40258 DF PROTO=TCP SPT=47692 DPT=9100 SEQ=2806583860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD40A00000000001030307) Nov 23 04:09:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 
MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22938 DF PROTO=TCP SPT=54772 DPT=9882 SEQ=48654059 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD4EE00000000001030307) Nov 23 04:09:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22939 DF PROTO=TCP SPT=54772 DPT=9882 SEQ=48654059 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD56E00000000001030307) Nov 23 04:09:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9233 DF PROTO=TCP SPT=47990 DPT=9102 SEQ=725927483 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD64200000000001030307) Nov 23 04:09:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21549 DF PROTO=TCP SPT=47754 DPT=9102 SEQ=1955267961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD77E00000000001030307) Nov 23 04:09:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22941 DF PROTO=TCP SPT=54772 DPT=9882 SEQ=48654059 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD86200000000001030307) Nov 23 04:09:57 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=17 res=1 Nov 23 04:09:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10183 DF PROTO=TCP SPT=39124 DPT=9101 SEQ=4154972505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2BD90EA0000000001030307) Nov 23 04:09:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2605 DF PROTO=TCP SPT=41726 DPT=9105 SEQ=737562409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD919B0000000001030307) Nov 23 04:10:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2607 DF PROTO=TCP SPT=41726 DPT=9105 SEQ=737562409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BD9DA00000000001030307) Nov 23 04:10:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55186 DF PROTO=TCP SPT=40146 DPT=9100 SEQ=3954503623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BDAA200000000001030307) Nov 23 04:10:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36949 DF PROTO=TCP SPT=42594 DPT=9100 SEQ=3324275901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BDB5E00000000001030307) Nov 23 04:10:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12221 DF PROTO=TCP SPT=48334 DPT=9882 SEQ=1978559223 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BDC4200000000001030307) Nov 23 04:10:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12222 DF PROTO=TCP SPT=48334 DPT=9882 SEQ=1978559223 ACK=0 WINDOW=32640 RES=0x00 
SYN URGP=0 OPT (020405500402080AD2BDCC200000000001030307) Nov 23 04:10:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21551 DF PROTO=TCP SPT=47754 DPT=9102 SEQ=1955267961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BDD8200000000001030307) Nov 23 04:10:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42051 DF PROTO=TCP SPT=51010 DPT=9102 SEQ=2763510964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BDED200000000001030307) Nov 23 04:10:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12224 DF PROTO=TCP SPT=48334 DPT=9882 SEQ=1978559223 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BDFC200000000001030307) Nov 23 04:10:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53453 DF PROTO=TCP SPT=44628 DPT=9101 SEQ=1408718102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE06190000000001030307) Nov 23 04:10:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8043 DF PROTO=TCP SPT=60518 DPT=9105 SEQ=1221331075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE06CB0000000001030307) Nov 23 04:10:30 localhost python3.9[120818]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:10:31 localhost python3.9[120910]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:10:31 localhost python3.9[120983]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889030.572388-427-229022161023713/.source.fact _original_basename=.96v1mfal follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:10:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53455 DF PROTO=TCP SPT=44628 DPT=9101 SEQ=1408718102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE12200000000001030307) Nov 23 04:10:32 localhost python3.9[121073]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:10:34 localhost python3.9[121171]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 04:10:35 localhost python3.9[121225]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 
'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 23 04:10:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40261 DF PROTO=TCP SPT=47692 DPT=9100 SEQ=2806583860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE1E200000000001030307) Nov 23 04:10:38 localhost systemd[1]: Reloading. Nov 23 04:10:38 localhost systemd-sysv-generator[121260]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:10:38 localhost systemd-rc-local-generator[121257]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:10:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:10:38 localhost systemd[1]: Queuing reload/restart jobs for marked units… Nov 23 04:10:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44916 DF PROTO=TCP SPT=54794 DPT=9100 SEQ=1547639768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE2B200000000001030307) Nov 23 04:10:40 localhost python3.9[121365]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:10:42 localhost python3.9[121604]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False Nov 23 04:10:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40894 DF PROTO=TCP SPT=43820 DPT=9882 SEQ=2384301406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE39610000000001030307) Nov 23 04:10:43 localhost python3.9[121696]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None Nov 23 04:10:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40895 DF PROTO=TCP SPT=43820 DPT=9882 SEQ=2384301406 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE41600000000001030307) Nov 23 04:10:45 localhost python3.9[121789]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:10:46 localhost python3.9[121881]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None Nov 23 04:10:48 localhost python3.9[121973]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:10:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42053 DF PROTO=TCP SPT=51010 DPT=9102 SEQ=2763510964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE4E200000000001030307) Nov 23 04:10:48 localhost python3.9[122065]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:10:53 localhost python3.9[122138]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1763889048.4341-751-174432665865727/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:10:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7635 DF PROTO=TCP SPT=52454 DPT=9102 SEQ=3828433788 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE62600000000001030307) Nov 23 04:10:54 localhost python3.9[122231]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:10:56 localhost python3.9[122325]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None Nov 23 04:10:57 localhost python3.9[122418]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None Nov 23 04:10:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40897 DF PROTO=TCP SPT=43820 DPT=9882 SEQ=2384301406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE72200000000001030307) Nov 23 04:10:58 localhost python3.9[122511]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Nov 23 04:10:59 localhost python3.9[122609]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t 
seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None Nov 23 04:10:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4894 DF PROTO=TCP SPT=50386 DPT=9101 SEQ=1257906011 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE7B4A0000000001030307) Nov 23 04:10:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32158 DF PROTO=TCP SPT=47962 DPT=9105 SEQ=689963143 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE7BFB0000000001030307) Nov 23 04:11:00 localhost python3.9[122757]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 23 04:11:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4896 DF PROTO=TCP SPT=50386 DPT=9101 SEQ=1257906011 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE87610000000001030307) Nov 23 04:11:04 localhost python3.9[122907]: ansible-ansible.builtin.file Invoked with 
group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:11:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36952 DF PROTO=TCP SPT=42594 DPT=9100 SEQ=3324275901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BE94200000000001030307) Nov 23 04:11:08 localhost python3.9[123014]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:11:09 localhost python3.9[123087]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889068.0918384-1024-236790695428880/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 23 04:11:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14529 DF PROTO=TCP SPT=35280 DPT=9100 SEQ=1392506286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEA0200000000001030307) Nov 23 04:11:10 localhost python3.9[123179]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted 
daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:11:10 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Nov 23 04:11:10 localhost systemd[1]: Stopped Load Kernel Modules. Nov 23 04:11:10 localhost systemd[1]: Stopping Load Kernel Modules... Nov 23 04:11:10 localhost systemd[1]: Starting Load Kernel Modules... Nov 23 04:11:10 localhost systemd-modules-load[123183]: Module 'msr' is built in Nov 23 04:11:10 localhost systemd[1]: Finished Load Kernel Modules. Nov 23 04:11:11 localhost python3.9[123275]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:11:12 localhost python3.9[123348]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889071.1764095-1093-17762772445675/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 23 04:11:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2367 DF PROTO=TCP SPT=53392 DPT=9882 SEQ=3621492434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEAE600000000001030307) Nov 23 04:11:13 localhost python3.9[123440]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] 
download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 23 04:11:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2368 DF PROTO=TCP SPT=53392 DPT=9882 SEQ=3621492434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEB6610000000001030307) Nov 23 04:11:17 localhost python3.9[123532]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:11:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7637 DF PROTO=TCP SPT=52454 DPT=9102 SEQ=3828433788 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEC2200000000001030307) Nov 23 04:11:18 localhost python3.9[123624]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile Nov 23 04:11:19 localhost python3.9[123714]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:11:20 localhost python3.9[123806]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:11:21 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Nov 23 04:11:21 localhost systemd[1]: tuned.service: Deactivated successfully. 
Nov 23 04:11:21 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Nov 23 04:11:21 localhost systemd[1]: tuned.service: Consumed 1.684s CPU time, no IO. Nov 23 04:11:21 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Nov 23 04:11:22 localhost systemd[1]: Started Dynamic System Tuning Daemon. Nov 23 04:11:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34201 DF PROTO=TCP SPT=40340 DPT=9102 SEQ=836705543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BED7A10000000001030307) Nov 23 04:11:23 localhost python3.9[123909]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Nov 23 04:11:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2370 DF PROTO=TCP SPT=53392 DPT=9882 SEQ=3621492434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEE6200000000001030307) Nov 23 04:11:27 localhost python3.9[124001]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:11:27 localhost systemd[1]: Reloading. Nov 23 04:11:27 localhost systemd-rc-local-generator[124027]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:11:27 localhost systemd-sysv-generator[124031]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:11:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:11:28 localhost python3.9[124131]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:11:28 localhost systemd[1]: Reloading. Nov 23 04:11:29 localhost systemd-rc-local-generator[124157]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:11:29 localhost systemd-sysv-generator[124161]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:11:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:11:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37934 DF PROTO=TCP SPT=50862 DPT=9101 SEQ=128498723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEF07A0000000001030307) Nov 23 04:11:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20275 DF PROTO=TCP SPT=60654 DPT=9105 SEQ=437288969 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEF12B0000000001030307) Nov 23 04:11:32 localhost python3.9[124261]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:11:32 localhost python3.9[124354]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False 
expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:11:32 localhost kernel: Adding 1048572k swap on /swap. Priority:-2 extents:1 across:1048572k FS Nov 23 04:11:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37936 DF PROTO=TCP SPT=50862 DPT=9101 SEQ=128498723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BEFCA00000000001030307) Nov 23 04:11:33 localhost python3.9[124447]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:11:35 localhost python3.9[124546]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:11:36 localhost python3.9[124639]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:11:36 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully. Nov 23 04:11:36 localhost systemd[1]: Stopped Apply Kernel Variables. Nov 23 04:11:36 localhost systemd[1]: Stopping Apply Kernel Variables... Nov 23 04:11:36 localhost systemd[1]: Starting Apply Kernel Variables... Nov 23 04:11:36 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Nov 23 04:11:36 localhost systemd[1]: Finished Apply Kernel Variables. 
Nov 23 04:11:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44919 DF PROTO=TCP SPT=54794 DPT=9100 SEQ=1547639768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF0A200000000001030307) Nov 23 04:11:36 localhost systemd-logind[761]: Session 38 logged out. Waiting for processes to exit. Nov 23 04:11:36 localhost systemd[1]: session-38.scope: Deactivated successfully. Nov 23 04:11:36 localhost systemd[1]: session-38.scope: Consumed 1min 55.100s CPU time. Nov 23 04:11:36 localhost systemd-logind[761]: Removed session 38. Nov 23 04:11:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15692 DF PROTO=TCP SPT=38776 DPT=9100 SEQ=4206993237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF15610000000001030307) Nov 23 04:11:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51629 DF PROTO=TCP SPT=50654 DPT=9882 SEQ=3001749266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF1FA90000000001030307) Nov 23 04:11:42 localhost sshd[124659]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:11:42 localhost systemd-logind[761]: New session 39 of user zuul. Nov 23 04:11:42 localhost systemd[1]: Started Session 39 of User zuul. 
Nov 23 04:11:43 localhost python3.9[124752]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:11:44 localhost python3.9[124846]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:11:44 localhost sshd[124851]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:11:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51631 DF PROTO=TCP SPT=50654 DPT=9882 SEQ=3001749266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF2BA00000000001030307) Nov 23 04:11:46 localhost python3.9[124944]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:11:47 localhost python3.9[125035]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:11:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34203 DF PROTO=TCP SPT=40340 DPT=9102 SEQ=836705543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF38200000000001030307) Nov 23 04:11:48 localhost python3.9[125131]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 04:11:49 localhost python3.9[125185]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False 
allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 23 04:11:53 localhost python3.9[125279]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 04:11:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8145 DF PROTO=TCP SPT=37596 DPT=9102 SEQ=3339985482 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF4CA10000000001030307) Nov 23 04:11:54 localhost python3.9[125434]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:11:55 localhost python3.9[125526]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:11:56 localhost python3.9[125629]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True 
get_selinux_context=False Nov 23 04:11:56 localhost python3.9[125677]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:11:57 localhost python3.9[125769]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:11:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51633 DF PROTO=TCP SPT=50654 DPT=9882 SEQ=3001749266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF5C210000000001030307) Nov 23 04:11:57 localhost python3.9[125842]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889116.7825787-324-136294773177031/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 23 04:11:58 localhost python3.9[125934]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 
backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 23 04:11:58 localhost systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. Nov 23 04:11:58 localhost systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 23 04:11:58 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 04:11:58 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 04:11:58 localhost rsyslogd[760]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 04:11:59 localhost python3.9[126027]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 23 04:11:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39234 DF PROTO=TCP SPT=34904 DPT=9101 SEQ=293917994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF65AA0000000001030307) Nov 23 04:11:59 localhost python3.9[126119]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 23 04:11:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48078 DF PROTO=TCP SPT=49596 DPT=9105 SEQ=2664954968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF665B0000000001030307) Nov 23 04:12:00 localhost python3.9[126211]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True 
no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 23 04:12:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 04:12:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 04:12:01 localhost python3.9[126301]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:12:02 localhost python3.9[126395]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None 
releasever=None state=None Nov 23 04:12:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48080 DF PROTO=TCP SPT=49596 DPT=9105 SEQ=2664954968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF72610000000001030307) Nov 23 04:12:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 04:12:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 5736 writes, 25K keys, 5736 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 04:12:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14532 DF PROTO=TCP SPT=35280 DPT=9100 SEQ=1392506286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF7E210000000001030307) Nov 23 04:12:06 localhost python3.9[126521]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None 
disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 23 04:12:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64618 DF PROTO=TCP SPT=55576 DPT=9100 SEQ=2967965951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF8AA00000000001030307) Nov 23 04:12:10 localhost python3.9[126660]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 23 04:12:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42123 DF PROTO=TCP SPT=52798 DPT=9882 SEQ=3482548608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BF94D70000000001030307) Nov 23 04:12:14 localhost python3.9[126760]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 
23 04:12:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42125 DF PROTO=TCP SPT=52798 DPT=9882 SEQ=3482548608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFA0E10000000001030307) Nov 23 04:12:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42126 DF PROTO=TCP SPT=52798 DPT=9882 SEQ=3482548608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFB0A00000000001030307) Nov 23 04:12:19 localhost python3.9[126854]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 23 04:12:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42907 DF PROTO=TCP SPT=38660 DPT=9102 SEQ=3011424843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFC1E00000000001030307) Nov 23 04:12:23 localhost python3.9[126948]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False 
update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 23 04:12:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42127 DF PROTO=TCP SPT=52798 DPT=9882 SEQ=3482548608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFD0200000000001030307) Nov 23 04:12:27 localhost python3.9[127042]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 23 04:12:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19981 DF PROTO=TCP SPT=56074 DPT=9101 SEQ=1735143310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFDADA0000000001030307) Nov 23 04:12:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11134 DF PROTO=TCP SPT=43096 DPT=9105 SEQ=4290280602 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2BFDB8B0000000001030307) Nov 23 04:12:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19983 DF PROTO=TCP SPT=56074 DPT=9101 SEQ=1735143310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFE6E00000000001030307) Nov 23 04:12:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15695 DF PROTO=TCP SPT=38776 DPT=9100 SEQ=4206993237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFF4200000000001030307) Nov 23 04:12:38 localhost sshd[127054]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:12:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57766 DF PROTO=TCP SPT=43322 DPT=9100 SEQ=2162066529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2BFFFE00000000001030307) Nov 23 04:12:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53595 DF PROTO=TCP SPT=40562 DPT=9882 SEQ=853935234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C00E200000000001030307) Nov 23 04:12:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53596 DF PROTO=TCP SPT=40562 DPT=9882 SEQ=853935234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C016200000000001030307) Nov 23 04:12:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=42909 DF PROTO=TCP SPT=38660 DPT=9102 SEQ=3011424843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C022200000000001030307) Nov 23 04:12:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39488 DF PROTO=TCP SPT=48560 DPT=9102 SEQ=3937336017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C037210000000001030307) Nov 23 04:12:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53598 DF PROTO=TCP SPT=40562 DPT=9882 SEQ=853935234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C046200000000001030307) Nov 23 04:12:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55379 DF PROTO=TCP SPT=49582 DPT=9101 SEQ=171878102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0500A0000000001030307) Nov 23 04:12:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1152 DF PROTO=TCP SPT=41672 DPT=9105 SEQ=2488236121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C050BB0000000001030307) Nov 23 04:13:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55381 DF PROTO=TCP SPT=49582 DPT=9101 SEQ=171878102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C05C210000000001030307) Nov 23 04:13:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 
PREC=0x00 TTL=62 ID=64621 DF PROTO=TCP SPT=55576 DPT=9100 SEQ=2967965951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C068200000000001030307) Nov 23 04:13:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50431 DF PROTO=TCP SPT=53328 DPT=9100 SEQ=172578527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C074E10000000001030307) Nov 23 04:13:10 localhost python3.9[127234]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:13:10 localhost python3.9[127384]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:13:11 localhost python3.9[127487]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1763889190.1948092-723-266870941394687/.source.json _original_basename=.a1mc2uzn follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:13:11 localhost podman[127542]: Nov 23 04:13:11 localhost podman[127542]: 2025-11-23 09:13:11.368458539 +0000 UTC m=+0.057860286 container create 1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_hypatia, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, release=553, GIT_BRANCH=main, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.33.12, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, ceph=True) Nov 23 04:13:11 localhost systemd[1]: Started libpod-conmon-1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4.scope. Nov 23 04:13:11 localhost systemd[1]: Started libcrun container. 
Nov 23 04:13:11 localhost podman[127542]: 2025-11-23 09:13:11.436078307 +0000 UTC m=+0.125480074 container init 1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_hypatia, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_CLEAN=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, architecture=x86_64, io.buildah.version=1.33.12) Nov 23 04:13:11 localhost podman[127542]: 2025-11-23 09:13:11.343137714 +0000 UTC m=+0.032539471 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:13:11 localhost systemd[1]: tmp-crun.OYtmwB.mount: Deactivated successfully. 
Nov 23 04:13:11 localhost podman[127542]: 2025-11-23 09:13:11.448276276 +0000 UTC m=+0.137678043 container start 1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_hypatia, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux , release=553, GIT_BRANCH=main, vcs-type=git, distribution-scope=public, RELEASE=main) Nov 23 04:13:11 localhost podman[127542]: 2025-11-23 09:13:11.448559114 +0000 UTC m=+0.137960911 container attach 1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_hypatia, io.openshift.expose-services=, ceph=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, name=rhceph, architecture=x86_64, io.buildah.version=1.33.12, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=) Nov 23 04:13:11 localhost compassionate_hypatia[127559]: 167 167 Nov 23 04:13:11 localhost systemd[1]: libpod-1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4.scope: Deactivated successfully. Nov 23 04:13:11 localhost podman[127542]: 2025-11-23 09:13:11.452546127 +0000 UTC m=+0.141947874 container died 1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_hypatia, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, RELEASE=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., name=rhceph, version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:13:11 localhost podman[127564]: 2025-11-23 09:13:11.532392735 +0000 UTC m=+0.073761520 container remove 1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_hypatia, release=553, name=rhceph, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, ceph=True, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=) Nov 23 04:13:11 localhost systemd[1]: libpod-conmon-1979ac93966f601b4d0314120fab0d054de8eead080b4f2c1540adb067c3cdf4.scope: Deactivated successfully. 
Nov 23 04:13:11 localhost podman[127584]:
Nov 23 04:13:11 localhost podman[127584]: 2025-11-23 09:13:11.746536587 +0000 UTC m=+0.084376948 container create c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_kowalevski, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, CEPH_POINT_RELEASE=, GIT_CLEAN=True, maintainer=Guillaume Abrioux )
Nov 23 04:13:11 localhost systemd[1]: Started libpod-conmon-c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398.scope.
Nov 23 04:13:11 localhost podman[127584]: 2025-11-23 09:13:11.713181652 +0000 UTC m=+0.051022043 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 04:13:11 localhost systemd[1]: Started libcrun container.
Nov 23 04:13:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54c1ceb1f7492dd192892745c503ec3b93f62537be87048e862bb2e78f0d58c9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 04:13:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54c1ceb1f7492dd192892745c503ec3b93f62537be87048e862bb2e78f0d58c9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 04:13:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54c1ceb1f7492dd192892745c503ec3b93f62537be87048e862bb2e78f0d58c9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 04:13:11 localhost podman[127584]: 2025-11-23 09:13:11.829320025 +0000 UTC m=+0.167160386 container init c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_kowalevski, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , release=553, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc.)
Nov 23 04:13:11 localhost podman[127584]: 2025-11-23 09:13:11.83915268 +0000 UTC m=+0.176993041 container start c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_kowalevski, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, release=553, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, distribution-scope=public, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., RELEASE=main)
Nov 23 04:13:11 localhost podman[127584]: 2025-11-23 09:13:11.83948241 +0000 UTC m=+0.177322811 container attach c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_kowalevski, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, name=rhceph, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, release=553, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 04:13:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11381 DF PROTO=TCP SPT=44754 DPT=9882 SEQ=3024815876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C07F370000000001030307)
Nov 23 04:13:12 localhost systemd[1]: var-lib-containers-storage-overlay-65e17686d33e0674e72e06c177bb6dbf46f78a6084367725a7d3bbdbf9108ce5-merged.mount: Deactivated successfully.
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: [
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: {
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "available": false,
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "ceph_device": false,
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "device_id": "QEMU_DVD-ROM_QM00001",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "lsm_data": {},
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "lvs": [],
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "path": "/dev/sr0",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "rejected_reasons": [
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "Has a FileSystem",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "Insufficient space (<5GB)"
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: ],
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "sys_api": {
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "actuators": null,
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "device_nodes": "sr0",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "human_readable_size": "482.00 KB",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "id_bus": "ata",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "model": "QEMU DVD-ROM",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "nr_requests": "2",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "partitions": {},
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "path": "/dev/sr0",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "removable": "1",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "rev": "2.5+",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "ro": "0",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "rotational": "1",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "sas_address": "",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "sas_device_handle": "",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "scheduler_mode": "mq-deadline",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "sectors": 0,
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "sectorsize": "2048",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "size": 493568.0,
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "support_discard": "0",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "type": "disk",
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: "vendor": "QEMU"
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: }
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: }
Nov 23 04:13:12 localhost gallant_kowalevski[127612]: ]
Nov 23 04:13:12 localhost systemd[1]: libpod-c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398.scope: Deactivated successfully.
Nov 23 04:13:12 localhost podman[127584]: 2025-11-23 09:13:12.793516624 +0000 UTC m=+1.131357015 container died c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_kowalevski, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vendor=Red Hat, Inc., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.component=rhceph-container)
Nov 23 04:13:12 localhost systemd[1]: tmp-crun.dP8qce.mount: Deactivated successfully.
Nov 23 04:13:12 localhost systemd[1]: var-lib-containers-storage-overlay-54c1ceb1f7492dd192892745c503ec3b93f62537be87048e862bb2e78f0d58c9-merged.mount: Deactivated successfully.
Nov 23 04:13:12 localhost podman[129211]: 2025-11-23 09:13:12.875828387 +0000 UTC m=+0.074073768 container remove c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_kowalevski, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux , ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55)
Nov 23 04:13:12 localhost systemd[1]: libpod-conmon-c94a36934ef3d561b047b86ef9b260ae680c5ec4f1d925c300947eb872b83398.scope: Deactivated successfully.
Nov 23 04:13:13 localhost python3.9[129278]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 23 04:13:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11383 DF PROTO=TCP SPT=44754 DPT=9882 SEQ=3024815876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C08B200000000001030307)
Nov 23 04:13:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39490 DF PROTO=TCP SPT=48560 DPT=9102 SEQ=3937336017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C098210000000001030307)
Nov 23 04:13:19 localhost podman[129296]: 2025-11-23 09:13:13.555563753 +0000 UTC m=+0.031111626 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 23 04:13:20 localhost python3.9[129507]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 23 04:13:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10025 DF PROTO=TCP SPT=47812 DPT=9102 SEQ=1161413978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0AC610000000001030307)
Nov 23 04:13:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11385 DF PROTO=TCP SPT=44754 DPT=9882 SEQ=3024815876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0BC200000000001030307)
Nov 23 04:13:28 localhost podman[129520]: 2025-11-23 09:13:21.083317802 +0000 UTC m=+0.033165160 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 23 04:13:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27869 DF PROTO=TCP SPT=51206 DPT=9101 SEQ=1671673324 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0C53A0000000001030307)
Nov 23 04:13:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29626 DF PROTO=TCP SPT=43566 DPT=9105 SEQ=1467313874 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0C5EB0000000001030307)
Nov 23 04:13:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27871 DF PROTO=TCP SPT=51206 DPT=9101 SEQ=1671673324 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0D1610000000001030307)
Nov 23 04:13:33 localhost python3.9[129720]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 23 04:13:34 localhost podman[129733]: 2025-11-23 09:13:33.164923116 +0000 UTC m=+0.037336220 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Nov 23 04:13:35 localhost python3.9[129895]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 23 04:13:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57769 DF PROTO=TCP SPT=43322 DPT=9100 SEQ=2162066529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0DE200000000001030307)
Nov 23 04:13:37 localhost podman[129908]: 2025-11-23 09:13:35.958514722 +0000 UTC m=+0.043345236 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 04:13:38 localhost python3.9[130073]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 23 04:13:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65212 DF PROTO=TCP SPT=49614 DPT=9100 SEQ=143119108 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0EA200000000001030307)
Nov 23 04:13:41 localhost podman[130086]: 2025-11-23 09:13:38.366794256 +0000 UTC m=+0.044463220 image pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 23 04:13:42 localhost python3.9[130263]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 23 04:13:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48386 DF PROTO=TCP SPT=56700 DPT=9882 SEQ=1000996218 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C0F8610000000001030307)
Nov 23 04:13:44 localhost podman[130276]: 2025-11-23 09:13:42.617797371 +0000 UTC m=+0.046799713 image pull quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Nov 23 04:13:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48387 DF PROTO=TCP SPT=56700 DPT=9882 SEQ=1000996218 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C100600000000001030307)
Nov 23 04:13:45 localhost systemd[1]: session-39.scope: Deactivated successfully.
Nov 23 04:13:45 localhost systemd[1]: session-39.scope: Consumed 1min 27.874s CPU time.
Nov 23 04:13:45 localhost systemd-logind[761]: Session 39 logged out. Waiting for processes to exit.
Nov 23 04:13:45 localhost systemd-logind[761]: Removed session 39.
Nov 23 04:13:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10027 DF PROTO=TCP SPT=47812 DPT=9102 SEQ=1161413978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C10C200000000001030307)
Nov 23 04:13:51 localhost sshd[130568]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 04:13:51 localhost systemd-logind[761]: New session 40 of user zuul.
Nov 23 04:13:51 localhost systemd[1]: Started Session 40 of User zuul.
Nov 23 04:13:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30671 DF PROTO=TCP SPT=52646 DPT=9102 SEQ=2360694826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C121600000000001030307)
Nov 23 04:13:54 localhost python3.9[130719]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 04:13:56 localhost python3.9[130815]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 23 04:13:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48389 DF PROTO=TCP SPT=56700 DPT=9882 SEQ=1000996218 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C130200000000001030307)
Nov 23 04:13:57 localhost python3.9[130908]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 04:13:58 localhost python3.9[130962]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 23 04:13:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2348 DF PROTO=TCP SPT=56584 DPT=9101 SEQ=3911532318 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C13A6A0000000001030307)
Nov 23 04:13:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46040 DF PROTO=TCP SPT=42082 DPT=9105 SEQ=719711906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C13B1B0000000001030307)
Nov 23 04:14:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46042 DF PROTO=TCP SPT=42082 DPT=9105 SEQ=719711906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C147200000000001030307)
Nov 23 04:14:03 localhost python3.9[131056]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 04:14:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50434 DF PROTO=TCP SPT=53328 DPT=9100 SEQ=172578527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C154210000000001030307)
Nov 23 04:14:07 localhost python3.9[131150]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 04:14:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14709 DF PROTO=TCP SPT=53830 DPT=9100 SEQ=3683994720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C15F600000000001030307)
Nov 23 04:14:09 localhost python3.9[131243]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 04:14:11 localhost python3.9[131335]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 23 04:14:12 localhost kernel: SELinux: Converting 2755 SID table entries...
Nov 23 04:14:12 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 23 04:14:12 localhost kernel: SELinux: policy capability open_perms=1
Nov 23 04:14:12 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 23 04:14:12 localhost kernel: SELinux: policy capability always_check_network=0
Nov 23 04:14:12 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 23 04:14:12 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 23 04:14:12 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 23 04:14:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53861 DF PROTO=TCP SPT=40180 DPT=9882 SEQ=2067870612 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C16DA10000000001030307)
Nov 23 04:14:13 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=18 res=1
Nov 23 04:14:14 localhost python3.9[131491]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 04:14:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53862 DF PROTO=TCP SPT=40180 DPT=9882 SEQ=2067870612 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C175A00000000001030307)
Nov 23 04:14:15 localhost python3.9[131589]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 04:14:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30673 DF PROTO=TCP SPT=52646 DPT=9102 SEQ=2360694826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C182210000000001030307)
Nov 23 04:14:20 localhost python3.9[131698]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:14:22 localhost python3.9[131943]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 04:14:22 localhost python3.9[132033]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 04:14:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58547 DF PROTO=TCP SPT=36646 DPT=9102 SEQ=2258823922 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C196A00000000001030307)
Nov 23 04:14:23 localhost python3.9[132127]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 04:14:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53864 DF PROTO=TCP SPT=40180 DPT=9882 SEQ=2067870612 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1A6200000000001030307)
Nov 23 04:14:27 localhost python3.9[132221]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 04:14:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00
TTL=62 ID=29136 DF PROTO=TCP SPT=56520 DPT=9101 SEQ=3907057434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1AF9A0000000001030307) Nov 23 04:14:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13123 DF PROTO=TCP SPT=36824 DPT=9105 SEQ=1887706419 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1B04B0000000001030307) Nov 23 04:14:31 localhost python3.9[132315]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Nov 23 04:14:31 localhost systemd[1]: Reloading. Nov 23 04:14:31 localhost systemd-sysv-generator[132350]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:14:31 localhost systemd-rc-local-generator[132346]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:14:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:14:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29138 DF PROTO=TCP SPT=56520 DPT=9101 SEQ=3907057434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1BBA00000000001030307) Nov 23 04:14:33 localhost python3.9[132447]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:14:34 localhost python3.9[132539]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:14:35 localhost python3.9[132633]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:14:35 localhost python3.9[132725]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None 
seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:14:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65215 DF PROTO=TCP SPT=49614 DPT=9100 SEQ=143119108 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1C8210000000001030307) Nov 23 04:14:36 localhost python3.9[132817]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:14:37 localhost python3.9[132890]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889276.4766986-564-224444163799672/.source _original_basename=.uz1rk4or follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:14:38 localhost python3.9[132982]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:14:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7792 DF PROTO=TCP SPT=60806 DPT=9100 SEQ=202949221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1D4A00000000001030307) Nov 23 04:14:39 localhost 
python3.9[133074]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={} Nov 23 04:14:40 localhost python3.9[133166]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:14:41 localhost python3.9[133258]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:14:41 localhost python3.9[133331]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889280.6226838-690-50264484314571/.source.yaml _original_basename=.2210qgsh follow=False checksum=06d744ebe702728c19f6d1a8f97158d086012058 force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:14:41 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:72:a3:51 MACPROTO=0800 SRC=208.81.1.244 DST=38.102.83.198 LEN=308 TOS=0x08 PREC=0x20 TTL=54 ID=65320 DF PROTO=TCP SPT=443 DPT=52082 SEQ=3804090244 ACK=3369547015 WINDOW=131 RES=0x00 ACK URGP=0 OPT (0101080AA82F48A0D9E26288) Nov 23 04:14:42 localhost python3.9[133423]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml Nov 23 04:14:43 localhost ansible-async_wrapper.py[133528]: Invoked with j676011251751 300 
/home/zuul/.ansible/tmp/ansible-tmp-1763889282.8782218-762-238711728402442/AnsiballZ_edpm_os_net_config.py _ Nov 23 04:14:43 localhost ansible-async_wrapper.py[133531]: Starting module and watcher Nov 23 04:14:43 localhost ansible-async_wrapper.py[133531]: Start watching 133532 (300) Nov 23 04:14:43 localhost ansible-async_wrapper.py[133532]: Start module (133532) Nov 23 04:14:43 localhost ansible-async_wrapper.py[133528]: Return async_wrapper task started. Nov 23 04:14:43 localhost python3.9[133533]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False Nov 23 04:14:44 localhost ansible-async_wrapper.py[133532]: Module complete (133532) Nov 23 04:14:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25870 DF PROTO=TCP SPT=59884 DPT=9882 SEQ=81733405 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1EAE00000000001030307) Nov 23 04:14:47 localhost python3.9[133625]: ansible-ansible.legacy.async_status Invoked with jid=j676011251751.133528 mode=status _async_dir=/root/.ansible_async Nov 23 04:14:47 localhost python3.9[133684]: ansible-ansible.legacy.async_status Invoked with jid=j676011251751.133528 mode=cleanup _async_dir=/root/.ansible_async Nov 23 04:14:48 localhost python3.9[133776]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:14:48 localhost ansible-async_wrapper.py[133531]: Done in kid B. 
Nov 23 04:14:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25871 DF PROTO=TCP SPT=59884 DPT=9882 SEQ=81733405 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C1FAA10000000001030307) Nov 23 04:14:49 localhost python3.9[133849]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889288.1842752-828-485485887441/.source.returncode _original_basename=.dsus64tb follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:14:49 localhost python3.9[133941]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:14:50 localhost python3.9[134014]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889289.448784-876-242957066281776/.source.cfg _original_basename=.t81fwlb4 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:14:51 localhost python3.9[134106]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:14:51 
localhost systemd[1]: Reloading Network Manager... Nov 23 04:14:51 localhost NetworkManager[5975]: [1763889291.4559] audit: op="reload" arg="0" pid=134110 uid=0 result="success" Nov 23 04:14:51 localhost NetworkManager[5975]: [1763889291.4571] config: signal: SIGHUP (no changes from disk) Nov 23 04:14:51 localhost systemd[1]: Reloaded Network Manager. Nov 23 04:14:51 localhost systemd[1]: session-40.scope: Deactivated successfully. Nov 23 04:14:51 localhost systemd[1]: session-40.scope: Consumed 34.908s CPU time. Nov 23 04:14:51 localhost systemd-logind[761]: Session 40 logged out. Waiting for processes to exit. Nov 23 04:14:51 localhost systemd-logind[761]: Removed session 40. Nov 23 04:14:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52279 DF PROTO=TCP SPT=57730 DPT=9102 SEQ=2741214403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C20BE10000000001030307) Nov 23 04:14:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25872 DF PROTO=TCP SPT=59884 DPT=9882 SEQ=81733405 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C21A200000000001030307) Nov 23 04:14:57 localhost sshd[134125]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:14:57 localhost systemd-logind[761]: New session 41 of user zuul. Nov 23 04:14:57 localhost systemd[1]: Started Session 41 of User zuul. 
Nov 23 04:14:58 localhost python3.9[134218]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:14:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12195 DF PROTO=TCP SPT=58778 DPT=9101 SEQ=2273225292 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C224CA0000000001030307) Nov 23 04:14:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11887 DF PROTO=TCP SPT=58770 DPT=9105 SEQ=265828421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2257B0000000001030307) Nov 23 04:15:00 localhost python3.9[134312]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 04:15:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12197 DF PROTO=TCP SPT=58778 DPT=9101 SEQ=2273225292 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C230E00000000001030307) Nov 23 04:15:03 localhost python3.9[134465]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:15:04 localhost systemd-logind[761]: Session 41 logged out. Waiting for processes to exit. Nov 23 04:15:04 localhost systemd[1]: session-41.scope: Deactivated successfully. Nov 23 04:15:04 localhost systemd[1]: session-41.scope: Consumed 2.160s CPU time. 
Nov 23 04:15:04 localhost systemd-logind[761]: Removed session 41. Nov 23 04:15:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14712 DF PROTO=TCP SPT=53830 DPT=9100 SEQ=3683994720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C23E200000000001030307) Nov 23 04:15:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37579 DF PROTO=TCP SPT=59010 DPT=9100 SEQ=3929823100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C249A00000000001030307) Nov 23 04:15:10 localhost sshd[134481]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:15:10 localhost systemd-logind[761]: New session 42 of user zuul. Nov 23 04:15:10 localhost systemd[1]: Started Session 42 of User zuul. Nov 23 04:15:11 localhost python3.9[134574]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:15:11 localhost sshd[134593]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:15:12 localhost python3.9[134669]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:15:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56783 DF PROTO=TCP SPT=49390 DPT=9882 SEQ=2287916637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C257E00000000001030307) Nov 23 04:15:14 localhost python3.9[134765]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 04:15:14 localhost kernel: DROPPING: IN=br-ex 
OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56784 DF PROTO=TCP SPT=49390 DPT=9882 SEQ=2287916637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C25FE10000000001030307) Nov 23 04:15:15 localhost python3.9[134820]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 23 04:15:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52281 DF PROTO=TCP SPT=57730 DPT=9102 SEQ=2741214403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C26C200000000001030307) Nov 23 04:15:19 localhost python3.9[135042]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 04:15:21 localhost python3.9[135197]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:15:22 localhost python3.9[135289]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect 
podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:15:23 localhost python3.9[135393]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:15:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58996 DF PROTO=TCP SPT=42950 DPT=9102 SEQ=1523901771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C281210000000001030307) Nov 23 04:15:23 localhost python3.9[135441]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:15:24 localhost python3.9[135533]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:15:24 localhost python3.9[135581]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:15:25 localhost python3.9[135673]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 23 04:15:26 localhost python3.9[135765]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 23 04:15:27 localhost python3.9[135857]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 23 04:15:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56786 DF PROTO=TCP SPT=49390 DPT=9882 SEQ=2287916637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C290200000000001030307) Nov 23 04:15:27 
localhost python3.9[135949]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 23 04:15:29 localhost python3.9[136041]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 23 04:15:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22859 DF PROTO=TCP SPT=37668 DPT=9101 SEQ=1362344812 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C299FA0000000001030307) Nov 23 04:15:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23427 DF PROTO=TCP SPT=57372 DPT=9105 SEQ=3036593496 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C29AAB0000000001030307) Nov 23 04:15:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22861 DF PROTO=TCP SPT=37668 DPT=9101 
SEQ=1362344812 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2A6200000000001030307) Nov 23 04:15:34 localhost python3.9[136135]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:15:35 localhost python3.9[136229]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:15:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7795 DF PROTO=TCP SPT=60806 DPT=9100 SEQ=202949221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2B2200000000001030307) Nov 23 04:15:35 localhost python3.9[136321]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:15:36 localhost auditd[727]: Audit daemon rotating log files Nov 23 04:15:36 localhost python3.9[136413]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:15:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65316 DF PROTO=TCP SPT=36684 DPT=9100 SEQ=1024540960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2BEE00000000001030307) Nov 23 04:15:39 localhost python3.9[136506]: ansible-service_facts Invoked Nov 23 04:15:39 localhost network[136523]: You are using 'network' service provided by 'network-scripts', 
which are now deprecated. Nov 23 04:15:39 localhost network[136524]: 'network-scripts' will be removed from distribution in near future. Nov 23 04:15:39 localhost network[136525]: It is advised to switch to 'NetworkManager' instead for network management. Nov 23 04:15:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:15:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33168 DF PROTO=TCP SPT=45594 DPT=9882 SEQ=3886340126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2C9270000000001030307) Nov 23 04:15:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33170 DF PROTO=TCP SPT=45594 DPT=9882 SEQ=3886340126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2D5200000000001030307) Nov 23 04:15:45 localhost sshd[136649]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:15:46 localhost python3.9[136848]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 23 04:15:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=58998 DF PROTO=TCP SPT=42950 DPT=9102 SEQ=1523901771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2E2200000000001030307) Nov 23 04:15:51 localhost python3.9[136942]: ansible-package_facts Invoked with manager=['auto'] strategy=first Nov 23 04:15:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43471 DF PROTO=TCP SPT=51720 DPT=9102 SEQ=2839129972 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C2F6200000000001030307) Nov 23 04:15:54 localhost python3.9[137034]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:15:55 localhost python3.9[137109]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889353.2755616-657-115628340651080/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:15:56 localhost python3.9[137203]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:15:56 localhost python3.9[137278]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889355.877056-703-12160223696076/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:15:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33172 DF PROTO=TCP SPT=45594 DPT=9882 SEQ=3886340126 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C306210000000001030307) Nov 23 04:15:58 localhost python3.9[137372]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:15:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35671 DF PROTO=TCP SPT=44574 DPT=9101 SEQ=3990137902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C30F2A0000000001030307) Nov 23 04:15:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42861 DF PROTO=TCP SPT=58772 DPT=9105 SEQ=1394186128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C30FDE0000000001030307) Nov 23 04:16:00 localhost python3.9[137466]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 04:16:02 localhost python3.9[137520]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False 
force=None masked=None Nov 23 04:16:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42863 DF PROTO=TCP SPT=58772 DPT=9105 SEQ=1394186128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C31BE00000000001030307) Nov 23 04:16:04 localhost python3.9[137614]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 04:16:06 localhost python3.9[137668]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:16:06 localhost systemd[1]: Stopping NTP client/server... Nov 23 04:16:06 localhost chronyd[25967]: chronyd exiting Nov 23 04:16:06 localhost systemd[1]: chronyd.service: Deactivated successfully. Nov 23 04:16:06 localhost systemd[1]: Stopped NTP client/server. Nov 23 04:16:06 localhost systemd[1]: Starting NTP client/server... Nov 23 04:16:06 localhost chronyd[137676]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG) Nov 23 04:16:06 localhost chronyd[137676]: Frequency -30.625 +/- 0.724 ppm read from /var/lib/chrony/drift Nov 23 04:16:06 localhost chronyd[137676]: Loaded seccomp filter (level 2) Nov 23 04:16:06 localhost systemd[1]: Started NTP client/server. Nov 23 04:16:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37582 DF PROTO=TCP SPT=59010 DPT=9100 SEQ=3929823100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3283D0000000001030307) Nov 23 04:16:08 localhost systemd[1]: session-42.scope: Deactivated successfully. 
Nov 23 04:16:08 localhost systemd[1]: session-42.scope: Consumed 27.628s CPU time. Nov 23 04:16:08 localhost systemd-logind[761]: Session 42 logged out. Waiting for processes to exit. Nov 23 04:16:08 localhost systemd-logind[761]: Removed session 42. Nov 23 04:16:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40600 DF PROTO=TCP SPT=41034 DPT=9100 SEQ=3304842840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C334200000000001030307) Nov 23 04:16:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26166 DF PROTO=TCP SPT=35120 DPT=9882 SEQ=633500551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C33E590000000001030307) Nov 23 04:16:13 localhost sshd[137692]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:16:14 localhost systemd-logind[761]: New session 43 of user zuul. Nov 23 04:16:14 localhost systemd[1]: Started Session 43 of User zuul. 
Nov 23 04:16:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26168 DF PROTO=TCP SPT=35120 DPT=9882 SEQ=633500551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C34A610000000001030307) Nov 23 04:16:15 localhost python3.9[137785]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:16:16 localhost python3.9[137881]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:17 localhost python3.9[137986]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43473 DF PROTO=TCP SPT=51720 DPT=9102 SEQ=2839129972 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C356200000000001030307) Nov 23 04:16:18 localhost python3.9[138034]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.0gwpw0so recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None 
seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:20 localhost python3.9[138203]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:20 localhost python3.9[138278]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889379.5927913-144-101025634084363/.source _original_basename=.xsw4d97h follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:21 localhost python3.9[138370]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:16:22 localhost python3.9[138462]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43698 DF PROTO=TCP SPT=43266 DPT=9102 SEQ=3297613898 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C36B600000000001030307) Nov 23 04:16:23 localhost python3.9[138535]: ansible-ansible.legacy.copy Invoked with 
dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889382.4214268-216-117155126347584/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 23 04:16:24 localhost python3.9[138627]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:24 localhost python3.9[138700]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889383.8112419-216-45691580572790/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 23 04:16:25 localhost python3.9[138792]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:26 localhost python3.9[138884]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False 
checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:26 localhost python3.9[138957]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889385.57294-327-182839567029953/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26170 DF PROTO=TCP SPT=35120 DPT=9882 SEQ=633500551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C37A210000000001030307) Nov 23 04:16:27 localhost python3.9[139049]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:27 localhost python3.9[139122]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889386.8292208-372-183767049529610/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:29 localhost python3.9[139214]: ansible-ansible.builtin.systemd Invoked with 
daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:16:29 localhost systemd[1]: Reloading. Nov 23 04:16:29 localhost systemd-sysv-generator[139244]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:16:29 localhost systemd-rc-local-generator[139236]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:16:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:16:29 localhost systemd[1]: Reloading. Nov 23 04:16:29 localhost systemd-sysv-generator[139279]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:16:29 localhost systemd-rc-local-generator[139274]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:16:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:16:29 localhost systemd[1]: Starting EDPM Container Shutdown... Nov 23 04:16:29 localhost systemd[1]: Finished EDPM Container Shutdown. 
Nov 23 04:16:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19957 DF PROTO=TCP SPT=49976 DPT=9101 SEQ=942024125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3845A0000000001030307) Nov 23 04:16:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32280 DF PROTO=TCP SPT=52160 DPT=9105 SEQ=3065586781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3850B0000000001030307) Nov 23 04:16:31 localhost python3.9[139385]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:32 localhost python3.9[139458]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889390.9307582-441-198425369348546/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19959 DF PROTO=TCP SPT=49976 DPT=9101 SEQ=942024125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C390610000000001030307) Nov 23 04:16:33 localhost python3.9[139550]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset 
follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:34 localhost python3.9[139623]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889393.0224657-486-73911689883859/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:34 localhost python3.9[139715]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:16:34 localhost systemd[1]: Reloading. Nov 23 04:16:34 localhost systemd-sysv-generator[139742]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:16:34 localhost systemd-rc-local-generator[139739]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:16:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:16:35 localhost systemd[1]: Starting Create netns directory... Nov 23 04:16:35 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 23 04:16:35 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 23 04:16:35 localhost systemd[1]: Finished Create netns directory. 
Nov 23 04:16:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65319 DF PROTO=TCP SPT=36684 DPT=9100 SEQ=1024540960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C39E210000000001030307) Nov 23 04:16:36 localhost python3.9[139847]: ansible-ansible.builtin.service_facts Invoked Nov 23 04:16:36 localhost network[139864]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 23 04:16:36 localhost network[139865]: 'network-scripts' will be removed from distribution in near future. Nov 23 04:16:36 localhost network[139866]: It is advised to switch to 'NetworkManager' instead for network management. Nov 23 04:16:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:16:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56760 DF PROTO=TCP SPT=46368 DPT=9100 SEQ=2594710531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3A9610000000001030307) Nov 23 04:16:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=523 DF PROTO=TCP SPT=43022 DPT=9882 SEQ=1100943462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3B3860000000001030307) Nov 23 04:16:41 localhost python3.9[140067]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:42 localhost python3.9[140142]: ansible-ansible.legacy.copy Invoked with 
dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889401.3739276-609-48713027567405/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:44 localhost python3.9[140235]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:16:44 localhost systemd[1]: Reloading OpenSSH server daemon... Nov 23 04:16:44 localhost systemd[1]: Reloaded OpenSSH server daemon. Nov 23 04:16:44 localhost sshd[120095]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:16:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=525 DF PROTO=TCP SPT=43022 DPT=9882 SEQ=1100943462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3BFA00000000001030307) Nov 23 04:16:44 localhost python3.9[140331]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:45 localhost python3.9[140423]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:46 
localhost python3.9[140496]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889405.1718009-702-73984414733202/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:47 localhost python3.9[140588]: ansible-community.general.timezone Invoked with name=UTC hwclock=None Nov 23 04:16:47 localhost systemd[1]: Starting Time & Date Service... Nov 23 04:16:47 localhost systemd[1]: Started Time & Date Service. Nov 23 04:16:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43700 DF PROTO=TCP SPT=43266 DPT=9102 SEQ=3297613898 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3CC200000000001030307) Nov 23 04:16:48 localhost python3.9[140684]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:50 localhost python3.9[140776]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:50 localhost python3.9[140849]: ansible-ansible.legacy.copy Invoked with 
dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889409.586898-807-91240555353654/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:51 localhost python3.9[140941]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:51 localhost python3.9[141014]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889410.9179804-852-35600893056117/.source.yaml _original_basename=.z1a70904 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:52 localhost python3.9[141106]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:53 localhost python3.9[141181]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889412.091772-898-81193597335166/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19009 DF PROTO=TCP SPT=42260 DPT=9102 SEQ=3700835675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3E0A00000000001030307) Nov 23 04:16:53 localhost python3.9[141273]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:16:54 localhost python3.9[141366]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:16:54 localhost sshd[141395]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:16:55 localhost python3[141460]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Nov 23 04:16:56 localhost python3.9[141553]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:56 localhost python3.9[141626]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889415.4855816-1014-248390595810784/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None 
directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:57 localhost python3.9[141718]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=527 DF PROTO=TCP SPT=43022 DPT=9882 SEQ=1100943462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3F0200000000001030307) Nov 23 04:16:57 localhost python3.9[141791]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889416.7638268-1059-194305208002513/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:16:58 localhost python3.9[141883]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:58 localhost python3.9[141956]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889417.917644-1104-248489580866708/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None 
serole=None selevel=None setype=None attributes=None Nov 23 04:16:59 localhost python3.9[142048]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:16:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1594 DF PROTO=TCP SPT=58290 DPT=9101 SEQ=2574048636 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3F98A0000000001030307) Nov 23 04:16:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42867 DF PROTO=TCP SPT=47352 DPT=9105 SEQ=375247215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C3FA3B0000000001030307) Nov 23 04:17:01 localhost python3.9[142121]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889419.1506188-1149-139431629150104/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:17:01 localhost python3.9[142213]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:17:02 localhost python3.9[142286]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1763889421.4095821-1194-33983946343571/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:17:03 localhost python3.9[142378]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:17:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56762 DF PROTO=TCP SPT=46368 DPT=9100 SEQ=2594710531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C40A200000000001030307) Nov 23 04:17:04 localhost python3.9[142470]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:17:05 localhost python3.9[142565]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present 
marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:17:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40603 DF PROTO=TCP SPT=41034 DPT=9100 SEQ=3304842840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C412210000000001030307) Nov 23 04:17:06 localhost python3.9[142658]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:17:06 localhost python3.9[142750]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:17:07 localhost python3.9[142842]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None Nov 23 04:17:08 localhost python3.9[142935]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None Nov 23 
04:17:09 localhost systemd[1]: session-43.scope: Deactivated successfully. Nov 23 04:17:09 localhost systemd[1]: session-43.scope: Consumed 27.784s CPU time. Nov 23 04:17:09 localhost systemd-logind[761]: Session 43 logged out. Waiting for processes to exit. Nov 23 04:17:09 localhost systemd-logind[761]: Removed session 43. Nov 23 04:17:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29806 DF PROTO=TCP SPT=50736 DPT=9882 SEQ=2661700928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C428B90000000001030307) Nov 23 04:17:14 localhost sshd[142952]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:17:15 localhost systemd-logind[761]: New session 44 of user zuul. Nov 23 04:17:15 localhost systemd[1]: Started Session 44 of User zuul. Nov 23 04:17:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26172 DF PROTO=TCP SPT=35120 DPT=9882 SEQ=633500551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C438200000000001030307) Nov 23 04:17:16 localhost python3.9[143047]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. 
suffix= path=None Nov 23 04:17:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41390 DF PROTO=TCP SPT=48326 DPT=9102 SEQ=2754492045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C439FB0000000001030307) Nov 23 04:17:17 localhost python3.9[143139]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:17:17 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. Nov 23 04:17:19 localhost python3.9[143236]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts Nov 23 04:17:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43701 DF PROTO=TCP SPT=43266 DPT=9102 SEQ=3297613898 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C44A210000000001030307) Nov 23 04:17:20 localhost systemd[1]: tmp-crun.GkxPKC.mount: Deactivated successfully. 
Nov 23 04:17:20 localhost podman[143352]: 2025-11-23 09:17:20.523915535 +0000 UTC m=+0.089593207 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:17:20 localhost podman[143352]: 2025-11-23 09:17:20.641509048 +0000 UTC m=+0.207186760 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, version=7, RELEASE=main, description=Red Hat Ceph Storage 7, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, release=553, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12) Nov 23 04:17:21 localhost python3.9[143507]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.6hjawuys follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:17:21 localhost python3.9[143631]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.6hjawuys mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889440.649029-190-211340832161534/.source.6hjawuys _original_basename=.qnu7sme_ follow=False checksum=86d7095ff15f9038e30789829322247c323137f0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:17:24 localhost python3.9[143738]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:17:26 localhost python3.9[143830]: ansible-ansible.builtin.blockinfile Invoked with block=np0005532581.localdomain,192.168.122.103,np0005532581* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDRibSMIP5+E9lJWuaKDEuCaJoGhGPTqff+o8SP2Twk+NhPOa5FC7WQhHPLXVhKAtlCX60ckYE53Q/H/RVRZ55JdWQLSdY/1tQCD6c0Ry6N+UD+mxo9iN9cHk6vd6J5kJu+v/gBEmFY1A9pjzsD1CTR8gZJHZFqbUTzXrKkoUjK3Kqa8UtvzyhgYQtYIaUwaf1z7CMNQ3A4EaGVKyRsVwb11jlaT9fjB43E3tp9p5EG6PPJEGux/Xea6iHnhSwZHpkD/ylneDOkBbGvYKhL33bpXMcbuHy32jAFr+2Q07sKvgy/b5/f/nTgNCyxEIpoXUbEhX+Vlh+gycU7KJw6FRyR3dQFjooV97NQ/oov2VP9DnTObziZA8lhaJ20ChTfDVUyvFCFi3dKgBUPCeNWCGI69eNHu3dQcwCNJ3kANqhHdkYpBd00PVBritJfxfzH1DCLo0I9CSi1buWYhein9VHZWtzePv/+ucWERRIo+J04QPkV+6P6vgOTRl5U75RctJU=#012np0005532581.localdomain,192.168.122.103,np0005532581* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIG7auGqCubvIeT+Z8+DFgAyuqWDpDfRlZtndf8hFQOt7#012np0005532581.localdomain,192.168.122.103,np0005532581* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKGIbd1xFE29cgvdOZ+Uh6ipkdk4QfLnBLiJP+rzeHVtOUTgjR98CvJhrHQdGAxaTty6xRV53oj5EhBdMCJFc5I=#012np0005532584.localdomain,192.168.122.106,np0005532584* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3OrbPXlomvlluk5pGQwXwJu+cR1IMLHg5EnGcI5epB1SB6q/EzlEo5+bOYmmvILsoesUzBIBq21mRhn1Wi2yjlys0pArFDqiLkUBvTW9ro6MKci9Smc12m7AkLus6UO6h3pzqcOdRZQ3KOQDL/83yYJVBCJyqlISXWzzHJpGRVnZHeT4CgKZ1nG5UEvOrtPXRAVWkz3v5TghJrYXvWaPQPmWcEy1rfhCjkCfQY++JB/Dlgammmd1+ZldadeXQi1b2X02a6GFyW0pUMFLjAP7Wr+KcRa5FIPmGwsPuc1NhveAH6zyLrabrh7jPR5O0tBjz9KcNYXbQmJetGt9ZWzFsl0qzXrvI38q5RlGptbqg0iSez61VBAUtnfs33hnYc3dvzJKXReR76PoU3yu/tLrhdK6szqIVsMdw2LGEro7l3KKMKXHSpi8n77fH8ICiU3F5Oif+nvS/e7xr4LccSEnFEHA9PdNxOWxJYLcxTQCt3BkNFrWw4oB1LiDsn98HlS8=#012np0005532584.localdomain,192.168.122.106,np0005532584* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIACxkoVt3BLqmT5JuJibOj2srWJ99rHYxhxT/gCbLdIM#012np0005532584.localdomain,192.168.122.106,np0005532584* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJi5N6oeJPjl3EunvvHi6baJIH9ibE30q8MR/UiZkuoStWh4NAj+cNFWO47723JbHkDzCF1p+3RJ1FLROkiZ4W0=#012np0005532583.localdomain,192.168.122.105,np0005532583* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCkB1Cq8AQaEBYTlv5Hzs024jg//D6wieNnvsI5WcYj7wckm9vKTJQfUD6yZBMmyPw6+vVzsM16bj2hagkDR5wkO7uSIaMqWrcoQ1h9HkJQLK8QB0iuzUvQzdr22kUgkLII8thNHK4VxF4VhAKNmzqCofZ4ZSaLUMwauFCFUjx1VJISEZdgYRZ4+++wAN5bdK+WrwSOAHJYJWQX2pRRsPiunSdY1BOUKB3sp7IBcQ3MDJgnKlkR7tiGSYB2W8JsLvIsIb0I2EaqmPUTIzKUuxSJnWEls/WyDT9MNkjhobVeAyFZ5TEik4OvobUhVGJ8CsU7O101KQNQ3IywPM+V0UpjA1yK49z5Qs0LjApmqORsTcjOojYaKGr9n64dVjXdFOMwajB9UmMEFtlIngm6kx7mJQGXqYxVAscW34JY832iKOEzQWrUSdo6mVJ7TXhYYcbdFp+G/128SfhNrbHwKinHeE9Nqu48BR7bmRZXO7ef+UMY1dG3AIvFt4JwFvLihZc=#012np0005532583.localdomain,192.168.122.105,np0005532583* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIH4H0HJaVZZzbQbH92x/ePbqiic7VLTV0Kle7XvCiMNK#012np0005532583.localdomain,192.168.122.105,np0005532583* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEb0/1S3v0DC07ZQnLEp9URjtv9BKwGlPRsb47Ua8w+WgbOM0JmtKaPebzMcBow+04/+k7+HcCDBj6p5Yd4q3M4=#012np0005532585.localdomain,192.168.122.107,np0005532585* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCU6ocW8HWtJJyWPSFUqcN5z70XYnNrE5KeWh/VJ4bDkpVePpxxcdD8r8cKL121q0MKPRgia3jLqnKz+o4MH3AqTAWCZamBc1+ePq9OvZDenK69byea8TM176uYzfePjNlud4LSZ6lfkgneO5jeNE6/RcHgBc8Me+2mlzpavioA814r6Ci6hFaEIOS1Zd2b/yKzI4QRl6xg/aJKvlIe9w3G3BvKOG5pixPx2ng4wYc0OMtJb9ItJgZLY92GGuvVRwn9e0D4lab84+x/Nn3XatQdqU69ev7da/bQCUeBivyEZo03olh56YxCKvNfG3ZYwwhMTn9Hg/EdnwrGHYHj0ZgfSR1+Dzvnk0WW/MRs0276Ojj5O0hhnlaAh5n97W6fgHldGKvdEafYeD602C1Zkd+ISqF13W56MWhtUhiUsdUHShnpM/EBOITg6mTDFP1i/qMS0PjRaCzBpdqpJIoKzQpsi4Z3QTHTZ7uK/lqOEaE/wqXHuYlMKcTuOuX33gIp28k=#012np0005532585.localdomain,192.168.122.107,np0005532585* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILpJc3/w4q1RFXE8+NzyjCJ0R7ySeHFy75KPVpy/YiB/#012np0005532585.localdomain,192.168.122.107,np0005532585* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLtz4IM2aQZoQ7CuTS4jfYDH5LZPyutyvm+ZyFuW7jdHvK3umSrNYFwsqiHwWHvM9peuWot0GAUC8rCc1UO+ZWk=#012np0005532586.localdomain,192.168.122.108,np0005532586* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQD6U4JggC29IKqxQ7GjhK23AehQb1S2zLryOxLwLEs9rP0qOZpJ9wR1VsBNLXDCmoRVTsH2+3V00hmkvlanKUuzgmLO61hdur+5NQD0xHnY7lOLpOoyR7hJiMuHj/nRgBLWY2OB8Gim121dgfuc2zRF92igDYe65Uf0et83vWlgRmc7KlziaJ91iVcBUmhGYf3Ij7QxfhQH5TTnGoQizdiBpuP+yVuU2AepbvQ8ZFvzioCwzWAVu/xfdRFp9QyLT4JP1jM6dadTjD5RUAjRL6qR1tLXVq/rvqtXSL8ruBSYm3NCOys9RtdrNolZ7frd+zmvF+VzMNLtlRxiuy1ReR+ZO3felB+4TwfEfLZ+DqE1s3+ksCQH/sVCrxzFsRz5lamWG3p78ZBWTiQ/7WdJS1dQOHz+pKNSSW/NYMIqitxsCsEWPJLq/EWoHVxvjREucCb5YvWHPKOv5RLlbm5lSHFLuFVV8O3AAzD/3JsjTbKGOjJhmtxPCgEy7RPqtIUX90s=#012np0005532586.localdomain,192.168.122.108,np0005532586* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKzaUMbW2RXGluOr1nHypPwK+dIm5zaIFHsNA8PEtRqK#012np0005532586.localdomain,192.168.122.108,np0005532586* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLLaE/jo8XH2dLl/mTc9NRhBP3x+ig/gy7tepiJNCqlj0Dgb5vfu6IYaFNrkyisiqhenCsUZQo/guhdX9Nisv9I=#012np0005532582.localdomain,192.168.122.104,np0005532582* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0v47OVdr7YS/5xSUmMc7u26O7OwPomkdDR6s8rrcencbx7seRSeU00QGeRQcJJ023bD3xk26W8iiJTRUDkYSy//cSfHODdDy+CNEfDUTkGzIjiApoLi2b+S4J6wcAldMsj02MZmx67vUHyM5Qwok+22XqopryL8BiGPJbnoUcZy773f5OKPPMNuj3Fyb7jd5mrC7awK4NniZHyHPYBQeBa234HL42fRjcOqCcxuauy5cbz9PeBv5/kg+nYc8cY5qCyLqNhzMVRUa/PcepMBcfThk17LtPGzCYS7IR2cGdUDP6Pe0QD34Hu6+mpwKwYx73v5uHcmy9CeZ8fK83/F84Lr6jxsiwoU2e+hUfzVRq8gnkjk6kuL86eSM2POSGgBYYgCb+Ma6lOkF1MA+rLAh0gAsUhBgVlz6HtaMoDvLOi/NrQeoQyNE1Pv4vPAndmGGc8A7JCtmCMk9VvMy0Ht4IOvtDJFfx1lg7NuMIKqePYTEk56p8wTUNM+BmdJEhFPU=#012np0005532582.localdomain,192.168.122.104,np0005532582* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEaLDeiqlvIGmYCK/pVle4dWQoWUl9JopG1HgV4OQwpm#012np0005532582.localdomain,192.168.122.104,np0005532582* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPG4t0LXPuGTxEFWkant9P4DDIM9mUsBdh3iJHN1QOZUHW9RJuWVAPGkYlb6jz2BktGBRNU2FJD+HyIE3L+OanQ=#012 create=True mode=0644 path=/tmp/ansible.6hjawuys state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False 
marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:17:27 localhost python3.9[143922]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.6hjawuys' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:17:28 localhost python3.9[144016]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.6hjawuys state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:17:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52718 DF PROTO=TCP SPT=55630 DPT=9101 SEQ=2501216965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C46EBA0000000001030307) Nov 23 04:17:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21550 DF PROTO=TCP SPT=45448 DPT=9105 SEQ=1821250571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C46F6B0000000001030307) Nov 23 04:17:30 localhost systemd[1]: session-44.scope: Deactivated successfully. Nov 23 04:17:30 localhost systemd[1]: session-44.scope: Consumed 4.104s CPU time. Nov 23 04:17:30 localhost systemd-logind[761]: Session 44 logged out. Waiting for processes to exit. 
Nov 23 04:17:30 localhost systemd-logind[761]: Removed session 44. Nov 23 04:17:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28501 DF PROTO=TCP SPT=56188 DPT=9100 SEQ=2021828236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C477E10000000001030307) Nov 23 04:17:36 localhost sshd[144031]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:17:36 localhost systemd-logind[761]: New session 45 of user zuul. Nov 23 04:17:36 localhost systemd[1]: Started Session 45 of User zuul. Nov 23 04:17:37 localhost python3.9[144124]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:17:38 localhost python3.9[144220]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Nov 23 04:17:40 localhost python3.9[144314]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:17:41 localhost python3.9[144407]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:17:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42259 DF PROTO=TCP SPT=39490 DPT=9882 SEQ=3692313124 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C49DE70000000001030307) Nov 23 04:17:42 localhost python3.9[144500]: ansible-ansible.builtin.stat Invoked with 
path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:17:42 localhost python3.9[144594]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:17:43 localhost python3.9[144689]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:17:44 localhost systemd[1]: session-45.scope: Deactivated successfully. Nov 23 04:17:44 localhost systemd[1]: session-45.scope: Consumed 3.898s CPU time. Nov 23 04:17:44 localhost systemd-logind[761]: Session 45 logged out. Waiting for processes to exit. Nov 23 04:17:44 localhost systemd-logind[761]: Removed session 45. 
Nov 23 04:17:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1404 DF PROTO=TCP SPT=60074 DPT=9102 SEQ=1544589196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4AF2B0000000001030307) Nov 23 04:17:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1405 DF PROTO=TCP SPT=60074 DPT=9102 SEQ=1544589196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4B3200000000001030307) Nov 23 04:17:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1406 DF PROTO=TCP SPT=60074 DPT=9102 SEQ=1544589196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4BB200000000001030307) Nov 23 04:17:50 localhost sshd[144704]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:17:50 localhost systemd-logind[761]: New session 46 of user zuul. Nov 23 04:17:50 localhost systemd[1]: Started Session 46 of User zuul. 
Nov 23 04:17:51 localhost python3.9[144797]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:17:52 localhost python3.9[144893]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 04:17:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1407 DF PROTO=TCP SPT=60074 DPT=9102 SEQ=1544589196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4CAE00000000001030307) Nov 23 04:17:53 localhost python3.9[144947]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 23 04:17:58 localhost python3.9[145039]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:17:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33465 DF PROTO=TCP SPT=49740 DPT=9101 SEQ=3566375952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4E3EA0000000001030307) Nov 23 04:17:59 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39735 DF PROTO=TCP SPT=34974 DPT=9105 SEQ=3316219862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4E49B0000000001030307) Nov 23 04:18:00 localhost python3.9[145132]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:18:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33466 DF PROTO=TCP SPT=49740 DPT=9101 SEQ=3566375952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4E7E00000000001030307) Nov 23 04:18:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39736 DF PROTO=TCP SPT=34974 DPT=9105 SEQ=3316219862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4E8A10000000001030307) Nov 23 04:18:01 localhost python3.9[145224]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:18:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1408 DF PROTO=TCP SPT=60074 DPT=9102 SEQ=1544589196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4EC200000000001030307) Nov 23 04:18:02 localhost python3.9[145316]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated#012Core libraries or services have been updated since boot-up:#012 * systemd#012#012Reboot is required to fully utilize these updates.#012More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:18:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26262 DF PROTO=TCP SPT=45630 DPT=9100 SEQ=992795737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4ED100000000001030307) Nov 23 04:18:02 localhost python3.9[145406]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 23 04:18:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39737 DF PROTO=TCP SPT=34974 DPT=9105 SEQ=3316219862 ACK=0 WINDOW=32640 RES=0x00 
SYN URGP=0 OPT (020405500402080AD2C4F0A00000000001030307)
Nov 23 04:18:03 localhost python3.9[145496]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 04:18:04 localhost python3.9[145588]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 04:18:04 localhost systemd[1]: session-46.scope: Deactivated successfully.
Nov 23 04:18:04 localhost systemd[1]: session-46.scope: Consumed 8.857s CPU time.
Nov 23 04:18:04 localhost systemd-logind[761]: Session 46 logged out. Waiting for processes to exit.
Nov 23 04:18:04 localhost systemd-logind[761]: Removed session 46.
Nov 23 04:18:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33468 DF PROTO=TCP SPT=49740 DPT=9101 SEQ=3566375952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C4FFA00000000001030307)
Nov 23 04:18:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26265 DF PROTO=TCP SPT=45630 DPT=9100 SEQ=992795737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C508E00000000001030307)
Nov 23 04:18:10 localhost sshd[145603]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 04:18:10 localhost systemd-logind[761]: New session 47 of user zuul.
Nov 23 04:18:10 localhost systemd[1]: Started Session 47 of User zuul.
Nov 23 04:18:12 localhost python3.9[145696]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 04:18:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23051 DF PROTO=TCP SPT=37928 DPT=9882 SEQ=2886131127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C517200000000001030307)
Nov 23 04:18:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23052 DF PROTO=TCP SPT=37928 DPT=9882 SEQ=2886131127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C51F200000000001030307)
Nov 23 04:18:14 localhost python3.9[145792]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:18:15 localhost chronyd[137676]: Selected source 208.81.1.244 (pool.ntp.org)
Nov 23 04:18:15 localhost python3.9[145884]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:18:16 localhost python3.9[145957]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889495.1274242-187-190268196656080/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:18:17 localhost python3.9[146049]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:18:17 localhost python3.9[146141]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:18:18 localhost python3.9[146214]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889497.1730866-257-41015386399268/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:18:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1409 DF PROTO=TCP SPT=60074 DPT=9102 SEQ=1544589196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C52C210000000001030307)
Nov 23 04:18:18 localhost python3.9[146306]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:18:19 localhost python3.9[146398]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:18:19 localhost python3.9[146471]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889498.9804406-328-200131937663724/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:18:20 localhost python3.9[146563]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:18:21 localhost python3.9[146655]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:18:21 localhost python3.9[146728]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889500.7220058-399-176413160202099/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:18:22 localhost python3.9[146820]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:18:22 localhost python3.9[146942]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:18:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34860 DF PROTO=TCP SPT=43608 DPT=9102 SEQ=2511018635 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C540210000000001030307)
Nov 23 04:18:23 localhost python3.9[147048]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889502.5241666-468-24465139394778/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:18:24 localhost python3.9[147155]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:18:24 localhost python3.9[147247]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:18:25 localhost python3.9[147320]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889504.355742-535-270140173196383/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:18:26 localhost python3.9[147413]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:18:26 localhost python3.9[147505]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:18:27 localhost python3.9[147578]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889506.2064142-603-181960204528158/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:18:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23054 DF PROTO=TCP SPT=37928 DPT=9882 SEQ=2886131127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C550200000000001030307)
Nov 23 04:18:27 localhost python3.9[147670]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:18:28 localhost python3.9[147762]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:18:29 localhost python3.9[147835]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889508.0596857-675-236799784048460/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=85f08144eae371e6c6843864c2d75f3a0dbb50ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:18:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11262 DF PROTO=TCP SPT=52842 DPT=9101 SEQ=1941546333 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5591B0000000001030307)
Nov 23 04:18:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1189 DF PROTO=TCP SPT=49214 DPT=9105 SEQ=3630460857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C559CB0000000001030307)
Nov 23 04:18:30 localhost systemd[1]: session-47.scope: Deactivated successfully.
Nov 23 04:18:30 localhost systemd[1]: session-47.scope: Consumed 11.303s CPU time.
Nov 23 04:18:30 localhost systemd-logind[761]: Session 47 logged out. Waiting for processes to exit.
Nov 23 04:18:30 localhost systemd-logind[761]: Removed session 47.
Nov 23 04:18:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11264 DF PROTO=TCP SPT=52842 DPT=9101 SEQ=1941546333 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C565210000000001030307)
Nov 23 04:18:36 localhost sshd[147851]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 04:18:36 localhost systemd-logind[761]: New session 48 of user zuul.
Nov 23 04:18:36 localhost systemd[1]: Started Session 48 of User zuul.
Nov 23 04:18:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11265 DF PROTO=TCP SPT=52842 DPT=9101 SEQ=1941546333 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C574E00000000001030307)
Nov 23 04:18:36 localhost python3.9[147946]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:18:38 localhost python3.9[148038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:18:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33013 DF PROTO=TCP SPT=35954 DPT=9100 SEQ=1598340131 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C57E200000000001030307)
Nov 23 04:18:39 localhost python3.9[148111]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889518.2519784-63-29923199660148/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=5f137984986c8cf5df5aec7749430e0dc129d0db backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:18:40 localhost python3.9[148203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:18:41 localhost python3.9[148276]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889519.6122842-63-277090203241126/.source.conf _original_basename=ceph.conf follow=False checksum=d6d906a745260c838693e085b1f329bd1daad564 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:18:41 localhost systemd[1]: session-48.scope: Deactivated successfully.
Nov 23 04:18:41 localhost systemd[1]: session-48.scope: Consumed 2.204s CPU time.
Nov 23 04:18:41 localhost systemd-logind[761]: Session 48 logged out. Waiting for processes to exit.
Nov 23 04:18:41 localhost systemd-logind[761]: Removed session 48.
Nov 23 04:18:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19783 DF PROTO=TCP SPT=60312 DPT=9882 SEQ=1253814909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C588460000000001030307)
Nov 23 04:18:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19785 DF PROTO=TCP SPT=60312 DPT=9882 SEQ=1253814909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C594600000000001030307)
Nov 23 04:18:47 localhost sshd[148292]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 04:18:47 localhost systemd-logind[761]: New session 49 of user zuul.
Nov 23 04:18:47 localhost systemd[1]: Started Session 49 of User zuul.
Nov 23 04:18:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34862 DF PROTO=TCP SPT=43608 DPT=9102 SEQ=2511018635 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5A0210000000001030307)
Nov 23 04:18:48 localhost python3.9[148385]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 04:18:49 localhost python3.9[148481]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:18:49 localhost python3.9[148573]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:18:51 localhost python3.9[148663]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 04:18:52 localhost python3.9[148755]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 23 04:18:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28867 DF PROTO=TCP SPT=33842 DPT=9102 SEQ=2421140341 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5B5600000000001030307)
Nov 23 04:18:53 localhost python3.9[148847]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 04:18:54 localhost python3.9[148901]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 04:18:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19787 DF PROTO=TCP SPT=60312 DPT=9882 SEQ=1253814909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5C4200000000001030307)
Nov 23 04:18:57 localhost sshd[148904]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 04:18:59 localhost python3.9[148997]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 04:18:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57340 DF PROTO=TCP SPT=32782 DPT=9101 SEQ=2393264113 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5CE490000000001030307)
Nov 23 04:18:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5954 DF PROTO=TCP SPT=42362 DPT=9105 SEQ=3315083620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5CEFB0000000001030307)
Nov 23 04:19:01 localhost python3[149092]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012 rule:#012 proto: udp#012 dport: 4789#012- rule_name: 119 neutron geneve networks#012 rule:#012 proto: udp#012 dport: 6081#012 state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: OUTPUT#012 jump: NOTRACK#012 action: append#012 state: []#012- rule_name: 121 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: PREROUTING#012 jump: NOTRACK#012 action: append#012 state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 23 04:19:02 localhost python3.9[149184]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57342 DF PROTO=TCP SPT=32782 DPT=9101 SEQ=2393264113 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5DA600000000001030307)
Nov 23 04:19:02 localhost python3.9[149276]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:19:03 localhost python3.9[149324]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:04 localhost python3.9[149416]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:19:04 localhost python3.9[149464]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.27u5gurr recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:06 localhost python3.9[149556]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:19:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26268 DF PROTO=TCP SPT=45630 DPT=9100 SEQ=992795737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5E8200000000001030307)
Nov 23 04:19:06 localhost python3.9[149604]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:07 localhost python3.9[149696]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:19:09 localhost python3[149789]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 23 04:19:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31382 DF PROTO=TCP SPT=45264 DPT=9100 SEQ=3512850857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5F3200000000001030307)
Nov 23 04:19:09 localhost python3.9[149881]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:19:10 localhost python3.9[149956]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889549.3007653-432-252287506334687/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:11 localhost python3.9[150048]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:19:11 localhost python3.9[150123]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889550.679844-477-167595841453505/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61088 DF PROTO=TCP SPT=58554 DPT=9882 SEQ=73534146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C5FD760000000001030307)
Nov 23 04:19:12 localhost python3.9[150215]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:19:13 localhost python3.9[150290]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889552.3813303-522-128284585097782/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:14 localhost python3.9[150382]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:19:14 localhost python3.9[150457]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889553.5688262-567-256022277069330/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61090 DF PROTO=TCP SPT=58554 DPT=9882 SEQ=73534146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C609600000000001030307)
Nov 23 04:19:15 localhost python3.9[150549]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:19:16 localhost python3.9[150624]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889554.892879-612-237908566681366/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28869 DF PROTO=TCP SPT=33842 DPT=9102 SEQ=2421140341 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C616200000000001030307)
Nov 23 04:19:19 localhost python3.9[150716]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:20 localhost python3.9[150808]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:19:21 localhost python3.9[150903]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:22 localhost python3.9[150995]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:19:23 localhost python3.9[151088]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 04:19:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40792 DF PROTO=TCP SPT=52980 DPT=9102 SEQ=3629331533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C62AA00000000001030307)
Nov 23 04:19:23 localhost python3.9[151182]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:19:24 localhost python3.9[151307]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:25 localhost python3.9[151441]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 04:19:27 localhost python3.9[151536]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005532585.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:1d:b8:fa:41" external_ids:ovn-encap-ip=172.19.0.107 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:19:27 localhost ovs-vsctl[151537]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005532585.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:1d:b8:fa:41 external_ids:ovn-encap-ip=172.19.0.107 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 23 04:19:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61092 DF PROTO=TCP SPT=58554 DPT=9882 SEQ=73534146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C63A210000000001030307)
Nov 23 04:19:27 localhost python3.9[151629]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:19:28 localhost python3.9[151722]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 04:19:29 localhost python3.9[151816]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:19:29 localhost kernel: DROPPING: IN=br-ex
OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34980 DF PROTO=TCP SPT=35690 DPT=9101 SEQ=2587280388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6437A0000000001030307) Nov 23 04:19:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57869 DF PROTO=TCP SPT=40678 DPT=9105 SEQ=182249479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6442B0000000001030307) Nov 23 04:19:30 localhost python3.9[151908]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:19:30 localhost python3.9[151956]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:19:31 localhost python3.9[152048]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:19:31 localhost python3.9[152096]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file 
path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:19:32 localhost python3.9[152188]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:19:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34982 DF PROTO=TCP SPT=35690 DPT=9101 SEQ=2587280388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C64FA10000000001030307) Nov 23 04:19:33 localhost python3.9[152280]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:19:33 localhost python3.9[152328]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:19:34 localhost python3.9[152420]: ansible-ansible.legacy.stat Invoked with 
path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:19:35 localhost python3.9[152468]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:19:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33016 DF PROTO=TCP SPT=35954 DPT=9100 SEQ=1598340131 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C65C200000000001030307) Nov 23 04:19:36 localhost python3.9[152560]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:19:36 localhost systemd[1]: Reloading. Nov 23 04:19:36 localhost systemd-sysv-generator[152587]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:19:36 localhost systemd-rc-local-generator[152584]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:19:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:19:37 localhost python3.9[152690]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:19:37 localhost python3.9[152738]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:38 localhost python3.9[152830]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:19:38 localhost python3.9[152878]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53347 DF PROTO=TCP SPT=38346 DPT=9100 SEQ=2464325010 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C668600000000001030307)
Nov 23 04:19:39 localhost python3.9[152970]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 04:19:39 localhost systemd[1]: Reloading.
Nov 23 04:19:39 localhost systemd-rc-local-generator[152996]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 04:19:39 localhost systemd-sysv-generator[152999]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 04:19:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 04:19:40 localhost systemd[1]: Starting Create netns directory...
Nov 23 04:19:40 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 04:19:40 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 04:19:40 localhost systemd[1]: Finished Create netns directory.
Nov 23 04:19:41 localhost python3.9[153104]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:19:41 localhost python3.9[153196]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:19:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2261 DF PROTO=TCP SPT=44926 DPT=9882 SEQ=3902380751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C672A60000000001030307)
Nov 23 04:19:42 localhost python3.9[153269]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889581.2415519-1344-260646082026101/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:19:43 localhost python3.9[153361]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:19:43 localhost python3.9[153453]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:19:44 localhost python3.9[153528]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889583.5034494-1419-171512185753192/.source.json _original_basename=.4ot7ghoo follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2263 DF PROTO=TCP SPT=44926 DPT=9882 SEQ=3902380751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C67EA10000000001030307)
Nov 23 04:19:45 localhost python3.9[153620]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40794 DF PROTO=TCP SPT=52980 DPT=9102 SEQ=3629331533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C68A200000000001030307)
Nov 23 04:19:48 localhost python3.9[153877]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 23 04:19:49 localhost python3.9[153969]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 04:19:50 localhost python3.9[154061]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 23 04:19:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55080 DF PROTO=TCP SPT=52796 DPT=9102 SEQ=3989621481 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C69FA00000000001030307)
Nov 23 04:19:54 localhost python3[154182]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 04:19:55 localhost python3[154182]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c",#012 "Digest": "sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-21T06:40:43.504967825Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 345731014,#012 "VirtualSize": 345731014,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/0ff11ed3154c8bbd91096301c9cfc5b95bbe726d99c5650ba8d355053fb0bbad/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/d16160b7dcc2f7ec400dce38b825ab93d5279c0ca0a9a7ff351e435b4aeeea92/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/d16160b7dcc2f7ec400dce38b825ab93d5279c0ca0a9a7ff351e435b4aeeea92/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",#012 "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",#012 "sha256:2e0f9ca9a8387a3566096aacaecfe5797e3fc2585f07cb97a1706897fa1a86a3",#012 "sha256:db37b2d335b44e6a9cb2eb88713051bc469233d1e0a06670f1303bc9539b97a0"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-11-18T01:56:49.795434035Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:49.795512415Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251118\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:52.547242013Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-21T06:10:01.947310748Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947327778Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947358359Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947372589Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94738527Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94739397Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:02.324930938Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:36.349393468Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:39.924297673Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-li
Nov 23 04:19:55 localhost podman[154232]: 2025-11-23 09:19:55.346636879 +0000 UTC m=+0.092538965 container remove 99d48ec37530a8be4ce691a640ccf2bddc96b337743696367f12dd18b2029a23 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, version=17.1.12, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 23 04:19:55 localhost python3[154182]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller
Nov 23 04:19:55 localhost podman[154246]:
Nov 23 04:19:55 localhost podman[154246]: 2025-11-23 09:19:55.456348282 +0000 UTC m=+0.091462912 container create 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 04:19:55 localhost podman[154246]: 2025-11-23 09:19:55.413827684 +0000 UTC m=+0.048942354 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 23 04:19:55 localhost python3[154182]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 23 04:19:56 localhost python3.9[154374]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 04:19:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2265 DF PROTO=TCP SPT=44926 DPT=9882 SEQ=3902380751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6AE210000000001030307)
Nov 23 04:19:59 localhost python3.9[154468]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:19:59 localhost python3.9[154514]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 04:19:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29676 DF PROTO=TCP SPT=51840 DPT=9101 SEQ=568539744 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6B8AA0000000001030307)
Nov 23 04:19:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5710 DF PROTO=TCP SPT=59366 DPT=9105 SEQ=1415999924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6B95B0000000001030307)
Nov 23 04:20:00 localhost python3.9[154605]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763889599.659171-1683-122901520042008/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:20:01 localhost python3.9[154651]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 04:20:01 localhost systemd[1]: Reloading.
Nov 23 04:20:01 localhost systemd-sysv-generator[154679]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 04:20:01 localhost systemd-rc-local-generator[154674]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 04:20:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 04:20:02 localhost python3.9[154733]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 04:20:02 localhost systemd[1]: Reloading.
Nov 23 04:20:02 localhost systemd-rc-local-generator[154759]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 04:20:02 localhost systemd-sysv-generator[154765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 04:20:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 04:20:02 localhost systemd[1]: Starting ovn_controller container...
Nov 23 04:20:02 localhost systemd[1]: Started libcrun container.
Nov 23 04:20:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b7408e2b9210e95a255d08712fe6a0aa83c4e2a632b9f510c8ec27c22bd8bbc/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 23 04:20:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 04:20:02 localhost podman[154774]: 2025-11-23 09:20:02.636359143 +0000 UTC m=+0.131960868 container init 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 23 04:20:02 localhost systemd[1]: tmp-crun.wolQKz.mount: Deactivated successfully.
Nov 23 04:20:02 localhost ovn_controller[154788]: + sudo -E kolla_set_configs
Nov 23 04:20:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 04:20:02 localhost podman[154774]: 2025-11-23 09:20:02.669459806 +0000 UTC m=+0.165061491 container start 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 23 04:20:02 localhost edpm-start-podman-container[154774]: ovn_controller
Nov 23 04:20:02 localhost systemd[1]: Created slice User Slice of UID 0.
Nov 23 04:20:02 localhost systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 23 04:20:02 localhost systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 23 04:20:02 localhost systemd[1]: Starting User Manager for UID 0...
Nov 23 04:20:02 localhost edpm-start-podman-container[154773]: Creating additional drop-in dependency for "ovn_controller" (2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543)
Nov 23 04:20:02 localhost systemd[1]: Reloading.
Nov 23 04:20:02 localhost podman[154795]: 2025-11-23 09:20:02.827451552 +0000 UTC m=+0.152550712 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 04:20:02 localhost podman[154795]: 2025-11-23 09:20:02.843147468 +0000 UTC m=+0.168246668 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 04:20:02 localhost podman[154795]: unhealthy
Nov 23 04:20:02 localhost systemd[154817]: Queued start job for default target Main User Target.
Nov 23 04:20:02 localhost systemd[154817]: Created slice User Application Slice.
Nov 23 04:20:02 localhost systemd[154817]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 23 04:20:02 localhost systemd[154817]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 04:20:02 localhost systemd[154817]: Reached target Paths.
Nov 23 04:20:02 localhost systemd[154817]: Reached target Timers.
Nov 23 04:20:02 localhost systemd[154817]: Starting D-Bus User Message Bus Socket...
Nov 23 04:20:02 localhost systemd[154817]: Starting Create User's Volatile Files and Directories...
Nov 23 04:20:02 localhost systemd-rc-local-generator[154872]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 04:20:02 localhost systemd[154817]: Finished Create User's Volatile Files and Directories.
Nov 23 04:20:02 localhost systemd[154817]: Listening on D-Bus User Message Bus Socket.
Nov 23 04:20:02 localhost systemd[154817]: Reached target Sockets.
Nov 23 04:20:02 localhost systemd[154817]: Reached target Basic System.
Nov 23 04:20:02 localhost systemd[154817]: Reached target Main User Target.
Nov 23 04:20:02 localhost systemd[154817]: Startup finished in 132ms.
Nov 23 04:20:02 localhost systemd-sysv-generator[154875]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 04:20:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 04:20:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5712 DF PROTO=TCP SPT=59366 DPT=9105 SEQ=1415999924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6C5600000000001030307)
Nov 23 04:20:03 localhost systemd[1]: Started User Manager for UID 0.
Nov 23 04:20:03 localhost systemd[1]: Started ovn_controller container.
Nov 23 04:20:03 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 04:20:03 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Failed with result 'exit-code'.
Nov 23 04:20:03 localhost systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 75.7 (252 of 333 items), suggesting rotation.
Nov 23 04:20:03 localhost systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 23 04:20:03 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 04:20:03 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 04:20:03 localhost systemd[1]: Started Session c12 of User root.
Nov 23 04:20:03 localhost ovn_controller[154788]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 04:20:03 localhost ovn_controller[154788]: INFO:__main__:Validating config file
Nov 23 04:20:03 localhost ovn_controller[154788]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 04:20:03 localhost ovn_controller[154788]: INFO:__main__:Writing out command to execute
Nov 23 04:20:03 localhost systemd[1]: session-c12.scope: Deactivated successfully.
Nov 23 04:20:03 localhost ovn_controller[154788]: ++ cat /run_command
Nov 23 04:20:03 localhost ovn_controller[154788]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Nov 23 04:20:03 localhost ovn_controller[154788]: + ARGS=
Nov 23 04:20:03 localhost ovn_controller[154788]: + sudo kolla_copy_cacerts
Nov 23 04:20:03 localhost systemd[1]: Started Session c13 of User root.
Nov 23 04:20:03 localhost ovn_controller[154788]: + [[ ! -n '' ]]
Nov 23 04:20:03 localhost ovn_controller[154788]: + . kolla_extend_start
Nov 23 04:20:03 localhost ovn_controller[154788]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Nov 23 04:20:03 localhost ovn_controller[154788]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\'''
Nov 23 04:20:03 localhost ovn_controller[154788]: + umask 0022
Nov 23 04:20:03 localhost ovn_controller[154788]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock
Nov 23 04:20:03 localhost systemd[1]: session-c13.scope: Deactivated successfully.
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00013|main|INFO|OVS feature set changed, force recompute.
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00017|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00018|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00020|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00021|main|INFO|OVS feature set changed, force recompute.
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00022|ovn_bfd|INFO|Disabled BFD on interface ovn-c237ed-0
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-49b8a0-0
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-b7d5b3-0
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00025|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00026|binding|INFO|Claiming lport d3912d14-a3e0-4df9-b811-f3bd90f44559 for this chassis.
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00027|binding|INFO|d3912d14-a3e0-4df9-b811-f3bd90f44559: Claiming fa:16:3e:cf:aa:3b 192.168.0.77
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00028|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00029|binding|INFO|Removing lport d3912d14-a3e0-4df9-b811-f3bd90f44559 ovn-installed in OVS
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00030|ovn_bfd|INFO|Enabled BFD on interface ovn-c237ed-0
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-49b8a0-0
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-b7d5b3-0
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00033|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00034|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00035|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00036|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00037|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 04:20:03 localhost ovn_controller[154788]: 2025-11-23T09:20:03Z|00038|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 04:20:04 localhost ovn_controller[154788]: 2025-11-23T09:20:04Z|00039|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 04:20:04 localhost ovn_controller[154788]: 2025-11-23T09:20:04Z|00040|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 04:20:04 localhost python3.9[154990]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:20:05 localhost ovs-vsctl[154991]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 23 04:20:05 localhost ovn_controller[154788]: 2025-11-23T09:20:05Z|00041|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 04:20:05 localhost ovn_controller[154788]: 2025-11-23T09:20:05Z|00042|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 04:20:05 localhost python3.9[155083]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:20:05 localhost ovs-vsctl[155085]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 23 04:20:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31385 DF PROTO=TCP SPT=45264 DPT=9100 SEQ=3512850857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6D2210000000001030307)
Nov 23 04:20:06 localhost python3.9[155178]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 04:20:06 localhost ovs-vsctl[155179]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 23 04:20:07 localhost systemd-logind[761]: Session 49 logged out. Waiting for processes to exit.
Nov 23 04:20:07 localhost systemd[1]: session-49.scope: Deactivated successfully.
Nov 23 04:20:07 localhost systemd[1]: session-49.scope: Consumed 39.948s CPU time.
Nov 23 04:20:07 localhost systemd-logind[761]: Removed session 49.
Nov 23 04:20:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40269 DF PROTO=TCP SPT=32894 DPT=9100 SEQ=2119202913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6DDA00000000001030307)
Nov 23 04:20:11 localhost ovn_controller[154788]: 2025-11-23T09:20:11Z|00043|binding|INFO|Setting lport d3912d14-a3e0-4df9-b811-f3bd90f44559 ovn-installed in OVS
Nov 23 04:20:11 localhost ovn_controller[154788]: 2025-11-23T09:20:11Z|00044|binding|INFO|Setting lport d3912d14-a3e0-4df9-b811-f3bd90f44559 up in Southbound
Nov 23 04:20:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59514 DF PROTO=TCP SPT=32770 DPT=9882 SEQ=1739129171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6E7D70000000001030307)
Nov 23 04:20:13 localhost sshd[155195]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 04:20:13 localhost systemd-logind[761]: New session 51 of user zuul.
Nov 23 04:20:13 localhost systemd[1]: Started Session 51 of User zuul.
Nov 23 04:20:13 localhost systemd[1]: Stopping User Manager for UID 0...
Nov 23 04:20:13 localhost systemd[154817]: Activating special unit Exit the Session...
Nov 23 04:20:13 localhost systemd[154817]: Stopped target Main User Target.
Nov 23 04:20:13 localhost systemd[154817]: Stopped target Basic System.
Nov 23 04:20:13 localhost systemd[154817]: Stopped target Paths.
Nov 23 04:20:13 localhost systemd[154817]: Stopped target Sockets.
Nov 23 04:20:13 localhost systemd[154817]: Stopped target Timers.
Nov 23 04:20:13 localhost systemd[154817]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 23 04:20:13 localhost systemd[154817]: Closed D-Bus User Message Bus Socket.
Nov 23 04:20:13 localhost systemd[154817]: Stopped Create User's Volatile Files and Directories.
Nov 23 04:20:13 localhost systemd[154817]: Removed slice User Application Slice.
Nov 23 04:20:13 localhost systemd[154817]: Reached target Shutdown.
Nov 23 04:20:13 localhost systemd[154817]: Finished Exit the Session.
Nov 23 04:20:13 localhost systemd[154817]: Reached target Exit the Session.
Nov 23 04:20:13 localhost systemd[1]: user@0.service: Deactivated successfully.
Nov 23 04:20:13 localhost systemd[1]: Stopped User Manager for UID 0.
Nov 23 04:20:13 localhost systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 23 04:20:13 localhost systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 23 04:20:13 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 23 04:20:13 localhost systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 23 04:20:13 localhost systemd[1]: Removed slice User Slice of UID 0.
Nov 23 04:20:14 localhost python3.9[155289]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 04:20:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59516 DF PROTO=TCP SPT=32770 DPT=9882 SEQ=1739129171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C6F3E00000000001030307)
Nov 23 04:20:15 localhost python3.9[155385]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:20:16 localhost python3.9[155477]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:20:16 localhost python3.9[155569]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:20:17 localhost python3.9[155661]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:20:17 localhost python3.9[155753]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:20:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55082 DF PROTO=TCP SPT=52796 DPT=9102 SEQ=3989621481 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C700210000000001030307)
Nov 23 04:20:18 localhost python3.9[155843]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 04:20:19 localhost python3.9[155935]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 23 04:20:20 localhost python3.9[156025]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:20:21 localhost python3.9[156098]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889619.9685419-219-65904435578925/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:20:21 localhost python3.9[156189]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:20:22 localhost python3.9[156262]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889621.4241328-264-30766783328851/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:20:23 localhost python3.9[156354]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 04:20:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57049 DF PROTO=TCP SPT=41364 DPT=9102 SEQ=4036052507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C714E00000000001030307)
Nov 23 04:20:24 localhost python3.9[156408]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 04:20:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59518 DF PROTO=TCP SPT=32770 DPT=9882 SEQ=1739129171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C724200000000001030307)
Nov 23 04:20:29 localhost python3.9[156578]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 04:20:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27028 DF PROTO=TCP SPT=51376 DPT=9101 SEQ=1016082811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C72DDA0000000001030307)
Nov 23 04:20:29 localhost python3.9[156671]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:20:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27969 DF PROTO=TCP SPT=33576 DPT=9105 SEQ=1273410697 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C72E8B0000000001030307)
Nov 23 04:20:30 localhost python3.9[156742]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889629.414031-375-44334779818017/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:20:30 localhost python3.9[156832]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:20:31 localhost python3.9[156903]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889630.449876-375-191040011858470/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:20:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27030 DF PROTO=TCP SPT=51376 DPT=9101 SEQ=1016082811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C739E10000000001030307)
Nov 23 04:20:33 localhost python3.9[156993]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:20:33 localhost python3.9[157064]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889632.6645074-507-2814951783767/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:20:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 04:20:34 localhost podman[157155]: 2025-11-23 09:20:34.034509806 +0000 UTC m=+0.085221728 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro',
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:20:34 localhost ovn_controller[154788]: 2025-11-23T09:20:34Z|00045|memory|INFO|16912 kB peak resident set size after 30.8 seconds Nov 23 04:20:34 localhost ovn_controller[154788]: 2025-11-23T09:20:34Z|00046|memory|INFO|idl-cells-OVN_Southbound:4028 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:76 lflow-cache-entries-cache-matches:195 lflow-cache-size-KB:288 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:153 ofctrl_installed_flow_usage-KB:111 ofctrl_sb_flow_ref_usage-KB:67 Nov 23 04:20:34 localhost python3.9[157154]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:20:34 localhost podman[157155]: 2025-11-23 09:20:34.107243646 +0000 UTC m=+0.157955548 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 04:20:34 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:20:34 localhost python3.9[157250]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889633.6388807-507-194698171034653/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:20:35 localhost python3.9[157340]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:20:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53350 DF PROTO=TCP SPT=38346 DPT=9100 SEQ=2464325010 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C746200000000001030307) Nov 23 04:20:36 localhost python3.9[157434]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:20:36 localhost python3.9[157526]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:20:37 localhost python3.9[157574]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:20:37 localhost python3.9[157666]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:20:38 localhost python3.9[157714]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False 
state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:20:39 localhost python3.9[157806]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:20:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39601 DF PROTO=TCP SPT=47358 DPT=9100 SEQ=2691335382 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C752E10000000001030307) Nov 23 04:20:39 localhost python3.9[157898]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:20:40 localhost python3.9[157946]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:20:41 localhost python3.9[158038]: ansible-ansible.legacy.stat Invoked with 
path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:20:41 localhost ovn_controller[154788]: 2025-11-23T09:20:41Z|00047|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory Nov 23 04:20:41 localhost python3.9[158086]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:20:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10290 DF PROTO=TCP SPT=34348 DPT=9882 SEQ=1978032639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C75D060000000001030307) Nov 23 04:20:42 localhost python3.9[158178]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:20:42 localhost systemd[1]: Reloading. Nov 23 04:20:42 localhost systemd-rc-local-generator[158204]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:20:42 localhost systemd-sysv-generator[158210]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:20:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:20:43 localhost python3.9[158309]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:20:43 localhost python3.9[158357]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:20:44 localhost python3.9[158449]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:20:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10292 DF PROTO=TCP SPT=34348 DPT=9882 SEQ=1978032639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C769200000000001030307) Nov 23 04:20:45 localhost python3.9[158497]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:20:47 localhost python3.9[158589]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:20:47 localhost systemd[1]: Reloading. Nov 23 04:20:47 localhost systemd-rc-local-generator[158611]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:20:47 localhost systemd-sysv-generator[158615]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:20:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:20:47 localhost systemd[1]: Starting Create netns directory... Nov 23 04:20:47 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 23 04:20:47 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 23 04:20:47 localhost systemd[1]: Finished Create netns directory. 
Nov 23 04:20:48 localhost python3.9[158723]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:20:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57051 DF PROTO=TCP SPT=41364 DPT=9102 SEQ=4036052507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C776200000000001030307) Nov 23 04:20:49 localhost python3.9[158815]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:20:49 localhost python3.9[158888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763889648.5463421-960-192936594959607/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 23 04:20:50 localhost python3.9[158980]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None 
src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:20:51 localhost python3.9[159072]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:20:52 localhost python3.9[159147]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889650.7320113-1035-275893690886433/.source.json _original_basename=.631w_e77 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:20:52 localhost python3.9[159239]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:20:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53478 DF PROTO=TCP SPT=38202 DPT=9102 SEQ=3774868968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C78A210000000001030307) Nov 23 04:20:55 localhost python3.9[159496]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent 
config_pattern=*.json debug=False Nov 23 04:20:56 localhost python3.9[159588]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 23 04:20:57 localhost python3.9[159680]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 23 04:20:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10294 DF PROTO=TCP SPT=34348 DPT=9882 SEQ=1978032639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C79A210000000001030307) Nov 23 04:20:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14322 DF PROTO=TCP SPT=45352 DPT=9101 SEQ=1901109381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7A30A0000000001030307) Nov 23 04:20:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29006 DF PROTO=TCP SPT=59772 DPT=9105 SEQ=2659339466 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7A3BB0000000001030307) Nov 23 04:21:01 localhost python3[159799]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Nov 23 04:21:01 localhost python3[159799]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9",#012 "Digest": "sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620",#012 "RepoTags": [#012 
"quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-21T06:31:40.431364621Z",#012 "Config": {#012 "User": "neutron",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 784198911,#012 "VirtualSize": 784198911,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/7f2203bb12e8263bda654cf994f0f457cee81cbd85ac1474f70f0dcdab850bfc/diff:/var/lib/containers/storage/overlay/cb3e7a7b413bc69102c7d8435b32176125a43d794f5f87a0ef3f45710221e344/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",#012 "UpperDir": 
"/var/lib/containers/storage/overlay/94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/94bc28862446c9d52bea5cb761ece58f4b7ce0b6f4d30585ec973efe7d5007f7/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",#012 "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",#012 "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",#012 "sha256:03228f16e908b0892695bcc077f4378f9669ff86bd51a3747df5ce9269c56477",#012 "sha256:1bc9c5b4c351caaeaa6b900805b43669e78b079f06d9048393517dd05690b8dc",#012 "sha256:83d6638c009d9ced6da21e0f659e23221a9a8d7c283582e370f21a7551100a49"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "neutron",#012 "History": [#012 {#012 "created": "2025-11-18T01:56:49.795434035Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:49.795512415Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251118\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:52.547242013Z",#012 "created_by": "/bin/sh -c #(nop) CMD 
[\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-21T06:10:01.947310748Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947327778Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947358359Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947372589Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94738527Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94739397Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:02.324930938Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:36.349393468Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf Nov 23 04:21:01 localhost podman[159851]: 2025-11-23 09:21:01.813237534 +0000 UTC 
m=+0.065844018 container remove 5be9e17ca5afaf3e3f6bfde25d4e7c94140fa7480033fa2171de030c572f034f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a43bf0e2ecc9c9d02be7a27eac338b4c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 23 04:21:01 localhost python3[159799]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent Nov 23 04:21:01 localhost podman[159864]: Nov 23 04:21:01 localhost podman[159864]: 2025-11-23 09:21:01.916592011 +0000 UTC m=+0.087201968 container create 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 23 04:21:01 localhost podman[159864]: 2025-11-23 09:21:01.874142098 +0000 UTC m=+0.044752105 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 23 04:21:01 localhost python3[159799]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 23 04:21:02 localhost python3.9[159994]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True 
get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:21:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14324 DF PROTO=TCP SPT=45352 DPT=9101 SEQ=1901109381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7AF200000000001030307) Nov 23 04:21:04 localhost python3.9[160088]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:21:04 localhost podman[160135]: 2025-11-23 09:21:04.349908626 +0000 UTC m=+0.087058805 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller) Nov 23 04:21:04 localhost podman[160135]: 2025-11-23 09:21:04.416412283 +0000 UTC m=+0.153562452 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2) Nov 23 04:21:04 localhost systemd[1]: 
2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:21:04 localhost python3.9[160134]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:21:05 localhost python3.9[160251]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763889664.5191727-1299-147141301336279/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:06 localhost python3.9[160297]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 04:21:06 localhost systemd[1]: Reloading. Nov 23 04:21:06 localhost systemd-rc-local-generator[160318]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:21:06 localhost systemd-sysv-generator[160321]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:21:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40272 DF PROTO=TCP SPT=32894 DPT=9100 SEQ=2119202913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7BC200000000001030307) Nov 23 04:21:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Nov 23 04:21:07 localhost python3.9[160378]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:21:07 localhost systemd[1]: Reloading. Nov 23 04:21:07 localhost systemd-rc-local-generator[160405]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:21:07 localhost systemd-sysv-generator[160408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:21:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:21:07 localhost systemd[1]: Starting ovn_metadata_agent container... Nov 23 04:21:07 localhost systemd[1]: tmp-crun.bYdoGC.mount: Deactivated successfully. Nov 23 04:21:07 localhost systemd[1]: Started libcrun container. Nov 23 04:21:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77181651e1832fcae1b91722194a2a35e769c1643eafd1391688458a62aabf61/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Nov 23 04:21:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77181651e1832fcae1b91722194a2a35e769c1643eafd1391688458a62aabf61/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:21:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:21:07 localhost podman[160420]: 2025-11-23 09:21:07.588933646 +0000 UTC m=+0.147056671 container init 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: + sudo -E kolla_set_configs Nov 23 04:21:07 localhost 
systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:21:07 localhost podman[160420]: 2025-11-23 09:21:07.619826842 +0000 UTC m=+0.177949787 container start 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
tcib_managed=true) Nov 23 04:21:07 localhost edpm-start-podman-container[160420]: ovn_metadata_agent Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Validating config file Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Copying service configuration files Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Writing out command to execute Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/external Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Nov 23 04:21:07 
localhost ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934 Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/bcac49fc-c589-475a-91a8-00a0ba9c2b33.pid.haproxy Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/bcac49fc-c589-475a-91a8-00a0ba9c2b33.conf Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: ++ cat /run_command Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: + CMD=neutron-ovn-metadata-agent Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: + ARGS= Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: + sudo kolla_copy_cacerts Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: + [[ ! -n '' ]] Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: + . kolla_extend_start Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\''' Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: Running command: 'neutron-ovn-metadata-agent' Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: + umask 0022 Nov 23 04:21:07 localhost ovn_metadata_agent[160434]: + exec neutron-ovn-metadata-agent Nov 23 04:21:07 localhost edpm-start-podman-container[160419]: Creating additional drop-in dependency for "ovn_metadata_agent" (9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e) Nov 23 04:21:07 localhost systemd[1]: Reloading. Nov 23 04:21:07 localhost systemd-sysv-generator[160510]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:21:07 localhost systemd-rc-local-generator[160504]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:21:07 localhost podman[160443]: 2025-11-23 09:21:07.771860665 +0000 UTC m=+0.146391260 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2) Nov 23 04:21:07 localhost podman[160443]: 2025-11-23 09:21:07.800446759 +0000 UTC m=+0.174977314 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Nov 23 04:21:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:21:07 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:21:07 localhost systemd[1]: Started ovn_metadata_agent container. Nov 23 04:21:08 localhost systemd-logind[761]: Session 51 logged out. Waiting for processes to exit. Nov 23 04:21:08 localhost systemd[1]: session-51.scope: Deactivated successfully. Nov 23 04:21:08 localhost systemd[1]: session-51.scope: Consumed 30.137s CPU time. Nov 23 04:21:08 localhost systemd-logind[761]: Removed session 51. Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.197 160439 INFO neutron.common.config [-] Logging enabled!#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.198 160439 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.198 160439 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.198 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.198 160439 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.198 160439 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 
04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.199 160439 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] config_file = 
['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.200 160439 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 
'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.201 160439 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids = /var/lib/neutron/external/pids log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] host = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost 
ovn_metadata_agent[160434]: 2025-11-23 09:21:09.202 160439 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 
localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.203 160439 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG 
neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.204 160439 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.205 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 
23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.206 160439 DEBUG neutron.agent.ovn.metadata_agent [-] 
retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 
localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.207 160439 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.208 160439 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.209 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.210 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.211 160439 DEBUG neutron.agent.ovn.metadata_agent [-] 
oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG 
neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.212 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent 
[-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.213 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] 
privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.214 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] 
privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.215 160439 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.216 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet = 100 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.217 160439 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 
09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.218 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.219 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost 
ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.220 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] 
ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.221 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 
04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.222 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 
09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] 
ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.223 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.224 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 
04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.225 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 
04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.226 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 
1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.227 160439 DEBUG 
neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 
09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.228 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.229 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.229 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.229 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.229 160439 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.229 160439 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Nov 23 
04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.237 160439 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.237 160439 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.237 160439 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.237 160439 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.237 160439 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m Nov 23 04:21:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52272 DF PROTO=TCP SPT=36808 DPT=9100 SEQ=427320919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7C7E00000000001030307) Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.252 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 26f986a7-6ac7-4ec2-887b-8da6da04a661 (UUID: 26f986a7-6ac7-4ec2-887b-8da6da04a661) and ovn bridge br-int. 
_load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.269 160439 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.269 160439 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.269 160439 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.270 160439 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.271 160439 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.275 160439 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.287 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:aa:3b 192.168.0.77'], port_security=['fa:16:3e:cf:aa:3b 192.168.0.77'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '192.168.0.77/24', 'neutron:device_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005532585.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '1915d3e5d4254231a0517e2dcf35848f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '4fe931b0-155c-49e2-b5a5-44d74fa72e9e 6afcee2e-50ee-4b3c-9d1f-24ea7a5a850b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70ca41f3-3e94-4959-b1b5-1e81bd2c9bc1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=d3912d14-a3e0-4df9-b811-f3bd90f44559) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.288 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '26f986a7-6ac7-4ec2-887b-8da6da04a661'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[], external_ids={'neutron:ovn-metadata-id': '1338671a-acb4-5368-872c-4c2204284319', 'neutron:ovn-metadata-sb-cfg': '1'}, name=26f986a7-6ac7-4ec2-887b-8da6da04a661, nb_cfg_timestamp=1763889611858, nb_cfg=4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.289 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d3912d14-a3e0-4df9-b811-f3bd90f44559 in datapath bcac49fc-c589-475a-91a8-00a0ba9c2b33 bound to our chassis on insert#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 
09:21:09.289 160439 DEBUG neutron_lib.callbacks.manager [-] Subscribe: > process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.290 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.290 160439 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.290 160439 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.290 160439 INFO oslo_service.service [-] Starting 1 workers#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.293 160439 DEBUG oslo_service.service [-] Started child 160537 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.295 160439 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bcac49fc-c589-475a-91a8-00a0ba9c2b33#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.296 160537 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-242845'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.296 160439 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', 
'/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpkqnv75g4/privsep.sock']#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.310 160537 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.311 160537 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.311 160537 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.316 160537 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.322 160537 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.331 160537 INFO eventlet.wsgi.server [-] (160537) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.894 160439 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.895 160439 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpkqnv75g4/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 
09:21:09.797 160542 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.802 160542 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.805 160542 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.806 160542 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160542#033[00m Nov 23 04:21:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:09.897 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[3e87f35a-0aa3-4e6e-b78d-5b2fe079c1ab]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:21:10 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:10.280 160542 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:21:10 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:10.280 160542 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:21:10 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:10.280 160542 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:21:10 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:10.731 160542 DEBUG oslo.privsep.daemon [-] 
privsep: reply[8b3ce10a-f0e4-4307-bde3-3c39859cff0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:21:10 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:10.732 160439 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpicyi7c_h/privsep.sock']#033[00m Nov 23 04:21:11 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:11.362 160439 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Nov 23 04:21:11 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:11.363 160439 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpicyi7c_h/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Nov 23 04:21:11 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:11.255 160553 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 23 04:21:11 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:11.261 160553 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 23 04:21:11 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:11.264 160553 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Nov 23 04:21:11 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:11.264 160553 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160553#033[00m Nov 23 04:21:11 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:11.366 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[154dcbcb-2615-429f-b846-df0a4dc7b9f6]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:21:11 localhost ovn_metadata_agent[160434]: 
2025-11-23 09:21:11.831 160553 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:21:11 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:11.831 160553 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:21:11 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:11.831 160553 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:21:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13325 DF PROTO=TCP SPT=35752 DPT=9882 SEQ=1833969550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7D2360000000001030307) Nov 23 04:21:12 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.292 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[0c48477b-0dd2-4fc7-a2cd-7b8676cec6c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:21:12 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.295 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[1972be4d-4812-484a-bb9b-7fe832cf6220]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:21:12 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.310 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[db1fd844-2273-4905-8614-f065b2d1b2dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:21:12 localhost ovn_metadata_agent[160434]: 
2025-11-23 09:21:12.323 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ad8c74-3e45-4b0e-83e8-7934ea0a9db1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbcac49fc-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:b4:b2:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7142, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7142, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], 
['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 628644, 'reachable_time': 31397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 
0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 160563, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:21:12 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.336 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c5be5abd-97cb-4042-9d03-49f5e496d618]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbcac49fc-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 628651, 'tstamp': 628651}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160564, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tapbcac49fc-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 628653, 'tstamp': 628653}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160564, 'error': None, 'target': 
'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 628654, 'tstamp': 628654}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160564, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb4:b28b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 628644, 'tstamp': 628644}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160564, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:21:12 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.378 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[caedf812-a3ab-438c-a3eb-656ccd9d86be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:21:12 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.380 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbcac49fc-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:21:12 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.417 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbcac49fc-c0, may_exist=True) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:21:12 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.418 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 23 04:21:12 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.419 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbcac49fc-c0, col_values=(('external_ids', {'iface-id': '98ef2da5-f5cb-44e8-a4b2-f6178c6c8332'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:21:12 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.420 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 23 04:21:12 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.424 160439 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpbi_yr_1l/privsep.sock']#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.024 160439 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.025 160439 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpbi_yr_1l/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.921 160573 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 23 
04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.925 160573 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.927 160573 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:12.927 160573 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160573#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.028 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[0d406e78-198b-496c-bb63-fe08f0476fd2]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.469 160573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.469 160573 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.469 160573 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:21:13 localhost sshd[160578]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:21:13 localhost systemd-logind[761]: New session 52 of user zuul. Nov 23 04:21:13 localhost systemd[1]: Started Session 52 of User zuul. 
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.925 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[f41794b4-b99c-4960-8787-6741fa1738fe]: (4, ['ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.928 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, column=external_ids, values=({'neutron:ovn-metadata-id': '1338671a-acb4-5368-872c-4c2204284319'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.929 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.930 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.943 160439 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.943 160439 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 
09:21:13.943 160439 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.944 160439 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.944 160439 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.944 160439 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.944 160439 DEBUG oslo_service.service [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.945 160439 DEBUG oslo_service.service [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.945 160439 DEBUG oslo_service.service [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.945 160439 DEBUG oslo_service.service [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.945 160439 DEBUG oslo_service.service [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 
23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.946 160439 DEBUG oslo_service.service [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.946 160439 DEBUG oslo_service.service [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.946 160439 DEBUG oslo_service.service [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.946 160439 DEBUG oslo_service.service [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.947 160439 DEBUG oslo_service.service [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.947 160439 DEBUG oslo_service.service [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.947 160439 DEBUG oslo_service.service [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.947 160439 DEBUG oslo_service.service [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.947 160439 DEBUG oslo_service.service [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 
23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.948 160439 DEBUG oslo_service.service [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.948 160439 DEBUG oslo_service.service [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.948 160439 DEBUG oslo_service.service [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.948 160439 DEBUG oslo_service.service [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.948 160439 DEBUG oslo_service.service [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.949 160439 DEBUG oslo_service.service [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.949 160439 DEBUG oslo_service.service [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.949 160439 DEBUG oslo_service.service [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.949 160439 DEBUG oslo_service.service [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.950 160439 DEBUG oslo_service.service [-] dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.950 160439 DEBUG oslo_service.service [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.950 160439 DEBUG oslo_service.service [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.950 160439 DEBUG oslo_service.service [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.951 160439 DEBUG oslo_service.service [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.951 160439 DEBUG oslo_service.service [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.951 160439 DEBUG oslo_service.service [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.951 160439 DEBUG oslo_service.service [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.951 160439 DEBUG oslo_service.service [-] host = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.952 160439 DEBUG oslo_service.service [-] http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.952 160439 DEBUG oslo_service.service [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.952 160439 DEBUG oslo_service.service [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.952 160439 DEBUG oslo_service.service [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.953 160439 DEBUG oslo_service.service [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.955 160439 DEBUG oslo_service.service [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.956 160439 DEBUG oslo_service.service [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.956 160439 DEBUG oslo_service.service [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.956 160439 DEBUG oslo_service.service [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.956 160439 DEBUG oslo_service.service [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.956 160439 DEBUG oslo_service.service [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.957 160439 DEBUG oslo_service.service [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.957 160439 DEBUG oslo_service.service [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.957 160439 DEBUG oslo_service.service [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.957 160439 DEBUG oslo_service.service [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.958 160439 DEBUG oslo_service.service [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.958 160439 DEBUG oslo_service.service [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.958 160439 DEBUG oslo_service.service [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.958 160439 DEBUG oslo_service.service [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.958 160439 DEBUG oslo_service.service [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.959 160439 DEBUG oslo_service.service [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.960 160439 DEBUG oslo_service.service [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.961 160439 DEBUG oslo_service.service [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.961 160439 DEBUG oslo_service.service [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.961 160439 DEBUG oslo_service.service [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.961 160439 DEBUG oslo_service.service [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.962 160439 DEBUG oslo_service.service [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.962 160439 DEBUG oslo_service.service [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.962 160439 DEBUG oslo_service.service [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.962 160439 DEBUG oslo_service.service [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.963 160439 DEBUG oslo_service.service [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.963 160439 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.963 160439 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.963 160439 DEBUG oslo_service.service [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.964 160439 DEBUG oslo_service.service [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.964 160439 DEBUG oslo_service.service [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.964 160439 DEBUG oslo_service.service [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.964 160439 DEBUG oslo_service.service [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.965 160439 DEBUG oslo_service.service [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.965 160439 DEBUG oslo_service.service [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.965 160439 DEBUG oslo_service.service [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.965 160439 DEBUG oslo_service.service [-] periodic_interval = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.966 160439 DEBUG oslo_service.service [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.966 160439 DEBUG oslo_service.service [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.966 160439 DEBUG oslo_service.service [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.966 160439 DEBUG oslo_service.service [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.967 160439 DEBUG oslo_service.service [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.967 160439 DEBUG oslo_service.service [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.967 160439 DEBUG oslo_service.service [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.967 160439 DEBUG oslo_service.service [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.968 160439 DEBUG oslo_service.service [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.968 160439 DEBUG oslo_service.service [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.968 160439 DEBUG oslo_service.service [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.968 160439 DEBUG oslo_service.service [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.968 160439 DEBUG oslo_service.service [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.969 160439 DEBUG oslo_service.service [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.969 160439 DEBUG oslo_service.service [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.969 160439 DEBUG oslo_service.service [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.969 160439 DEBUG oslo_service.service [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.970 160439 DEBUG oslo_service.service [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.970 160439 DEBUG oslo_service.service [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.970 160439 DEBUG oslo_service.service [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.970 160439 DEBUG oslo_service.service [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.970 160439 DEBUG oslo_service.service [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.971 160439 DEBUG oslo_service.service [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.971 160439 DEBUG oslo_service.service [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.971 160439 DEBUG oslo_service.service [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.971 160439 DEBUG oslo_service.service [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.971 160439 DEBUG oslo_service.service [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.972 160439 DEBUG oslo_service.service [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.972 160439 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.972 160439 DEBUG oslo_service.service [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.973 160439 DEBUG oslo_service.service [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.973 160439 DEBUG oslo_service.service [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.973 160439 DEBUG oslo_service.service [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.973 160439 DEBUG oslo_service.service [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.973 160439 DEBUG oslo_service.service [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.974 160439 DEBUG oslo_service.service [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.974 160439 DEBUG oslo_service.service [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.974 160439 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.974 160439 DEBUG oslo_service.service [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.975 160439 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.975 160439 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.975 160439 DEBUG oslo_service.service [-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.976 160439 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.976 160439 DEBUG oslo_service.service [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.976 160439 DEBUG oslo_service.service [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.976 160439 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.977 160439 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.977 160439 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.977 160439 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.977 160439 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.978 160439 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.978 160439 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.978 160439 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.978 160439 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.979 160439 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.979 160439 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.979 160439 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.979 160439 DEBUG oslo_service.service [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.980 160439 DEBUG oslo_service.service [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.980 160439 DEBUG oslo_service.service [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.980 160439 DEBUG oslo_service.service [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.980 160439 DEBUG oslo_service.service [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.980 160439 DEBUG oslo_service.service [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.981 160439 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.981 160439 DEBUG oslo_service.service [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.981 160439 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.981 160439 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.981 160439 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.982 160439 DEBUG oslo_service.service [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.982 160439 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.982 160439 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.982 160439 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.982 160439 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.983 160439 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.983 160439 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.983 160439 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.983 160439 DEBUG oslo_service.service [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.983 160439 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.983 160439 DEBUG oslo_service.service [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.984 160439 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.984 160439 DEBUG oslo_service.service [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.984 160439 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.984 160439 DEBUG oslo_service.service [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.984 160439 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.984 160439 DEBUG oslo_service.service [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.984 160439 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.985 160439 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.986 160439 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.986 160439 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.986 160439 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.986 160439 DEBUG oslo_service.service [-] AGENT.kill_scripts_path = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.986 160439 DEBUG oslo_service.service [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.986 160439 DEBUG oslo_service.service [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.986 160439 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.987 160439 DEBUG oslo_service.service [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.987 160439 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.987 160439 DEBUG oslo_service.service [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.987 160439 DEBUG oslo_service.service [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.987 160439 DEBUG oslo_service.service [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.987 160439 DEBUG oslo_service.service [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.988 160439 DEBUG oslo_service.service [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.988 160439 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule
= 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.988 160439 DEBUG oslo_service.service [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.988 160439 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.988 160439 DEBUG oslo_service.service [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.988 160439 DEBUG oslo_service.service [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.988 160439 DEBUG oslo_service.service [-] nova.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.989 160439 DEBUG oslo_service.service [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.989 160439 DEBUG oslo_service.service [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.989 160439 DEBUG oslo_service.service [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.989 160439 DEBUG oslo_service.service [-] nova.insecure = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.989 160439 DEBUG oslo_service.service [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.989 160439 DEBUG oslo_service.service [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.989 160439 DEBUG oslo_service.service [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.990 160439 DEBUG oslo_service.service [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.990 160439 DEBUG oslo_service.service [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.990 160439 DEBUG oslo_service.service [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.990 160439 DEBUG oslo_service.service [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.990 160439 DEBUG oslo_service.service [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.990 160439 DEBUG oslo_service.service [-] placement.collect_timing = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.991 160439 DEBUG oslo_service.service [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.991 160439 DEBUG oslo_service.service [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.991 160439 DEBUG oslo_service.service [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.991 160439 DEBUG oslo_service.service [-] placement.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.991 160439 DEBUG oslo_service.service [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.992 160439 DEBUG oslo_service.service [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.992 160439 DEBUG oslo_service.service [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.992 160439 DEBUG oslo_service.service [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.992 160439 DEBUG oslo_service.service [-] ironic.cafile 
= None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.992 160439 DEBUG oslo_service.service [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.992 160439 DEBUG oslo_service.service [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.992 160439 DEBUG oslo_service.service [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.993 160439 DEBUG oslo_service.service [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.993 160439 DEBUG oslo_service.service [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.993 160439 DEBUG oslo_service.service [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.993 160439 DEBUG oslo_service.service [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.993 160439 DEBUG oslo_service.service [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.993 160439 DEBUG oslo_service.service 
[-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.993 160439 DEBUG oslo_service.service [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.994 160439 DEBUG oslo_service.service [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.994 160439 DEBUG oslo_service.service [-] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.994 160439 DEBUG oslo_service.service [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.994 160439 DEBUG oslo_service.service [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.994 160439 DEBUG oslo_service.service [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.994 160439 DEBUG oslo_service.service [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.994 160439 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.995 160439 DEBUG 
oslo_service.service [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.995 160439 DEBUG oslo_service.service [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.995 160439 DEBUG oslo_service.service [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.995 160439 DEBUG oslo_service.service [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.995 160439 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.995 160439 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.995 160439 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.996 160439 DEBUG oslo_service.service [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.996 160439 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost 
ovn_metadata_agent[160434]: 2025-11-23 09:21:13.996 160439 DEBUG oslo_service.service [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.996 160439 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.996 160439 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.996 160439 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.996 160439 DEBUG oslo_service.service [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.997 160439 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.997 160439 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.997 160439 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.997 160439 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.997 160439 DEBUG oslo_service.service [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.997 160439 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.998 160439 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.998 160439 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.998 160439 DEBUG oslo_service.service [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.998 160439 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.998 160439 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.998 160439 DEBUG oslo_service.service [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.998 160439 DEBUG 
oslo_service.service [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.999 160439 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.999 160439 DEBUG oslo_service.service [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.999 160439 DEBUG oslo_service.service [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.999 160439 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.999 160439 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.999 160439 DEBUG oslo_service.service [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:13.999 160439 DEBUG oslo_service.service [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.000 160439 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost 
ovn_metadata_agent[160434]: 2025-11-23 09:21:14.000 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.000 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.000 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.000 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.000 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.000 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.001 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.001 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 
09:21:14.001 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.001 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.001 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.001 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.002 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.002 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.002 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.002 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 
09:21:14.002 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.002 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.002 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.003 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.003 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.003 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.003 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.003 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 
09:21:14.003 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.003 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.004 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.004 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.004 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.004 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.004 160439 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.004 160439 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.005 160439 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.005 160439 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.005 160439 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:21:14 localhost ovn_metadata_agent[160434]: 2025-11-23 09:21:14.005 160439 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Nov 23 04:21:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13327 DF PROTO=TCP SPT=35752 DPT=9882 SEQ=1833969550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7DE200000000001030307) Nov 23 04:21:15 localhost python3.9[160671]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:21:17 localhost python3.9[160767]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:21:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53480 DF PROTO=TCP SPT=38202 DPT=9102 SEQ=3774868968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2C7EA200000000001030307) Nov 23 04:21:18 localhost python3.9[160872]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:21:18 localhost systemd[1]: libpod-65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1.scope: Deactivated successfully. Nov 23 04:21:18 localhost podman[160873]: 2025-11-23 09:21:18.648504784 +0000 UTC m=+0.302301251 container died 65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Nov 23 04:21:18 localhost systemd[1]: tmp-crun.fzYe6d.mount: Deactivated successfully. 
Nov 23 04:21:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1-userdata-shm.mount: Deactivated successfully. Nov 23 04:21:18 localhost podman[160873]: 2025-11-23 09:21:18.686458789 +0000 UTC m=+0.340255166 container cleanup 65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=) Nov 23 04:21:18 localhost podman[160886]: 2025-11-23 09:21:18.733043421 +0000 UTC m=+0.078742958 container remove 65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, release=1761123044) Nov 23 04:21:18 localhost systemd[1]: libpod-conmon-65e9c4e5cd73f8c03abb38ec9e8f9964a357d646b9216aaad8716af1c2fc5ed1.scope: Deactivated successfully. Nov 23 04:21:19 localhost systemd[1]: var-lib-containers-storage-overlay-69a650d13aa75802cba83b72930ac053de367a681b210a8b1cff6ce21a4c09bf-merged.mount: Deactivated successfully. Nov 23 04:21:20 localhost python3.9[160992]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 04:21:20 localhost systemd[1]: Reloading. Nov 23 04:21:20 localhost systemd-rc-local-generator[161015]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:21:20 localhost systemd-sysv-generator[161022]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:21:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:21:21 localhost python3.9[161118]: ansible-ansible.builtin.service_facts Invoked Nov 23 04:21:21 localhost network[161135]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 23 04:21:21 localhost network[161136]: 'network-scripts' will be removed from distribution in near future. Nov 23 04:21:21 localhost network[161137]: It is advised to switch to 'NetworkManager' instead for network management. Nov 23 04:21:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:21:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3497 DF PROTO=TCP SPT=60650 DPT=9102 SEQ=2952448551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C7FF600000000001030307) Nov 23 04:21:24 localhost sshd[161261]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:21:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13329 DF PROTO=TCP SPT=35752 DPT=9882 SEQ=1833969550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C80E200000000001030307) Nov 23 04:21:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35494 DF PROTO=TCP SPT=39434 DPT=9101 SEQ=550781359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2C8183A0000000001030307) Nov 23 04:21:29 localhost python3.9[161417]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:21:29 localhost systemd[1]: Reloading. Nov 23 04:21:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31760 DF PROTO=TCP SPT=56020 DPT=9105 SEQ=1694727563 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C818FB0000000001030307) Nov 23 04:21:30 localhost systemd-sysv-generator[161448]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:21:30 localhost systemd-rc-local-generator[161441]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:21:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:21:30 localhost systemd[1]: Stopped target tripleo_nova_libvirt.target. 
Nov 23 04:21:30 localhost python3.9[161549]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:21:32 localhost python3.9[161642]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:21:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35496 DF PROTO=TCP SPT=39434 DPT=9101 SEQ=550781359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C824600000000001030307) Nov 23 04:21:33 localhost python3.9[161735]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:21:33 localhost python3.9[161828]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:21:34 localhost python3.9[161921]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:21:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. 
Nov 23 04:21:34 localhost podman[161923]: 2025-11-23 09:21:34.836681041 +0000 UTC m=+0.086447188 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:21:34 localhost podman[161923]: 2025-11-23 09:21:34.876562911 +0000 UTC m=+0.126329028 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller) Nov 23 04:21:34 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:21:35 localhost python3.9[162037]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:21:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39604 DF PROTO=TCP SPT=47358 DPT=9100 SEQ=2691335382 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C832200000000001030307) Nov 23 04:21:37 localhost python3.9[162130]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:21:38 localhost systemd[1]: tmp-crun.CXIW7p.mount: Deactivated successfully. 
Nov 23 04:21:38 localhost podman[162223]: 2025-11-23 09:21:38.156869725 +0000 UTC m=+0.088208862 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true) Nov 23 04:21:38 localhost podman[162223]: 2025-11-23 09:21:38.190202341 +0000 UTC 
m=+0.121541498 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 23 04:21:38 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:21:38 localhost python3.9[162222]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:38 localhost python3.9[162333]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4052 DF PROTO=TCP SPT=40358 DPT=9100 SEQ=3390558412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C83D210000000001030307) Nov 23 04:21:39 localhost python3.9[162425]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:39 localhost python3.9[162517]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:40 localhost python3.9[162609]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:41 localhost python3.9[162701]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1755 DF PROTO=TCP SPT=53198 DPT=9882 SEQ=2658206013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C847660000000001030307) Nov 23 04:21:42 localhost python3.9[162793]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Nov 23 04:21:42 localhost python3.9[162885]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:43 localhost python3.9[162977]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:44 localhost python3.9[163069]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:44 localhost python3.9[163161]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:44 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1757 DF PROTO=TCP SPT=53198 DPT=9882 SEQ=2658206013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C853600000000001030307) Nov 23 04:21:45 localhost python3.9[163253]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:46 localhost python3.9[163345]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:21:47 localhost python3.9[163437]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:21:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3499 DF PROTO=TCP SPT=60650 DPT=9102 SEQ=2952448551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2C860210000000001030307) Nov 23 04:21:48 localhost python3.9[163529]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 23 04:21:49 localhost python3.9[163621]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 04:21:49 localhost systemd[1]: Reloading. Nov 23 04:21:49 localhost systemd-sysv-generator[163651]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:21:49 localhost systemd-rc-local-generator[163642]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:21:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:21:50 localhost python3.9[163749]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:21:50 localhost python3.9[163842]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:21:52 localhost python3.9[163935]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:21:53 localhost python3.9[164028]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:21:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55191 DF PROTO=TCP SPT=60090 DPT=9102 SEQ=1164980385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C874600000000001030307) Nov 23 04:21:53 localhost python3.9[164121]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None 
removes=None stdin=None Nov 23 04:21:54 localhost python3.9[164214]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:21:54 localhost python3.9[164307]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:21:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1759 DF PROTO=TCP SPT=53198 DPT=9882 SEQ=2658206013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C884210000000001030307) Nov 23 04:21:57 localhost python3.9[164400]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None Nov 23 04:21:58 localhost python3.9[164493]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Nov 23 04:21:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26471 DF PROTO=TCP SPT=51708 DPT=9101 SEQ=510542425 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C88D690000000001030307) Nov 23 04:21:59 localhost python3.9[164591]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False 
move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532585.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Nov 23 04:21:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37785 DF PROTO=TCP SPT=56578 DPT=9105 SEQ=30648972 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C88E1A0000000001030307) Nov 23 04:22:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 04:22:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.006 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 
3.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for 
pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55d7b81262d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s 
write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_sl Nov 23 04:22:01 localhost python3.9[164691]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 04:22:02 localhost python3.9[164745]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 23 04:22:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37787 DF PROTO=TCP SPT=56578 DPT=9105 SEQ=30648972 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C89A210000000001030307) Nov 23 04:22:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 04:22:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 5736 writes, 25K keys, 5736 
commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.010 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.010 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.010 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 
0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 
GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563779dfe2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) 
Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_sl Nov 23 04:22:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. 
Nov 23 04:22:05 localhost podman[164748]: 2025-11-23 09:22:05.024546282 +0000 UTC m=+0.082242767 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 04:22:05 localhost podman[164748]: 2025-11-23 09:22:05.058211429 +0000 UTC m=+0.115907974 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251118, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2) Nov 23 04:22:05 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:22:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52275 DF PROTO=TCP SPT=36808 DPT=9100 SEQ=427320919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C8A6200000000001030307) Nov 23 04:22:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:22:09 localhost podman[164838]: 2025-11-23 09:22:09.017388717 +0000 UTC m=+0.077427007 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 04:22:09 localhost podman[164838]: 2025-11-23 09:22:09.047182814 +0000 UTC 
m=+0.107221124 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible) Nov 23 04:22:09 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:22:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:22:09.232 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:22:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:22:09.232 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:22:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:22:09.233 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:22:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46843 DF PROTO=TCP SPT=35548 DPT=9100 SEQ=608490598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C8B2600000000001030307) Nov 23 04:22:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28026 DF PROTO=TCP SPT=46152 DPT=9882 SEQ=2891797171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C8BC960000000001030307) Nov 23 04:22:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28028 DF PROTO=TCP SPT=46152 DPT=9882 SEQ=2891797171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2C8C8A10000000001030307) Nov 23 04:22:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55193 DF PROTO=TCP SPT=60090 DPT=9102 SEQ=1164980385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C8D4200000000001030307) Nov 23 04:22:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47656 DF PROTO=TCP SPT=40302 DPT=9102 SEQ=2900724902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C8E9A00000000001030307) Nov 23 04:22:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28030 DF PROTO=TCP SPT=46152 DPT=9882 SEQ=2891797171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C8F8210000000001030307) Nov 23 04:22:27 localhost kernel: SELinux: Converting 2758 SID table entries... Nov 23 04:22:27 localhost kernel: SELinux: Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped). 
Nov 23 04:22:27 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 23 04:22:27 localhost kernel: SELinux: policy capability open_perms=1 Nov 23 04:22:27 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 23 04:22:27 localhost kernel: SELinux: policy capability always_check_network=0 Nov 23 04:22:27 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 23 04:22:27 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 23 04:22:27 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 23 04:22:28 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=19 res=1 Nov 23 04:22:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29977 DF PROTO=TCP SPT=57732 DPT=9101 SEQ=168636228 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9029A0000000001030307) Nov 23 04:22:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49170 DF PROTO=TCP SPT=58140 DPT=9105 SEQ=3084951181 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9034B0000000001030307) Nov 23 04:22:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29979 DF PROTO=TCP SPT=57732 DPT=9101 SEQ=168636228 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C90EA00000000001030307) Nov 23 04:22:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:22:36 localhost systemd[1]: tmp-crun.gERxBu.mount: Deactivated successfully. 
Nov 23 04:22:36 localhost podman[165983]: 2025-11-23 09:22:36.069054941 +0000 UTC m=+0.106506460 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller) Nov 23 04:22:36 localhost podman[165983]: 2025-11-23 09:22:36.136503218 +0000 UTC m=+0.173954707 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 04:22:36 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:22:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4055 DF PROTO=TCP SPT=40358 DPT=9100 SEQ=3390558412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C91C210000000001030307) Nov 23 04:22:38 localhost kernel: SELinux: Converting 2761 SID table entries... 
Nov 23 04:22:38 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 23 04:22:38 localhost kernel: SELinux: policy capability open_perms=1 Nov 23 04:22:38 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 23 04:22:38 localhost kernel: SELinux: policy capability always_check_network=0 Nov 23 04:22:38 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 23 04:22:38 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 23 04:22:38 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 23 04:22:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25888 DF PROTO=TCP SPT=46556 DPT=9100 SEQ=2284853579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C927A00000000001030307) Nov 23 04:22:39 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=20 res=1 Nov 23 04:22:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:22:40 localhost podman[166014]: 2025-11-23 09:22:40.019560797 +0000 UTC m=+0.070931887 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:22:40 localhost podman[166014]: 2025-11-23 09:22:40.054203131 +0000 UTC 
m=+0.105574241 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Nov 23 04:22:40 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:22:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18373 DF PROTO=TCP SPT=49144 DPT=9882 SEQ=2198007924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C931C70000000001030307) Nov 23 04:22:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18375 DF PROTO=TCP SPT=49144 DPT=9882 SEQ=2198007924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C93DE00000000001030307) Nov 23 04:22:46 localhost kernel: SELinux: Converting 2761 SID table entries... Nov 23 04:22:46 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 23 04:22:46 localhost kernel: SELinux: policy capability open_perms=1 Nov 23 04:22:46 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 23 04:22:46 localhost kernel: SELinux: policy capability always_check_network=0 Nov 23 04:22:46 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 23 04:22:46 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 23 04:22:46 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 23 04:22:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47658 DF PROTO=TCP SPT=40302 DPT=9102 SEQ=2900724902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C94A200000000001030307) Nov 23 04:22:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62044 DF PROTO=TCP SPT=35598 DPT=9102 SEQ=3206288983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2C95EE00000000001030307) Nov 23 04:22:54 localhost kernel: SELinux: Converting 2761 SID table entries... Nov 23 04:22:55 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 23 04:22:55 localhost kernel: SELinux: policy capability open_perms=1 Nov 23 04:22:55 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 23 04:22:55 localhost kernel: SELinux: policy capability always_check_network=0 Nov 23 04:22:55 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 23 04:22:55 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 23 04:22:55 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 23 04:22:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18377 DF PROTO=TCP SPT=49144 DPT=9882 SEQ=2198007924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C96E210000000001030307) Nov 23 04:22:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31582 DF PROTO=TCP SPT=52862 DPT=9101 SEQ=3900244220 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C977C90000000001030307) Nov 23 04:22:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5733 DF PROTO=TCP SPT=59552 DPT=9105 SEQ=880615477 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9787B0000000001030307) Nov 23 04:23:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31584 DF PROTO=TCP SPT=52862 DPT=9101 SEQ=3900244220 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 
OPT (020405500402080AD2C983E00000000001030307) Nov 23 04:23:04 localhost kernel: SELinux: Converting 2761 SID table entries... Nov 23 04:23:04 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 23 04:23:04 localhost kernel: SELinux: policy capability open_perms=1 Nov 23 04:23:04 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 23 04:23:04 localhost kernel: SELinux: policy capability always_check_network=0 Nov 23 04:23:04 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 23 04:23:04 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 23 04:23:04 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 23 04:23:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46846 DF PROTO=TCP SPT=35548 DPT=9100 SEQ=608490598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C990210000000001030307) Nov 23 04:23:06 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=23 res=1 Nov 23 04:23:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. 
Nov 23 04:23:07 localhost podman[166062]: 2025-11-23 09:23:07.03483678 +0000 UTC m=+0.085263450 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 04:23:07 localhost podman[166062]: 2025-11-23 09:23:07.107047393 +0000 UTC m=+0.157474133 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2) Nov 23 04:23:07 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:23:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:23:09.233 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:23:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:23:09.233 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:23:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:23:09.235 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:23:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54931 DF PROTO=TCP SPT=53956 DPT=9100 SEQ=1081423421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C99CA00000000001030307) Nov 23 04:23:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:23:11 localhost podman[166087]: 2025-11-23 09:23:11.027827125 +0000 UTC m=+0.085156715 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS) Nov 23 04:23:11 localhost podman[166087]: 2025-11-23 09:23:11.056506819 +0000 UTC 
m=+0.113836409 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:23:11 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:23:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19877 DF PROTO=TCP SPT=46712 DPT=9882 SEQ=2888539505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9A6F70000000001030307) Nov 23 04:23:12 localhost kernel: SELinux: Converting 2761 SID table entries... Nov 23 04:23:12 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 23 04:23:12 localhost kernel: SELinux: policy capability open_perms=1 Nov 23 04:23:12 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 23 04:23:12 localhost kernel: SELinux: policy capability always_check_network=0 Nov 23 04:23:12 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 23 04:23:12 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 23 04:23:12 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 23 04:23:13 localhost systemd[1]: Reloading. Nov 23 04:23:13 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=24 res=1 Nov 23 04:23:13 localhost systemd-sysv-generator[166140]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:23:13 localhost systemd-rc-local-generator[166135]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:23:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:23:13 localhost systemd[1]: Reloading. Nov 23 04:23:13 localhost systemd-rc-local-generator[166176]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 23 04:23:13 localhost systemd-sysv-generator[166179]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:23:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:23:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19879 DF PROTO=TCP SPT=46712 DPT=9882 SEQ=2888539505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9B2E10000000001030307) Nov 23 04:23:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62046 DF PROTO=TCP SPT=35598 DPT=9102 SEQ=3206288983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9C0200000000001030307) Nov 23 04:23:22 localhost kernel: SELinux: Converting 2762 SID table entries... 
Nov 23 04:23:22 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 23 04:23:22 localhost kernel: SELinux: policy capability open_perms=1 Nov 23 04:23:22 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 23 04:23:22 localhost kernel: SELinux: policy capability always_check_network=0 Nov 23 04:23:22 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 23 04:23:22 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 23 04:23:22 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 23 04:23:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56073 DF PROTO=TCP SPT=42040 DPT=9102 SEQ=1421973694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9D4210000000001030307) Nov 23 04:23:26 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload. Nov 23 04:23:26 localhost dbus-broker-launch[756]: avc: op=load_policy lsm=selinux seqno=25 res=1 Nov 23 04:23:26 localhost dbus-broker-launch[752]: Noticed file-system modification, trigger reload. 
Nov 23 04:23:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19881 DF PROTO=TCP SPT=46712 DPT=9882 SEQ=2888539505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9E2210000000001030307) Nov 23 04:23:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33924 DF PROTO=TCP SPT=46538 DPT=9101 SEQ=4155092695 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9ECFA0000000001030307) Nov 23 04:23:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32017 DF PROTO=TCP SPT=55642 DPT=9105 SEQ=149405895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9EDAB0000000001030307) Nov 23 04:23:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33926 DF PROTO=TCP SPT=46538 DPT=9101 SEQ=4155092695 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2C9F9200000000001030307) Nov 23 04:23:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25891 DF PROTO=TCP SPT=46556 DPT=9100 SEQ=2284853579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA06210000000001030307) Nov 23 04:23:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:23:38 localhost systemd[1]: tmp-crun.MCZC0x.mount: Deactivated successfully. 
Nov 23 04:23:38 localhost podman[166514]: 2025-11-23 09:23:38.059188179 +0000 UTC m=+0.109371221 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3) Nov 23 04:23:38 localhost podman[166514]: 2025-11-23 09:23:38.172073651 +0000 UTC m=+0.222256673 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 04:23:38 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:23:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41340 DF PROTO=TCP SPT=56312 DPT=9100 SEQ=2314062371 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA11E00000000001030307) Nov 23 04:23:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=183 DF PROTO=TCP SPT=35542 DPT=9882 SEQ=1332333384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA1C260000000001030307) Nov 23 04:23:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:23:42 localhost systemd[1]: tmp-crun.fmW4Zq.mount: Deactivated successfully. 
Nov 23 04:23:42 localhost podman[167798]: 2025-11-23 09:23:42.035061345 +0000 UTC m=+0.081769002 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 04:23:42 localhost podman[167798]: 2025-11-23 09:23:42.041521225 +0000 UTC 
m=+0.088228912 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Nov 23 04:23:42 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:23:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=185 DF PROTO=TCP SPT=35542 DPT=9882 SEQ=1332333384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA28200000000001030307) Nov 23 04:23:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56075 DF PROTO=TCP SPT=42040 DPT=9102 SEQ=1421973694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA34200000000001030307) Nov 23 04:23:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36347 DF PROTO=TCP SPT=49520 DPT=9102 SEQ=3145602236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA49200000000001030307) Nov 23 04:23:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=187 DF PROTO=TCP SPT=35542 DPT=9882 SEQ=1332333384 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA58200000000001030307) Nov 23 04:23:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25764 DF PROTO=TCP SPT=40478 DPT=9101 SEQ=2537767173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA622A0000000001030307) Nov 23 04:23:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5139 DF PROTO=TCP SPT=47666 DPT=9105 SEQ=3170194503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2CA62DB0000000001030307) Nov 23 04:24:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5141 DF PROTO=TCP SPT=47666 DPT=9105 SEQ=3170194503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA6EE00000000001030307) Nov 23 04:24:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54934 DF PROTO=TCP SPT=53956 DPT=9100 SEQ=1081423421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA7A200000000001030307) Nov 23 04:24:07 localhost systemd[1]: Stopping OpenSSH server daemon... Nov 23 04:24:07 localhost systemd[1]: sshd.service: Deactivated successfully. Nov 23 04:24:07 localhost systemd[1]: Stopped OpenSSH server daemon. Nov 23 04:24:07 localhost systemd[1]: sshd.service: Consumed 1.215s CPU time, read 32.0K from disk, written 0B to disk. Nov 23 04:24:07 localhost systemd[1]: Stopped target sshd-keygen.target. Nov 23 04:24:07 localhost systemd[1]: Stopping sshd-keygen.target... Nov 23 04:24:07 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 23 04:24:07 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 23 04:24:07 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 23 04:24:07 localhost systemd[1]: Reached target sshd-keygen.target. Nov 23 04:24:07 localhost systemd[1]: Starting OpenSSH server daemon... 
Nov 23 04:24:07 localhost sshd[184045]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:24:07 localhost systemd[1]: Started OpenSSH server daemon. Nov 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:24:08 localhost podman[184073]: 2025-11-23 09:24:08.3170077 +0000 UTC m=+0.086887140 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118) Nov 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:08 localhost podman[184073]: 2025-11-23 09:24:08.435432161 +0000 UTC m=+0.205311631 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller) Nov 23 04:24:08 localhost systemd[1]: 
2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:24:09.234 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:24:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:24:09.235 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:24:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:24:09.236 160439 DEBUG 
oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28877 DF PROTO=TCP SPT=55906 DPT=9100 SEQ=14938984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA87200000000001030307) Nov 23 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:09 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 23 04:24:09 localhost systemd[1]: Starting man-db-cache-update.service... Nov 23 04:24:09 localhost systemd[1]: Reloading. Nov 23 04:24:10 localhost systemd-sysv-generator[184304]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:24:10 localhost systemd-rc-local-generator[184298]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 23 04:24:10 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:10 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:24:10 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:10 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:10 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:10 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:10 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:10 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:10 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:10 localhost systemd[1]: Queuing reload/restart jobs for marked units… Nov 23 04:24:10 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. 
Nov 23 04:24:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64444 DF PROTO=TCP SPT=58034 DPT=9882 SEQ=1980338375 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA91570000000001030307) Nov 23 04:24:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:24:12 localhost systemd[1]: tmp-crun.hra9Ch.mount: Deactivated successfully. Nov 23 04:24:12 localhost podman[187632]: 2025-11-23 09:24:12.5482311 +0000 UTC m=+0.100654795 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 23 04:24:12 localhost podman[187632]: 2025-11-23 09:24:12.585484058 +0000 UTC m=+0.137907753 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 04:24:12 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:24:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64446 DF PROTO=TCP SPT=58034 DPT=9882 SEQ=1980338375 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CA9D600000000001030307) Nov 23 04:24:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36349 DF PROTO=TCP SPT=49520 DPT=9102 SEQ=3145602236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CAAA210000000001030307) Nov 23 04:24:21 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 23 04:24:21 localhost systemd[1]: Finished man-db-cache-update.service. Nov 23 04:24:21 localhost systemd[1]: man-db-cache-update.service: Consumed 13.822s CPU time. Nov 23 04:24:21 localhost systemd[1]: run-rfd34d12b1e6647b585e2c3a7a1d21fe5.service: Deactivated successfully. Nov 23 04:24:21 localhost systemd[1]: run-r90e79007a5d8497ca59a5df0a1fbebd9.service: Deactivated successfully. 
Nov 23 04:24:21 localhost python3.9[192901]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Nov 23 04:24:21 localhost systemd[1]: Reloading. Nov 23 04:24:21 localhost systemd-rc-local-generator[192931]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:24:21 localhost systemd-sysv-generator[192935]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:24:21 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:24:21 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:21 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:21 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:21 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:21 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:21 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:21 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:22 localhost python3.9[193051]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Nov 23 04:24:22 localhost systemd[1]: Reloading. Nov 23 04:24:22 localhost systemd-rc-local-generator[193077]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:24:22 localhost systemd-sysv-generator[193084]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:24:22 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Nov 23 04:24:22 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:22 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:22 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:22 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:22 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:22 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:22 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27512 DF PROTO=TCP SPT=60458 DPT=9102 SEQ=2990301430 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CABE600000000001030307) Nov 23 04:24:24 localhost python3.9[193200]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Nov 23 04:24:24 localhost systemd[1]: Reloading. Nov 23 04:24:24 localhost systemd-rc-local-generator[193225]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:24:24 localhost systemd-sysv-generator[193228]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:24:24 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:24:24 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:24 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:24 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:24 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:24 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:24 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:24 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:25 localhost python3.9[193349]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Nov 23 04:24:26 localhost systemd[1]: Reloading. Nov 23 04:24:26 localhost systemd-sysv-generator[193383]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:24:26 localhost systemd-rc-local-generator[193380]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:24:26 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:24:26 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:26 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:26 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:26 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:26 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:26 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:26 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64448 DF PROTO=TCP SPT=58034 DPT=9882 SEQ=1980338375 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CACE200000000001030307) Nov 23 04:24:28 localhost python3.9[193498]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False 
name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:28 localhost systemd[1]: Reloading. Nov 23 04:24:28 localhost systemd-rc-local-generator[193526]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:24:28 localhost systemd-sysv-generator[193532]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:24:28 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:24:28 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:28 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:28 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:28 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:28 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:28 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:28 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:29 localhost python3.9[193647]: ansible-ansible.builtin.systemd Invoked with enabled=True 
masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:29 localhost systemd[1]: Reloading. Nov 23 04:24:29 localhost systemd-sysv-generator[193676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:24:29 localhost systemd-rc-local-generator[193672]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:24:29 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:29 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:24:29 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:29 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:29 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:29 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:29 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:29 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38193 DF PROTO=TCP SPT=34058 DPT=9101 SEQ=1688640829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CAD7590000000001030307) Nov 23 04:24:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34530 DF PROTO=TCP SPT=39396 DPT=9105 SEQ=3013725068 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CAD80B0000000001030307) Nov 23 04:24:30 localhost python3.9[193796]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:30 localhost systemd[1]: Reloading. Nov 23 04:24:30 localhost systemd-rc-local-generator[193823]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 23 04:24:30 localhost systemd-sysv-generator[193828]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:24:30 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:30 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:30 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:24:30 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:30 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:30 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:30 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:30 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:31 localhost python3.9[193945]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:32 localhost python3.9[194058]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service 
daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:32 localhost systemd[1]: Reloading. Nov 23 04:24:32 localhost systemd-sysv-generator[194087]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:24:32 localhost systemd-rc-local-generator[194084]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:24:32 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:32 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:32 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:32 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:24:32 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:32 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:32 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:32 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38195 DF PROTO=TCP SPT=34058 DPT=9101 SEQ=1688640829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CAE3600000000001030307) Nov 23 04:24:33 localhost python3.9[194207]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Nov 23 04:24:33 localhost systemd[1]: Reloading. Nov 23 04:24:33 localhost systemd-rc-local-generator[194251]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:24:33 localhost systemd-sysv-generator[194256]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:24:33 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:33 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:33 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:33 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:24:33 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:33 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:33 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:33 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:24:34 localhost python3.9[194409]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41343 DF PROTO=TCP SPT=56312 DPT=9100 SEQ=2314062371 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CAF0210000000001030307) Nov 23 04:24:36 localhost python3.9[194554]: ansible-ansible.builtin.systemd Invoked with enabled=True 
masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:37 localhost python3.9[194667]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:24:38 localhost systemd[1]: tmp-crun.rbHpv5.mount: Deactivated successfully. Nov 23 04:24:38 localhost podman[194780]: 2025-11-23 09:24:38.787918201 +0000 UTC m=+0.097776246 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:24:38 localhost podman[194780]: 2025-11-23 09:24:38.864195592 +0000 UTC m=+0.174053677 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:24:38 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:24:38 localhost python3.9[194781]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61661 DF PROTO=TCP SPT=44452 DPT=9100 SEQ=3680421593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CAFC600000000001030307) Nov 23 04:24:39 localhost python3.9[194919]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:40 localhost python3.9[195032]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:41 localhost python3.9[195145]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63862 DF PROTO=TCP SPT=40028 DPT=9882 SEQ=4134728545 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB06860000000001030307) Nov 23 04:24:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:24:42 localhost podman[195259]: 2025-11-23 09:24:42.841444171 +0000 UTC m=+0.084099074 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 04:24:42 localhost podman[195259]: 2025-11-23 09:24:42.87545951 +0000 UTC 
m=+0.118114453 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 23 04:24:42 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:24:43 localhost python3.9[195258]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:43 localhost python3.9[195390]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63864 DF PROTO=TCP SPT=40028 DPT=9882 SEQ=4134728545 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB12A10000000001030307) Nov 23 04:24:45 localhost python3.9[195503]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:46 localhost python3.9[195616]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:47 localhost python3.9[195729]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:47 localhost python3.9[195842]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27514 DF PROTO=TCP SPT=60458 DPT=9102 SEQ=2990301430 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB1E200000000001030307) Nov 23 04:24:49 localhost python3.9[195955]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Nov 23 04:24:52 localhost python3.9[196068]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 23 04:24:52 localhost python3.9[196178]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 23 04:24:53 localhost python3.9[196288]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:24:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59770 DF PROTO=TCP SPT=42568 DPT=9102 SEQ=694892868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2CB33A10000000001030307) Nov 23 04:24:53 localhost python3.9[196398]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:24:54 localhost python3.9[196508]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:24:55 localhost python3.9[196618]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 23 04:24:56 localhost python3.9[196728]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:24:56 localhost python3.9[196818]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889895.512462-1644-192067208845909/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 
backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:24:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63866 DF PROTO=TCP SPT=40028 DPT=9882 SEQ=4134728545 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB42200000000001030307) Nov 23 04:24:57 localhost python3.9[196928]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:24:58 localhost python3.9[197018]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889897.0026946-1644-165844142534658/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:24:58 localhost python3.9[197128]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:24:59 localhost python3.9[197218]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889898.2616239-1644-132602676044445/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False 
unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:24:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54055 DF PROTO=TCP SPT=34536 DPT=9101 SEQ=3284786878 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB4C8A0000000001030307) Nov 23 04:24:59 localhost python3.9[197328]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:24:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45780 DF PROTO=TCP SPT=48102 DPT=9105 SEQ=4169266320 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB4D3B0000000001030307) Nov 23 04:25:01 localhost python3.9[197418]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889899.4260101-1644-180176614613347/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:02 localhost python3.9[197528]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54057 DF PROTO=TCP SPT=34536 DPT=9101 SEQ=3284786878 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB58A00000000001030307) Nov 23 04:25:03 localhost python3.9[197618]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889901.5670555-1644-13305777703161/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:03 localhost python3.9[197728]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:04 localhost python3.9[197818]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889903.4291651-1644-231606674667162/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:05 localhost python3.9[197928]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:06 localhost python3.9[198016]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt 
src=/home/zuul/.ansible/tmp/ansible-tmp-1763889905.0879934-1644-129158892957846/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28880 DF PROTO=TCP SPT=55906 DPT=9100 SEQ=14938984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB66210000000001030307) Nov 23 04:25:06 localhost python3.9[198126]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:07 localhost python3.9[198216]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763889906.1396778-1644-139627550053092/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:07 localhost python3.9[198326]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:08 localhost 
python3.9[198436]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:25:09 localhost systemd[1]: tmp-crun.cpQIjq.mount: Deactivated successfully. Nov 23 04:25:09 localhost podman[198508]: 2025-11-23 09:25:09.045121762 +0000 UTC m=+0.098267469 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 23 04:25:09 localhost podman[198508]: 2025-11-23 09:25:09.114313889 +0000 UTC m=+0.167459596 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 04:25:09 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:25:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:25:09.235 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:25:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:25:09.235 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:25:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:25:09.237 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:25:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48974 DF PROTO=TCP SPT=35424 DPT=9100 SEQ=3493553969 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB71600000000001030307) Nov 23 04:25:09 localhost python3.9[198567]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:09 localhost python3.9[198682]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory 
recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:10 localhost python3.9[198792]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:11 localhost python3.9[198902]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:11 localhost python3.9[199012]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64472 DF PROTO=TCP SPT=52564 DPT=9882 SEQ=2627936758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2CB7BB70000000001030307) Nov 23 04:25:12 localhost python3.9[199122]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:25:13 localhost systemd[1]: tmp-crun.BUgRhf.mount: Deactivated successfully. Nov 23 04:25:13 localhost podman[199233]: 2025-11-23 09:25:13.022073822 +0000 UTC m=+0.081892956 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible) Nov 23 04:25:13 localhost podman[199233]: 2025-11-23 09:25:13.031322218 +0000 UTC m=+0.091141342 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 23 04:25:13 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:25:13 localhost python3.9[199232]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:13 localhost python3.9[199359]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:14 localhost python3.9[199469]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:14 localhost python3.9[199579]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64474 DF PROTO=TCP SPT=52564 DPT=9882 SEQ=2627936758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB87A00000000001030307) Nov 23 04:25:15 localhost python3.9[199689]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:15 localhost python3.9[199799]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:16 localhost python3.9[199909]: ansible-ansible.builtin.file Invoked with group=root mode=0755 
owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59772 DF PROTO=TCP SPT=42568 DPT=9102 SEQ=694892868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CB94200000000001030307) Nov 23 04:25:18 localhost python3.9[200019]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:19 localhost python3.9[200107]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889918.3684776-2307-174594629548743/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:20 localhost python3.9[200217]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:21 localhost python3.9[200305]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root 
mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889919.5529187-2307-91126729994897/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:21 localhost python3.9[200415]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:22 localhost python3.9[200503]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889921.321161-2307-96917649562328/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:22 localhost python3.9[200613]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43889 DF PROTO=TCP SPT=35166 DPT=9102 SEQ=1640571255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBA8E00000000001030307) Nov 23 04:25:23 localhost python3.9[200701]: ansible-ansible.legacy.copy Invoked with 
dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889922.4447055-2307-86066353550641/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:24 localhost python3.9[200811]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:24 localhost python3.9[200899]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889923.5928783-2307-53087487242397/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:25 localhost python3.9[201009]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:25 localhost python3.9[201097]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889924.782969-2307-79215386343755/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 
checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:26 localhost python3.9[201207]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:26 localhost python3.9[201295]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889925.9307976-2307-23444864604633/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64476 DF PROTO=TCP SPT=52564 DPT=9882 SEQ=2627936758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBB8200000000001030307) Nov 23 04:25:27 localhost python3.9[201405]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:28 localhost python3.9[201493]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1763889927.045695-2307-154630365563603/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:28 localhost python3.9[201603]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:29 localhost python3.9[201691]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889928.3427062-2307-201849953653328/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23741 DF PROTO=TCP SPT=46510 DPT=9101 SEQ=215047230 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBC1BA0000000001030307) Nov 23 04:25:29 localhost python3.9[201801]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59945 DF PROTO=TCP SPT=41060 DPT=9105 SEQ=469432071 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBC26A0000000001030307) Nov 23 04:25:30 localhost python3.9[201889]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889929.4993634-2307-270798856805065/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:31 localhost python3.9[201999]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:32 localhost python3.9[202087]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889930.6268172-2307-92035410105332/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:32 localhost python3.9[202197]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:32 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23743 DF PROTO=TCP SPT=46510 DPT=9101 SEQ=215047230 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBCDE00000000001030307) Nov 23 04:25:33 localhost python3.9[202285]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889932.2339375-2307-142074041105601/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:34 localhost python3.9[202395]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:34 localhost python3.9[202483]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889933.9871204-2307-172036676834250/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:35 localhost python3.9[202593]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True 
get_selinux_context=False Nov 23 04:25:36 localhost python3.9[202681]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889935.0426493-2307-17196693756386/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61664 DF PROTO=TCP SPT=44452 DPT=9100 SEQ=3680421593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBDA210000000001030307) Nov 23 04:25:36 localhost python3.9[202881]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:25:37 localhost python3.9[203028]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False Nov 23 04:25:38 localhost python3.9[203156]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:25:38 localhost systemd[1]: Reloading. Nov 23 04:25:38 localhost systemd-sysv-generator[203183]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:25:38 localhost systemd-rc-local-generator[203179]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:25:39 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:39 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:39 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:39 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:25:39 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:39 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:39 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:39 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:25:39 localhost systemd[1]: Starting libvirt logging daemon socket... Nov 23 04:25:39 localhost systemd[1]: Listening on libvirt logging daemon socket. Nov 23 04:25:39 localhost systemd[1]: Starting libvirt logging daemon admin socket... 
Nov 23 04:25:39 localhost systemd[1]: Listening on libvirt logging daemon admin socket. Nov 23 04:25:39 localhost systemd[1]: Starting libvirt logging daemon... Nov 23 04:25:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17159 DF PROTO=TCP SPT=40974 DPT=9100 SEQ=2781183349 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBE6A00000000001030307) Nov 23 04:25:39 localhost podman[203194]: 2025-11-23 09:25:39.29378761 +0000 UTC m=+0.101478542 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:25:39 localhost 
systemd[1]: Started libvirt logging daemon. Nov 23 04:25:39 localhost podman[203194]: 2025-11-23 09:25:39.401304514 +0000 UTC m=+0.208995456 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 23 04:25:39 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:25:40 localhost python3.9[203333]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:25:40 localhost systemd[1]: Reloading. 
Nov 23 04:25:40 localhost systemd-sysv-generator[203357]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:25:40 localhost systemd-rc-local-generator[203354]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:25:40 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:40 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:40 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:40 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:25:40 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:40 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:40 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:40 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:40 localhost systemd[1]: Starting libvirt nodedev daemon socket... Nov 23 04:25:40 localhost systemd[1]: Listening on libvirt nodedev daemon socket. Nov 23 04:25:40 localhost systemd[1]: Starting libvirt nodedev daemon admin socket... 
Nov 23 04:25:40 localhost systemd[1]: Starting libvirt nodedev daemon read-only socket... Nov 23 04:25:40 localhost systemd[1]: Listening on libvirt nodedev daemon admin socket. Nov 23 04:25:40 localhost systemd[1]: Listening on libvirt nodedev daemon read-only socket. Nov 23 04:25:40 localhost systemd[1]: Started libvirt nodedev daemon. Nov 23 04:25:41 localhost python3.9[203508]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:25:41 localhost systemd[1]: Reloading. Nov 23 04:25:41 localhost systemd-rc-local-generator[203534]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:25:41 localhost systemd-sysv-generator[203539]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:25:41 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:41 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:41 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:41 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:25:41 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:41 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:41 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:41 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:41 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Nov 23 04:25:41 localhost systemd[1]: Starting libvirt proxy daemon socket... Nov 23 04:25:41 localhost systemd[1]: Listening on libvirt proxy daemon socket. Nov 23 04:25:41 localhost systemd[1]: Starting libvirt proxy daemon admin socket... Nov 23 04:25:41 localhost systemd[1]: Starting libvirt proxy daemon read-only socket... Nov 23 04:25:41 localhost systemd[1]: Listening on libvirt proxy daemon admin socket. Nov 23 04:25:41 localhost systemd[1]: Listening on libvirt proxy daemon read-only socket. Nov 23 04:25:41 localhost systemd[1]: Started libvirt proxy daemon. Nov 23 04:25:41 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Nov 23 04:25:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17078 DF PROTO=TCP SPT=48576 DPT=9882 SEQ=27232340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBF0E60000000001030307) Nov 23 04:25:41 localhost setroubleshoot[203546]: Deleting alert a701375f-2314-41e7-b9ff-3bc5dfd0e157, it is allowed in current policy Nov 23 04:25:42 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service. 
Nov 23 04:25:42 localhost python3.9[203687]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:25:42 localhost systemd[1]: Reloading. Nov 23 04:25:42 localhost systemd-rc-local-generator[203711]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:25:42 localhost systemd-sysv-generator[203717]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:25:42 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:42 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:42 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:42 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:25:42 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:42 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:42 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:42 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:42 localhost systemd[1]: Listening on libvirt locking daemon socket. Nov 23 04:25:42 localhost systemd[1]: Starting libvirt QEMU daemon socket... Nov 23 04:25:42 localhost systemd[1]: Listening on libvirt QEMU daemon socket. Nov 23 04:25:42 localhost systemd[1]: Starting libvirt QEMU daemon admin socket... Nov 23 04:25:42 localhost systemd[1]: Starting libvirt QEMU daemon read-only socket... Nov 23 04:25:42 localhost systemd[1]: Listening on libvirt QEMU daemon admin socket. Nov 23 04:25:42 localhost systemd[1]: Listening on libvirt QEMU daemon read-only socket. Nov 23 04:25:42 localhost systemd[1]: Started libvirt QEMU daemon. Nov 23 04:25:42 localhost setroubleshoot[203546]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 14acc193-8e99-4b7b-bc7e-04795d26bb69 Nov 23 04:25:42 localhost setroubleshoot[203546]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012***** Plugin dac_override (91.4 confidence) suggests **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. 
Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012***** Plugin catchall (9.59 confidence) suggests **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012 Nov 23 04:25:42 localhost setroubleshoot[203546]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 14acc193-8e99-4b7b-bc7e-04795d26bb69 Nov 23 04:25:42 localhost setroubleshoot[203546]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012***** Plugin dac_override (91.4 confidence) suggests **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. 
Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012***** Plugin catchall (9.59 confidence) suggests **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012 Nov 23 04:25:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:25:43 localhost systemd[1]: tmp-crun.HITXMo.mount: Deactivated successfully. Nov 23 04:25:43 localhost podman[203854]: 2025-11-23 09:25:43.156842342 +0000 UTC m=+0.079263001 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 04:25:43 localhost podman[203854]: 2025-11-23 09:25:43.166308104 +0000 UTC m=+0.088728753 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 04:25:43 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:25:43 localhost python3.9[203879]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:25:43 localhost systemd[1]: Reloading. Nov 23 04:25:43 localhost systemd-rc-local-generator[203919]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:25:43 localhost systemd-sysv-generator[203924]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:25:43 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:43 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:43 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:43 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:25:43 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:43 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:43 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:43 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:25:43 localhost systemd[1]: Starting libvirt secret daemon socket... Nov 23 04:25:43 localhost systemd[1]: Listening on libvirt secret daemon socket. Nov 23 04:25:43 localhost systemd[1]: Starting libvirt secret daemon admin socket... Nov 23 04:25:43 localhost systemd[1]: Starting libvirt secret daemon read-only socket... Nov 23 04:25:43 localhost systemd[1]: Listening on libvirt secret daemon admin socket. Nov 23 04:25:43 localhost systemd[1]: Listening on libvirt secret daemon read-only socket. Nov 23 04:25:43 localhost systemd[1]: Started libvirt secret daemon. 
Nov 23 04:25:44 localhost python3.9[204073]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17080 DF PROTO=TCP SPT=48576 DPT=9882 SEQ=27232340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CBFCE10000000001030307) Nov 23 04:25:45 localhost python3.9[204183]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 23 04:25:46 localhost python3.9[204293]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:25:47 localhost python3.9[204405]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None 
Nov 23 04:25:48 localhost python3.9[204513]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43891 DF PROTO=TCP SPT=35166 DPT=9102 SEQ=1640571255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC0A210000000001030307) Nov 23 04:25:48 localhost python3.9[204599]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889947.6084235-3171-134549237374095/.source.xml follow=False _original_basename=secret.xml.j2 checksum=08854374a51612ae60ccb5be5d56c7ff5bc71f08 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:49 localhost python3.9[204709]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 46550e70-79cb-5f55-bf6d-1204b97e083b#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:25:50 localhost python3.9[204829]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:52 localhost python3.9[205166]: ansible-ansible.legacy.copy 
Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:52 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully. Nov 23 04:25:52 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. Nov 23 04:25:53 localhost python3.9[205276]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43572 DF PROTO=TCP SPT=43778 DPT=9102 SEQ=609621929 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC1DE00000000001030307) Nov 23 04:25:53 localhost python3.9[205364]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889952.9015393-3336-138180641096976/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:54 localhost python3.9[205474]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:55 localhost python3.9[205584]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:55 localhost python3.9[205641]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:56 localhost python3.9[205751]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17082 DF PROTO=TCP SPT=48576 DPT=9882 SEQ=27232340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC2C210000000001030307) Nov 23 04:25:57 localhost python3.9[205808]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=._3f6l7jn recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:57 localhost python3.9[205918]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:25:58 localhost python3.9[205975]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:25:58 localhost python3.9[206085]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:25:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3152 DF PROTO=TCP SPT=45864 DPT=9101 SEQ=2081057054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC36E90000000001030307) Nov 23 04:25:59 localhost python3[206196]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Nov 23 04:25:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62158 DF PROTO=TCP SPT=33372 DPT=9105 SEQ=1694433680 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC379B0000000001030307) Nov 23 04:26:00 localhost python3.9[206306]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:26:01 localhost python3.9[206363]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:26:01 localhost python3.9[206473]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:26:02 localhost python3.9[206530]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:26:02 localhost python3.9[206640]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:26:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 
PREC=0x00 TTL=62 ID=62160 DF PROTO=TCP SPT=33372 DPT=9105 SEQ=1694433680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC43A10000000001030307) Nov 23 04:26:03 localhost python3.9[206697]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:26:04 localhost python3.9[206807]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:26:05 localhost python3.9[206864]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:26:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48977 DF PROTO=TCP SPT=35424 DPT=9100 SEQ=3493553969 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC50210000000001030307) Nov 23 04:26:06 localhost python3.9[206974]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:26:07 
localhost python3.9[207064]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763889965.6177292-3711-208855973769798/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:26:07 localhost python3.9[207174]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:26:08 localhost python3.9[207284]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:26:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:26:09.235 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:26:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:26:09.236 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:26:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:26:09.238 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:26:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59666 DF PROTO=TCP SPT=55348 DPT=9100 SEQ=4199510865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC5BE00000000001030307) Nov 23 04:26:09 localhost python3.9[207397]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:26:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. 
Nov 23 04:26:10 localhost podman[207472]: 2025-11-23 09:26:10.040268997 +0000 UTC m=+0.092951941 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:26:10 localhost podman[207472]: 2025-11-23 09:26:10.105620608 +0000 UTC m=+0.158303582 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true) Nov 23 04:26:10 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:26:10 localhost python3.9[207525]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:26:11 localhost python3.9[207643]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:26:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44119 DF PROTO=TCP SPT=41746 DPT=9882 SEQ=3913687544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC66170000000001030307) Nov 23 04:26:11 localhost python3.9[207755]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:26:12 localhost python3.9[207868]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:26:13 localhost python3.9[207978]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True 
get_selinux_context=False Nov 23 04:26:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:26:13 localhost podman[208067]: 2025-11-23 09:26:13.657618691 +0000 UTC m=+0.063867016 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 04:26:13 localhost podman[208067]: 2025-11-23 09:26:13.666244207 +0000 UTC m=+0.072492552 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:26:13 
localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:26:13 localhost python3.9[208066]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889972.8976505-3928-100005418120253/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:26:14 localhost python3.9[208194]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:26:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44121 DF PROTO=TCP SPT=41746 DPT=9882 SEQ=3913687544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC72200000000001030307) Nov 23 04:26:15 localhost python3.9[208282]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889974.023101-3972-166839050151418/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:26:15 localhost python3.9[208392]: ansible-ansible.legacy.stat Invoked with 
path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:26:16 localhost python3.9[208480]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763889975.1724162-4017-39600808219720/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:26:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43574 DF PROTO=TCP SPT=43778 DPT=9102 SEQ=609621929 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC7E200000000001030307) Nov 23 04:26:18 localhost python3.9[208590]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:26:18 localhost systemd[1]: Reloading. Nov 23 04:26:18 localhost systemd-rc-local-generator[208617]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:26:18 localhost systemd-sysv-generator[208620]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:26:18 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:18 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:18 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:18 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:26:18 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:18 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:18 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:18 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:18 localhost systemd[1]: Reached target edpm_libvirt.target. Nov 23 04:26:19 localhost python3.9[208740]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None Nov 23 04:26:19 localhost systemd[1]: Reloading. Nov 23 04:26:20 localhost systemd-rc-local-generator[208764]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:26:20 localhost systemd-sysv-generator[208768]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: Reloading. Nov 23 04:26:20 localhost systemd-rc-local-generator[208802]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:26:20 localhost systemd-sysv-generator[208808]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:20 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:21 localhost systemd[1]: session-52.scope: Deactivated successfully. Nov 23 04:26:21 localhost systemd[1]: session-52.scope: Consumed 3min 36.058s CPU time. Nov 23 04:26:21 localhost systemd-logind[761]: Session 52 logged out. Waiting for processes to exit. Nov 23 04:26:21 localhost systemd-logind[761]: Removed session 52. 
Nov 23 04:26:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17754 DF PROTO=TCP SPT=57284 DPT=9102 SEQ=62392723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CC93200000000001030307) Nov 23 04:26:26 localhost sshd[208831]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:26:26 localhost systemd-logind[761]: New session 53 of user zuul. Nov 23 04:26:26 localhost systemd[1]: Started Session 53 of User zuul. Nov 23 04:26:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44123 DF PROTO=TCP SPT=41746 DPT=9882 SEQ=3913687544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCA2200000000001030307) Nov 23 04:26:27 localhost python3.9[208942]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:26:28 localhost python3.9[209054]: ansible-ansible.builtin.service_facts Invoked Nov 23 04:26:29 localhost network[209071]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 23 04:26:29 localhost network[209072]: 'network-scripts' will be removed from distribution in near future. Nov 23 04:26:29 localhost network[209073]: It is advised to switch to 'NetworkManager' instead for network management. 
Nov 23 04:26:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13885 DF PROTO=TCP SPT=35964 DPT=9101 SEQ=104125501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCAC1A0000000001030307) Nov 23 04:26:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3582 DF PROTO=TCP SPT=53272 DPT=9105 SEQ=437809189 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCACCB0000000001030307) Nov 23 04:26:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:26:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13887 DF PROTO=TCP SPT=35964 DPT=9101 SEQ=104125501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCB8200000000001030307) Nov 23 04:26:34 localhost python3.9[209305]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 04:26:35 localhost python3.9[209368]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 
23 04:26:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17162 DF PROTO=TCP SPT=40974 DPT=9100 SEQ=2781183349 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCC4200000000001030307) Nov 23 04:26:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53967 DF PROTO=TCP SPT=34390 DPT=9100 SEQ=2157483921 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCD1200000000001030307) Nov 23 04:26:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:26:41 localhost podman[209457]: 2025-11-23 09:26:41.03201156 +0000 UTC m=+0.085671467 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Nov 23 04:26:41 localhost podman[209457]: 2025-11-23 09:26:41.087486568 +0000 UTC m=+0.141146505 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:26:41 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:26:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5687 DF PROTO=TCP SPT=56114 DPT=9882 SEQ=4269180401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCDB470000000001030307) Nov 23 04:26:43 localhost python3.9[209592]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:26:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:26:44 localhost podman[209666]: 2025-11-23 09:26:44.023619477 +0000 UTC m=+0.075471703 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 23 04:26:44 localhost podman[209666]: 2025-11-23 09:26:44.052079483 +0000 UTC m=+0.103931679 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2) Nov 23 04:26:44 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:26:44 localhost sshd[209723]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:26:44 localhost python3.9[209722]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:26:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5689 DF PROTO=TCP SPT=56114 DPT=9882 SEQ=4269180401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCE7600000000001030307) Nov 23 04:26:45 localhost python3.9[209833]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None 
executable=None creates=None removes=None stdin=None Nov 23 04:26:46 localhost python3.9[209944]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:26:47 localhost python3.9[210055]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:26:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17756 DF PROTO=TCP SPT=57284 DPT=9102 SEQ=62392723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CCF4200000000001030307) Nov 23 04:26:48 localhost python3.9[210166]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:26:49 localhost python3.9[210278]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:26:50 localhost python3.9[210388]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False 
force=None masked=None Nov 23 04:26:51 localhost systemd[1]: Listening on Open-iSCSI iscsid Socket. Nov 23 04:26:51 localhost python3.9[210502]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:26:51 localhost systemd[1]: Reloading. Nov 23 04:26:51 localhost systemd-rc-local-generator[210533]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:26:51 localhost systemd-sysv-generator[210536]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:26:52 localhost systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi). Nov 23 04:26:52 localhost systemd[1]: Starting Open-iSCSI... Nov 23 04:26:52 localhost iscsid[210544]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi Nov 23 04:26:52 localhost iscsid[210544]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier]. Nov 23 04:26:52 localhost iscsid[210544]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6. Nov 23 04:26:52 localhost iscsid[210544]: If using hardware iscsi like qla4xxx this message can be ignored.
Nov 23 04:26:52 localhost iscsid[210544]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Nov 23 04:26:52 localhost iscsid[210544]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Nov 23 04:26:52 localhost iscsid[210544]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf Nov 23 04:26:52 localhost systemd[1]: Started Open-iSCSI. Nov 23 04:26:52 localhost systemd[1]: Starting Logout of all iSCSI sessions on shutdown... Nov 23 04:26:52 localhost systemd[1]: Finished Logout of all iSCSI sessions on shutdown. Nov 23 04:26:53 localhost python3.9[210655]: ansible-ansible.builtin.service_facts Invoked Nov 23 04:26:53 localhost network[210672]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 23 04:26:53 localhost network[210673]: 'network-scripts' will be removed from distribution in near future. Nov 23 04:26:53 localhost network[210674]: It is advised to switch to 'NetworkManager' instead for network management. Nov 23 04:26:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42760 DF PROTO=TCP SPT=55756 DPT=9102 SEQ=2764835931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD08600000000001030307) Nov 23 04:26:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:26:54 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Nov 23 04:26:55 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Nov 23 04:26:55 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service.
Nov 23 04:26:56 localhost setroubleshoot[210726]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 22d5a650-a1e0-4691-a0c3-b98d96afcbd6 Nov 23 04:26:56 localhost setroubleshoot[210726]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012 Nov 23 04:26:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5691 DF PROTO=TCP SPT=56114 DPT=9882 SEQ=4269180401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD18200000000001030307) Nov 23 04:26:59 localhost python3.9[210923]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Nov 23 04:26:59 localhost kernel:
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12056 DF PROTO=TCP SPT=47370 DPT=9101 SEQ=2871435611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD21490000000001030307) Nov 23 04:26:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30764 DF PROTO=TCP SPT=35804 DPT=9105 SEQ=677915327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD21FB0000000001030307) Nov 23 04:27:00 localhost python3.9[211033]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Nov 23 04:27:00 localhost python3.9[211147]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:27:01 localhost python3.9[211235]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890020.4672198-456-48508680459087/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:02 localhost python3.9[211345]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Nov 23 04:27:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12058 DF PROTO=TCP SPT=47370 DPT=9101 SEQ=2871435611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD2D600000000001030307) Nov 23 04:27:03 localhost python3.9[211455]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:27:03 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Nov 23 04:27:03 localhost systemd[1]: Stopped Load Kernel Modules. Nov 23 04:27:03 localhost systemd[1]: Stopping Load Kernel Modules... Nov 23 04:27:03 localhost systemd[1]: Starting Load Kernel Modules... Nov 23 04:27:03 localhost systemd-modules-load[211459]: Module 'msr' is built in Nov 23 04:27:03 localhost systemd[1]: Finished Load Kernel Modules. 
Nov 23 04:27:04 localhost python3.9[211569]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:27:05 localhost python3.9[211679]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:27:05 localhost python3.9[211789]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:27:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59669 DF PROTO=TCP SPT=55348 DPT=9100 SEQ=4199510865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD3A200000000001030307) Nov 23 04:27:06 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Deactivated successfully. Nov 23 04:27:06 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Consumed 1.004s CPU time. Nov 23 04:27:06 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. 
Nov 23 04:27:06 localhost python3.9[211899]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:27:07 localhost python3.9[211987]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890026.1158402-630-233069566537776/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:07 localhost python3.9[212097]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:27:08 localhost python3.9[212208]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:27:09.236 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:27:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:27:09.237 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" 
acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:27:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:27:09.238 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:27:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57724 DF PROTO=TCP SPT=55228 DPT=9100 SEQ=2277522031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD46200000000001030307) Nov 23 04:27:09 localhost python3.9[212318]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:10 localhost python3.9[212428]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:27:11 localhost systemd[1]: tmp-crun.G0QVP0.mount: Deactivated successfully. 
Nov 23 04:27:11 localhost podman[212539]: 2025-11-23 09:27:11.548076989 +0000 UTC m=+0.081987061 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller) Nov 23 04:27:11 localhost podman[212539]: 2025-11-23 09:27:11.584397514 +0000 UTC m=+0.118307646 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true) Nov 23 04:27:11 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:27:11 localhost python3.9[212538]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21638 DF PROTO=TCP SPT=39518 DPT=9882 SEQ=4167220684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD50770000000001030307) Nov 23 04:27:12 localhost python3.9[212673]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:13 localhost python3.9[212783]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:13 localhost python3.9[212893]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None 
insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:27:14 localhost podman[213004]: 2025-11-23 09:27:14.614761928 +0000 UTC m=+0.081252739 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Nov 23 04:27:14 localhost podman[213004]: 2025-11-23 09:27:14.64929891 +0000 UTC m=+0.115789751 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 04:27:14 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:27:14 localhost python3.9[213003]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:27:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21640 DF PROTO=TCP SPT=39518 DPT=9882 SEQ=4167220684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD5C600000000001030307) Nov 23 04:27:15 localhost python3.9[213133]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:16 localhost python3.9[213243]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:27:17 localhost python3.9[213353]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:27:17 
localhost python3.9[213410]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:27:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42762 DF PROTO=TCP SPT=55756 DPT=9102 SEQ=2764835931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD68200000000001030307) Nov 23 04:27:18 localhost python3.9[213520]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:27:18 localhost python3.9[213577]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:27:19 localhost python3.9[213687]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None 
src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:20 localhost python3.9[213797]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:27:20 localhost python3.9[213854]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:21 localhost python3.9[213964]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:27:22 localhost python3.9[214021]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:22 localhost python3.9[214131]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started 
daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:27:22 localhost systemd[1]: Reloading. Nov 23 04:27:22 localhost systemd-rc-local-generator[214153]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:27:22 localhost systemd-sysv-generator[214160]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:27:22 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:22 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:22 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:22 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:27:22 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:22 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:22 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:22 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:23 localhost systemd[1]: Starting dnf makecache... Nov 23 04:27:23 localhost dnf[214169]: Updating Subscription Management repositories. 
Nov 23 04:27:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5839 DF PROTO=TCP SPT=55886 DPT=9102 SEQ=2200684048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD7DA10000000001030307) Nov 23 04:27:24 localhost python3.9[214280]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:27:25 localhost dnf[214169]: Metadata cache refreshed recently. Nov 23 04:27:25 localhost systemd[1]: dnf-makecache.service: Deactivated successfully. Nov 23 04:27:25 localhost systemd[1]: Finished dnf makecache. Nov 23 04:27:25 localhost systemd[1]: dnf-makecache.service: Consumed 1.990s CPU time. Nov 23 04:27:25 localhost python3.9[214337]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:25 localhost python3.9[214447]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:27:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21642 DF PROTO=TCP SPT=39518 DPT=9882 SEQ=4167220684 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD8C210000000001030307) Nov 23 04:27:27 localhost python3.9[214504]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:28 localhost python3.9[214614]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:27:28 localhost systemd[1]: Reloading. Nov 23 04:27:28 localhost systemd-rc-local-generator[214641]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:27:28 localhost systemd-sysv-generator[214645]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:27:28 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:28 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:28 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:28 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:27:28 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:28 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:28 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:28 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:28 localhost systemd[1]: Starting Create netns directory... Nov 23 04:27:28 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 23 04:27:28 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 23 04:27:28 localhost systemd[1]: Finished Create netns directory. 
Nov 23 04:27:29 localhost python3.9[214766]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:27:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33997 DF PROTO=TCP SPT=38656 DPT=9101 SEQ=2531834913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD96790000000001030307) Nov 23 04:27:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7485 DF PROTO=TCP SPT=35368 DPT=9105 SEQ=1242860296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CD972B0000000001030307) Nov 23 04:27:30 localhost python3.9[214876]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:27:30 localhost python3.9[214964]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890049.7123399-1251-87371241721467/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 23 04:27:31 
localhost python3.9[215074]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:27:32 localhost python3.9[215184]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:27:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33999 DF PROTO=TCP SPT=38656 DPT=9101 SEQ=2531834913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CDA2A00000000001030307) Nov 23 04:27:33 localhost python3.9[215272]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890051.9704373-1326-157418401051098/.source.json _original_basename=.vusu7ets follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:33 localhost python3.9[215382]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:36 localhost python3.9[215690]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False Nov 23 04:27:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53970 DF PROTO=TCP SPT=34390 DPT=9100 SEQ=2157483921 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CDB0200000000001030307) Nov 23 04:27:37 localhost python3.9[215800]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 23 04:27:38 localhost python3.9[215910]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 23 04:27:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33817 DF PROTO=TCP SPT=50194 DPT=9100 SEQ=1007322871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CDBB600000000001030307) Nov 23 04:27:40 localhost podman[216065]: 2025-11-23 09:27:40.506164507 +0000 UTC m=+0.095234557 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , distribution-scope=public, ceph=True, io.openshift.expose-services=, name=rhceph, io.buildah.version=1.33.12, 
io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_BRANCH=main) Nov 23 04:27:40 localhost systemd[1]: virtnodedevd.service: Deactivated successfully. Nov 23 04:27:40 localhost podman[216065]: 2025-11-23 09:27:40.621338548 +0000 UTC m=+0.210408588 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, version=7, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main) Nov 23 
04:27:41 localhost systemd[1]: virtproxyd.service: Deactivated successfully. Nov 23 04:27:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:27:41 localhost podman[216197]: 2025-11-23 09:27:41.820166251 +0000 UTC m=+0.095788964 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 23 04:27:41 localhost podman[216197]: 2025-11-23 09:27:41.862032353 +0000 UTC m=+0.137655125 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, 
tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:27:41 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:27:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10891 DF PROTO=TCP SPT=55340 DPT=9882 SEQ=773630041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CDC5A70000000001030307) Nov 23 04:27:42 localhost python3[216339]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Nov 23 04:27:44 localhost podman[216352]: 2025-11-23 09:27:43.134101664 +0000 UTC m=+0.048400747 image pull quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Nov 23 04:27:44 localhost podman[216401]: Nov 23 04:27:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:27:44 localhost podman[216401]: 2025-11-23 09:27:44.946131531 +0000 UTC m=+0.087433193 container create 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 23 04:27:44 localhost podman[216401]: 2025-11-23 09:27:44.904692953 +0000 UTC m=+0.045994675 image pull quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Nov 23 04:27:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10893 DF PROTO=TCP SPT=55340 DPT=9882 SEQ=773630041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CDD1A00000000001030307) Nov 23 04:27:44 localhost python3[216339]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Nov 23 04:27:45 localhost podman[216414]: 2025-11-23 09:27:45.049383517 +0000 UTC m=+0.095268658 container health_status 
9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 04:27:45 localhost podman[216414]: 2025-11-23 09:27:45.086267069 +0000 UTC m=+0.132152230 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Nov 23 04:27:45 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:27:45 localhost python3.9[216564]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:27:46 localhost python3.9[216676]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:47 localhost python3.9[216731]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:27:48 localhost python3.9[216840]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890067.4072886-1590-251256976916081/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5841 DF PROTO=TCP SPT=55886 DPT=9102 SEQ=2200684048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CDDE200000000001030307) Nov 23 04:27:48 localhost python3.9[216895]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 04:27:48 localhost 
systemd[1]: Reloading. Nov 23 04:27:48 localhost systemd-sysv-generator[216923]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:27:48 localhost systemd-rc-local-generator[216918]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:27:48 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:48 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:48 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:48 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:27:48 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:48 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:48 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:48 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:49 localhost python3.9[216986]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:27:49 localhost systemd[1]: Reloading. Nov 23 04:27:49 localhost systemd-sysv-generator[217018]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:27:49 localhost systemd-rc-local-generator[217013]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:27:49 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:49 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:49 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:49 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Nov 23 04:27:49 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:49 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:49 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:49 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:27:50 localhost systemd[1]: Starting multipathd container... Nov 23 04:27:50 localhost systemd[1]: Started libcrun container. Nov 23 04:27:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95e99bc6c4c18bf287eec2a5ea60475a16f0726f33709c1ae70c62b1ba99720b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Nov 23 04:27:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95e99bc6c4c18bf287eec2a5ea60475a16f0726f33709c1ae70c62b1ba99720b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 23 04:27:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. 
Nov 23 04:27:50 localhost podman[217027]: 2025-11-23 09:27:50.19473036 +0000 UTC m=+0.164751265 container init 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 04:27:50 localhost multipathd[217040]: + sudo -E kolla_set_configs Nov 23 04:27:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:27:50 localhost podman[217027]: 2025-11-23 09:27:50.229534169 +0000 UTC m=+0.199555024 container start 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 23 04:27:50 localhost podman[217027]: multipathd Nov 23 04:27:50 localhost systemd[1]: Started multipathd container. 
Nov 23 04:27:50 localhost multipathd[217040]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 23 04:27:50 localhost multipathd[217040]: INFO:__main__:Validating config file Nov 23 04:27:50 localhost multipathd[217040]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 23 04:27:50 localhost multipathd[217040]: INFO:__main__:Writing out command to execute Nov 23 04:27:50 localhost multipathd[217040]: ++ cat /run_command Nov 23 04:27:50 localhost multipathd[217040]: + CMD='/usr/sbin/multipathd -d' Nov 23 04:27:50 localhost multipathd[217040]: + ARGS= Nov 23 04:27:50 localhost multipathd[217040]: + sudo kolla_copy_cacerts Nov 23 04:27:50 localhost multipathd[217040]: + [[ ! -n '' ]] Nov 23 04:27:50 localhost multipathd[217040]: + . kolla_extend_start Nov 23 04:27:50 localhost multipathd[217040]: Running command: '/usr/sbin/multipathd -d' Nov 23 04:27:50 localhost multipathd[217040]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\''' Nov 23 04:27:50 localhost multipathd[217040]: + umask 0022 Nov 23 04:27:50 localhost multipathd[217040]: + exec /usr/sbin/multipathd -d Nov 23 04:27:50 localhost multipathd[217040]: 10013.504916 | --------start up-------- Nov 23 04:27:50 localhost multipathd[217040]: 10013.504933 | read /etc/multipath.conf Nov 23 04:27:50 localhost podman[217048]: 2025-11-23 09:27:50.327120625 +0000 UTC m=+0.091617869 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 04:27:50 localhost multipathd[217040]: 10013.508976 | path checkers start up Nov 23 04:27:50 localhost podman[217048]: 2025-11-23 09:27:50.346206636 +0000 UTC m=+0.110703900 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 23 04:27:50 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:27:51 localhost systemd[1]: virtsecretd.service: Deactivated successfully. 
Nov 23 04:27:52 localhost python3.9[217187]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:27:53 localhost python3.9[217299]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:27:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9612 DF PROTO=TCP SPT=42272 DPT=9102 SEQ=4110621087 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CDF2A00000000001030307) Nov 23 04:27:54 localhost python3.9[217422]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:27:54 localhost systemd[1]: Stopping multipathd container... Nov 23 04:27:54 localhost multipathd[217040]: 10017.578704 | exit (signal) Nov 23 04:27:54 localhost multipathd[217040]: 10017.579219 | --------shut down------- Nov 23 04:27:54 localhost systemd[1]: libpod-072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.scope: Deactivated successfully. 
Nov 23 04:27:54 localhost podman[217426]: 2025-11-23 09:27:54.43208744 +0000 UTC m=+0.097408261 container died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:27:54 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.timer: Deactivated successfully. 
Nov 23 04:27:54 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:27:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8-userdata-shm.mount: Deactivated successfully. Nov 23 04:27:54 localhost systemd[1]: var-lib-containers-storage-overlay-95e99bc6c4c18bf287eec2a5ea60475a16f0726f33709c1ae70c62b1ba99720b-merged.mount: Deactivated successfully. Nov 23 04:27:54 localhost podman[217426]: 2025-11-23 09:27:54.637755546 +0000 UTC m=+0.303076297 container cleanup 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 04:27:54 localhost podman[217426]: multipathd Nov 23 04:27:54 localhost podman[217452]: 2025-11-23 09:27:54.734049363 +0000 UTC m=+0.064421596 container cleanup 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 23 04:27:54 localhost podman[217452]: multipathd Nov 23 04:27:54 localhost systemd[1]: edpm_multipathd.service: Deactivated successfully. Nov 23 04:27:54 localhost systemd[1]: Stopped multipathd container. Nov 23 04:27:54 localhost systemd[1]: Starting multipathd container... Nov 23 04:27:54 localhost systemd[1]: Started libcrun container. Nov 23 04:27:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95e99bc6c4c18bf287eec2a5ea60475a16f0726f33709c1ae70c62b1ba99720b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Nov 23 04:27:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95e99bc6c4c18bf287eec2a5ea60475a16f0726f33709c1ae70c62b1ba99720b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 23 04:27:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. 
Nov 23 04:27:54 localhost podman[217464]: 2025-11-23 09:27:54.891633392 +0000 UTC m=+0.130277644 container init 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 04:27:54 localhost multipathd[217479]: + sudo -E kolla_set_configs Nov 23 04:27:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:27:54 localhost podman[217464]: 2025-11-23 09:27:54.926963348 +0000 UTC m=+0.165607550 container start 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd) Nov 23 04:27:54 localhost podman[217464]: multipathd Nov 23 04:27:54 localhost systemd[1]: Started multipathd container. 
Nov 23 04:27:54 localhost multipathd[217479]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 23 04:27:54 localhost multipathd[217479]: INFO:__main__:Validating config file Nov 23 04:27:54 localhost multipathd[217479]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 23 04:27:54 localhost multipathd[217479]: INFO:__main__:Writing out command to execute Nov 23 04:27:54 localhost multipathd[217479]: ++ cat /run_command Nov 23 04:27:54 localhost multipathd[217479]: + CMD='/usr/sbin/multipathd -d' Nov 23 04:27:54 localhost multipathd[217479]: + ARGS= Nov 23 04:27:54 localhost multipathd[217479]: + sudo kolla_copy_cacerts Nov 23 04:27:55 localhost multipathd[217479]: + [[ ! -n '' ]] Nov 23 04:27:55 localhost multipathd[217479]: + . kolla_extend_start Nov 23 04:27:55 localhost multipathd[217479]: Running command: '/usr/sbin/multipathd -d' Nov 23 04:27:55 localhost multipathd[217479]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\''' Nov 23 04:27:55 localhost multipathd[217479]: + umask 0022 Nov 23 04:27:55 localhost multipathd[217479]: + exec /usr/sbin/multipathd -d Nov 23 04:27:55 localhost podman[217487]: 2025-11-23 09:27:55.017665038 +0000 UTC m=+0.083833016 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 23 04:27:55 localhost multipathd[217479]: 10018.196342 | --------start up-------- Nov 23 04:27:55 localhost multipathd[217479]: 10018.196359 | read /etc/multipath.conf Nov 23 04:27:55 localhost multipathd[217479]: 10018.200030 | path checkers start up Nov 23 04:27:55 localhost podman[217487]: 2025-11-23 09:27:55.028577944 +0000 UTC m=+0.094745922 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible) Nov 23 04:27:55 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:27:56 localhost python3.9[217627]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10895 DF PROTO=TCP SPT=55340 DPT=9882 SEQ=773630041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE02200000000001030307) Nov 23 04:27:57 localhost python3.9[217737]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Nov 23 04:27:58 localhost python3.9[217847]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Nov 23 04:27:58 localhost python3.9[217965]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:27:59 localhost python3.9[218053]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890078.4519227-1830-235798883936037/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff 
backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:27:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2307 DF PROTO=TCP SPT=33386 DPT=9101 SEQ=2251829978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE0BAA0000000001030307) Nov 23 04:27:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30949 DF PROTO=TCP SPT=58386 DPT=9105 SEQ=1378184660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE0C5D0000000001030307) Nov 23 04:28:00 localhost python3.9[218163]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:01 localhost python3.9[218273]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:28:01 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Nov 23 04:28:01 localhost systemd[1]: Stopped Load Kernel Modules. Nov 23 04:28:01 localhost systemd[1]: Stopping Load Kernel Modules... Nov 23 04:28:01 localhost systemd[1]: Starting Load Kernel Modules... 
Nov 23 04:28:01 localhost systemd-modules-load[218277]: Module 'msr' is built in Nov 23 04:28:01 localhost systemd[1]: Finished Load Kernel Modules. Nov 23 04:28:02 localhost python3.9[218387]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 23 04:28:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30951 DF PROTO=TCP SPT=58386 DPT=9105 SEQ=1378184660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE18600000000001030307) Nov 23 04:28:04 localhost sshd[218390]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:28:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57727 DF PROTO=TCP SPT=55228 DPT=9100 SEQ=2277522031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE24200000000001030307) Nov 23 04:28:06 localhost systemd[1]: Reloading. Nov 23 04:28:06 localhost systemd-rc-local-generator[218423]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:28:06 localhost systemd-sysv-generator[218427]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: Reloading. Nov 23 04:28:06 localhost systemd-rc-local-generator[218459]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:28:06 localhost systemd-sysv-generator[218464]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:06 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:07 localhost systemd-logind[761]: Watching system buttons on /dev/input/event0 (Power Button) Nov 23 04:28:07 localhost systemd-logind[761]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard) Nov 23 04:28:07 localhost lvm[218510]: PV /dev/loop4 online, VG ceph_vg1 is complete. Nov 23 04:28:07 localhost lvm[218510]: VG ceph_vg1 finished Nov 23 04:28:07 localhost lvm[218509]: PV /dev/loop3 online, VG ceph_vg0 is complete. Nov 23 04:28:07 localhost lvm[218509]: VG ceph_vg0 finished Nov 23 04:28:07 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. 
Nov 23 04:28:07 localhost systemd[1]: Starting man-db-cache-update.service... Nov 23 04:28:07 localhost systemd[1]: Reloading. Nov 23 04:28:07 localhost systemd-sysv-generator[218567]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:28:07 localhost systemd-rc-local-generator[218562]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:28:07 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:07 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:07 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:07 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:28:07 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:07 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:07 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:07 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:07 localhost systemd[1]: Queuing reload/restart jobs for marked units… Nov 23 04:28:08 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 23 04:28:08 localhost systemd[1]: Finished man-db-cache-update.service. Nov 23 04:28:08 localhost systemd[1]: man-db-cache-update.service: Consumed 1.270s CPU time. Nov 23 04:28:08 localhost systemd[1]: run-r03827389933a4623bedccab4fa957ae7.service: Deactivated successfully. 
Nov 23 04:28:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:28:09.237 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:28:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:28:09.238 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:28:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:28:09.240 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:28:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43973 DF PROTO=TCP SPT=50680 DPT=9100 SEQ=2399587717 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE30A00000000001030307) Nov 23 04:28:09 localhost python3.9[219806]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:28:10 localhost python3.9[219920]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:11 localhost 
python3.9[220030]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 04:28:11 localhost systemd[1]: Reloading. Nov 23 04:28:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42499 DF PROTO=TCP SPT=43148 DPT=9882 SEQ=2531647507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE3AD60000000001030307) Nov 23 04:28:11 localhost systemd-sysv-generator[220061]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:28:11 localhost systemd-rc-local-generator[220053]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:28:11 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:11 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:11 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:11 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:28:12 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:12 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:12 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:12 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:28:12 localhost podman[220067]: 2025-11-23 09:28:12.246977305 +0000 UTC m=+0.083405560 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 04:28:12 localhost podman[220067]: 2025-11-23 09:28:12.284267357 +0000 UTC m=+0.120695652 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 04:28:12 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:28:13 localhost python3.9[220199]: ansible-ansible.builtin.service_facts Invoked Nov 23 04:28:13 localhost network[220216]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 23 04:28:13 localhost network[220217]: 'network-scripts' will be removed from distribution in near future. Nov 23 04:28:13 localhost network[220218]: It is advised to switch to 'NetworkManager' instead for network management. Nov 23 04:28:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42501 DF PROTO=TCP SPT=43148 DPT=9882 SEQ=2531647507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE46E00000000001030307) Nov 23 04:28:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:28:15 localhost podman[220246]: 2025-11-23 09:28:15.213862017 +0000 UTC m=+0.079571384 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 04:28:15 localhost podman[220246]: 2025-11-23 09:28:15.222477025 +0000 UTC m=+0.088186402 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:28:15 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:28:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:28:18 localhost python3.9[220471]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:28:18 localhost python3.9[220582]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:28:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42502 DF PROTO=TCP SPT=43148 DPT=9882 SEQ=2531647507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE56A00000000001030307) Nov 23 04:28:19 localhost python3.9[220693]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:28:20 localhost python3.9[220804]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:28:22 localhost python3.9[220915]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:28:22 localhost python3.9[221026]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:28:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16865 DF PROTO=TCP SPT=51904 DPT=9102 SEQ=718674353 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE67E00000000001030307) Nov 23 04:28:23 localhost python3.9[221137]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:28:24 localhost python3.9[221248]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:28:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:28:26 localhost podman[221267]: 2025-11-23 09:28:26.026126209 +0000 UTC m=+0.082504464 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:28:26 localhost podman[221267]: 2025-11-23 09:28:26.037759631 +0000 UTC m=+0.094137896 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 04:28:26 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:28:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42503 DF PROTO=TCP SPT=43148 DPT=9882 SEQ=2531647507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE76200000000001030307) Nov 23 04:28:28 localhost python3.9[221377]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:28 localhost python3.9[221487]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None 
Nov 23 04:28:29 localhost python3.9[221597]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3102 DF PROTO=TCP SPT=47166 DPT=9101 SEQ=769331823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE80DA0000000001030307) Nov 23 04:28:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37647 DF PROTO=TCP SPT=59014 DPT=9105 SEQ=901764965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE818B0000000001030307) Nov 23 04:28:30 localhost python3.9[221707]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:30 localhost python3.9[221817]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None 
mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:31 localhost python3.9[221927]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:32 localhost python3.9[222037]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:32 localhost python3.9[222147]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3104 DF PROTO=TCP SPT=47166 DPT=9101 SEQ=769331823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE8CE00000000001030307) Nov 23 04:28:33 localhost python3.9[222257]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service 
state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:33 localhost python3.9[222367]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:34 localhost python3.9[222477]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:35 localhost python3.9[222587]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:35 localhost python3.9[222697]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:36 localhost python3.9[222807]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33820 DF PROTO=TCP SPT=50194 DPT=9100 SEQ=1007322871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CE9A200000000001030307) Nov 23 04:28:36 localhost python3.9[222917]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:28:37 localhost python3.9[223027]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 
04:28:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20598 DF PROTO=TCP SPT=45392 DPT=9100 SEQ=4141493920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEA5E00000000001030307) Nov 23 04:28:40 localhost python3.9[223137]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:28:41 localhost python3.9[223247]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 23 04:28:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19232 DF PROTO=TCP SPT=43124 DPT=9882 SEQ=1417323512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEB0070000000001030307) Nov 23 04:28:42 localhost python3.9[223357]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 04:28:42 localhost systemd[1]: Reloading. Nov 23 04:28:42 localhost systemd-rc-local-generator[223378]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 23 04:28:42 localhost systemd-sysv-generator[223385]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:28:42 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:42 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:42 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:42 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:28:42 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:42 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:42 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:42 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:28:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:28:42 localhost systemd[1]: tmp-crun.6kHDsX.mount: Deactivated successfully. 
Nov 23 04:28:42 localhost podman[223410]: 2025-11-23 09:28:42.504335812 +0000 UTC m=+0.092688307 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:28:42 localhost podman[223410]: 2025-11-23 09:28:42.547330112 +0000 UTC m=+0.135682657 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 23 04:28:42 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:28:43 localhost python3.9[223582]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:28:43 localhost python3.9[223707]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:28:44 localhost python3.9[223836]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:28:44 localhost python3.9[223947]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:28:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3106 DF PROTO=TCP SPT=47166 DPT=9101 SEQ=769331823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEBC200000000001030307) Nov 23 04:28:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:28:46 localhost podman[223949]: 2025-11-23 09:28:46.033121225 +0000 UTC m=+0.087739279 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:28:46 localhost podman[223949]: 2025-11-23 09:28:46.063023403 +0000 UTC 
m=+0.117641477 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 23 04:28:46 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:28:46 localhost python3.9[224076]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:28:47 localhost python3.9[224187]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:28:48 localhost python3.9[224298]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:28:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16867 DF PROTO=TCP SPT=51904 DPT=9102 SEQ=718674353 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEC8200000000001030307) Nov 23 04:28:48 localhost python3.9[224409]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:28:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40351 DF PROTO=TCP SPT=40044 DPT=9102 SEQ=4073162484 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2CEDD200000000001030307) Nov 23 04:28:53 localhost python3.9[224520]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:28:54 localhost python3.9[224630]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:28:54 localhost python3.9[224740]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:28:55 localhost python3.9[224850]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:28:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:28:56 localhost systemd[1]: tmp-crun.RYZ1rj.mount: Deactivated successfully. Nov 23 04:28:56 localhost podman[224961]: 2025-11-23 09:28:56.222210299 +0000 UTC m=+0.099078972 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 
23 04:28:56 localhost podman[224961]: 2025-11-23 09:28:56.233390459 +0000 UTC m=+0.110259112 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3) Nov 23 04:28:56 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:28:56 localhost python3.9[224960]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:28:56 localhost python3.9[225087]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:28:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19236 DF PROTO=TCP SPT=43124 DPT=9882 SEQ=1417323512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEEC200000000001030307) Nov 23 04:28:57 localhost python3.9[225197]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:28:58 localhost python3.9[225307]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 23 04:28:58 localhost python3.9[225417]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 23 04:28:59 localhost python3.9[225527]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 23 04:28:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38985 DF PROTO=TCP SPT=50238 DPT=9101 SEQ=1719451716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEF60A0000000001030307) Nov 23 04:28:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49821 DF PROTO=TCP SPT=34514 DPT=9105 SEQ=2692827760 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CEF6BB0000000001030307) Nov 23 04:29:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38987 DF PROTO=TCP SPT=50238 DPT=9101 SEQ=1719451716 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF02200000000001030307) Nov 23 04:29:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43976 DF PROTO=TCP SPT=50680 DPT=9100 SEQ=2399587717 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF0E200000000001030307) Nov 23 04:29:07 localhost python3.9[225637]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Nov 23 04:29:08 localhost python3.9[225748]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Nov 23 04:29:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40928 DF PROTO=TCP SPT=47280 DPT=9100 SEQ=2376654661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF1AE00000000001030307) Nov 23 04:29:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:29:09.239 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:29:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:29:09.239 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:29:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:29:09.241 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:29:10 localhost python3.9[225864]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532585.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Nov 23 04:29:11 localhost sshd[225890]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:29:11 localhost systemd-logind[761]: New session 54 of user zuul. Nov 23 04:29:11 localhost systemd[1]: Started Session 54 of User zuul. Nov 23 04:29:11 localhost systemd[1]: session-54.scope: Deactivated successfully. Nov 23 04:29:11 localhost systemd-logind[761]: Session 54 logged out. Waiting for processes to exit. Nov 23 04:29:11 localhost systemd-logind[761]: Removed session 54. 
Nov 23 04:29:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49350 DF PROTO=TCP SPT=36458 DPT=9882 SEQ=3389845660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF25360000000001030307) Nov 23 04:29:12 localhost python3.9[226001]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:29:12 localhost python3.9[226087]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890151.75823-3389-149144960786998/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:29:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. 
Nov 23 04:29:13 localhost podman[226124]: 2025-11-23 09:29:13.043121017 +0000 UTC m=+0.098440831 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Nov 23 04:29:13 localhost podman[226124]: 2025-11-23 09:29:13.125669444 +0000 UTC m=+0.180989308 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, 
config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller) Nov 23 04:29:13 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:29:13 localhost python3.9[226220]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:29:13 localhost python3.9[226275]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:29:14 localhost python3.9[226383]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:29:14 localhost python3.9[226469]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890153.9351525-3389-205843061096348/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:29:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49352 DF PROTO=TCP SPT=36458 DPT=9882 SEQ=3389845660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF31210000000001030307) Nov 23 04:29:15 localhost 
python3.9[226577]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:29:15 localhost python3.9[226663]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890154.9653418-3389-60953597794694/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=c48862f04c3bb6bb101bc9efe68e434d3f83ed7a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:29:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:29:17 localhost python3.9[226771]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:29:17 localhost systemd[1]: tmp-crun.doBBTs.mount: Deactivated successfully. 
Nov 23 04:29:17 localhost podman[226772]: 2025-11-23 09:29:17.028359478 +0000 UTC m=+0.083400114 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 23 04:29:17 localhost podman[226772]: 2025-11-23 09:29:17.063246668 +0000 UTC 
m=+0.118287304 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 04:29:17 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:29:17 localhost python3.9[226874]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890156.596046-3389-236023126748895/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:29:18 localhost python3.9[226982]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:29:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40353 DF PROTO=TCP SPT=40044 DPT=9102 SEQ=4073162484 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF3E200000000001030307) Nov 23 04:29:18 localhost python3.9[227068]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890157.6420708-3389-241080197897321/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:29:20 localhost python3.9[227178]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:29:21 localhost python3.9[227288]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:29:22 localhost python3.9[227398]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:29:22 localhost python3.9[227510]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:29:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16594 DF PROTO=TCP SPT=60978 DPT=9102 SEQ=430266586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF52610000000001030307) Nov 23 04:29:23 localhost python3.9[227618]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:29:24 
localhost python3.9[227728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:29:24 localhost python3.9[227814]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890163.9385962-3765-74460258997354/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:29:25 localhost python3.9[227922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:29:26 localhost python3.9[228008]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890165.1541164-3810-251059714091482/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:29:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:29:26 localhost systemd[1]: tmp-crun.nNphTj.mount: Deactivated successfully. 
Nov 23 04:29:26 localhost podman[228119]: 2025-11-23 09:29:26.981468123 +0000 UTC m=+0.098671727 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:29:26 localhost podman[228119]: 2025-11-23 09:29:26.991327453 +0000 UTC m=+0.108531097 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 23 04:29:27 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:29:27 localhost python3.9[228118]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False Nov 23 04:29:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49354 DF PROTO=TCP SPT=36458 DPT=9882 SEQ=3389845660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF62200000000001030307) Nov 23 04:29:27 localhost python3.9[228248]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 23 04:29:28 localhost python3[228358]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False Nov 23 04:29:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16982 DF PROTO=TCP SPT=47048 DPT=9101 SEQ=15123556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF6B3A0000000001030307) Nov 23 04:29:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40984 DF PROTO=TCP SPT=38618 DPT=9105 SEQ=695840909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF6BEB0000000001030307) Nov 23 04:29:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16984 DF PROTO=TCP SPT=47048 DPT=9101 SEQ=15123556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF77610000000001030307) Nov 23 04:29:36 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20601 DF PROTO=TCP SPT=45392 DPT=9100 SEQ=4141493920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF84200000000001030307) Nov 23 04:29:39 localhost podman[228372]: 2025-11-23 09:29:28.9287112 +0000 UTC m=+0.036686795 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Nov 23 04:29:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31310 DF PROTO=TCP SPT=51714 DPT=9100 SEQ=3062298622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF90200000000001030307) Nov 23 04:29:39 localhost podman[228435]: Nov 23 04:29:39 localhost podman[228435]: 2025-11-23 09:29:39.322349078 +0000 UTC m=+0.073234107 container create 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', 
'/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=nova_compute_init, org.label-schema.vendor=CentOS) Nov 23 04:29:39 localhost podman[228435]: 2025-11-23 09:29:39.281137865 +0000 UTC m=+0.032022944 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Nov 23 04:29:39 localhost python3[228358]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t 
nova_compute_init Nov 23 04:29:40 localhost python3.9[228582]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:29:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9376 DF PROTO=TCP SPT=52692 DPT=9882 SEQ=3079844263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CF9A7C0000000001030307) Nov 23 04:29:42 localhost python3.9[228694]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False Nov 23 04:29:42 localhost python3.9[228804]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 23 04:29:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. 
Nov 23 04:29:43 localhost podman[228915]: 2025-11-23 09:29:43.636698085 +0000 UTC m=+0.090518801 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:29:43 localhost podman[228915]: 2025-11-23 09:29:43.706806795 +0000 UTC m=+0.160627461 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller) Nov 23 04:29:43 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:29:43 localhost python3[228914]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False Nov 23 04:29:44 localhost python3[228914]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66",#012 "Digest": "sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-21T06:33:31.011385583Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1211770748,#012 "VirtualSize": 1211770748,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",#012 "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",#012 "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",#012 "sha256:b9b598b1eb0c08906fe1bc9a64fc0e72719a6197d83669d2eb4309e69a00aa62",#012 "sha256:33e3811ab7487b27336fdf94252d5a875b17efb438cbc4ffc943f851ad3eceb6"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-11-18T01:56:49.795434035Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:49.795512415Z",#012 "created_by": 
"/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251118\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:52.547242013Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-21T06:10:01.947310748Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947327778Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947358359Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947372589Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94738527Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94739397Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:02.324930938Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:36.349393468Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 
'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Nov 23 04:29:44 localhost podman[229021]: 2025-11-23 09:29:44.17644785 +0000 UTC m=+0.089623474 container remove e42bf66df001117ddbe2a665cb386dd8f8457a11498878e8e6635cc6af4dd2ce (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '67452ffc3d9e727585009ffc9989a224-39370c45b6a27bfda1ebe1fb9d328c43'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Nov 23 04:29:44 localhost python3[228914]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute Nov 23 04:29:44 localhost podman[229037]: Nov 23 04:29:44 localhost podman[229037]: 2025-11-23 
09:29:44.28444477 +0000 UTC m=+0.086201439 container create 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:29:44 localhost podman[229037]: 2025-11-23 09:29:44.245290101 +0000 UTC m=+0.047046800 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Nov 23 04:29:44 localhost python3[228914]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label 
container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start Nov 23 04:29:45 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9378 DF PROTO=TCP SPT=52692 DPT=9882 SEQ=3079844263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFA6A00000000001030307)
Nov 23 04:29:45 localhost python3.9[229215]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 04:29:45 localhost python3.9[229345]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:29:46 localhost python3.9[229454]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890186.3920426-4085-269267773013087/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:29:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 04:29:47 localhost podman[229510]: 2025-11-23 09:29:47.293794669 +0000 UTC m=+0.085881710 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Nov 23 04:29:47 localhost podman[229510]: 2025-11-23 09:29:47.327286936 +0000 UTC 
m=+0.119373957 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 04:29:47 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:29:47 localhost python3.9[229509]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 04:29:47 localhost systemd[1]: Reloading.
Nov 23 04:29:47 localhost systemd-sysv-generator[229551]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 04:29:47 localhost systemd-rc-local-generator[229547]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 04:29:47 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:47 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:47 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:47 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 04:29:47 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:47 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:47 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:47 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16596 DF PROTO=TCP SPT=60978 DPT=9102 SEQ=430266586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFB2200000000001030307)
Nov 23 04:29:49 localhost python3.9[229672]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 04:29:49 localhost systemd[1]: Reloading.
Nov 23 04:29:49 localhost systemd-rc-local-generator[229698]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 04:29:49 localhost systemd-sysv-generator[229703]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 04:29:49 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:49 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:49 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:49 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 04:29:49 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:49 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:49 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:49 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:29:49 localhost systemd[1]: Starting nova_compute container...
Nov 23 04:29:49 localhost systemd[1]: Started libcrun container.
Nov 23 04:29:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 23 04:29:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 23 04:29:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 23 04:29:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 04:29:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 04:29:49 localhost podman[229713]: 2025-11-23 09:29:49.544245946 +0000 UTC m=+0.123088010 container init 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z',
'/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible) Nov 23 04:29:49 localhost podman[229713]: 2025-11-23 09:29:49.555489887 +0000 UTC m=+0.134331941 container start 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 04:29:49 localhost podman[229713]: nova_compute
Nov 23 04:29:49 localhost nova_compute[229727]: + sudo -E kolla_set_configs
Nov 23 04:29:49 localhost systemd[1]: Started nova_compute container.
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Validating config file
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Copying service configuration files
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Deleting /etc/ceph
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Creating directory /etc/ceph
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /etc/ceph
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Writing out command to execute
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 23 04:29:49 localhost nova_compute[229727]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 23 04:29:49 localhost nova_compute[229727]: ++ cat /run_command
Nov 23 04:29:49 localhost nova_compute[229727]: + CMD=nova-compute
Nov 23 04:29:49 localhost nova_compute[229727]: + ARGS=
Nov 23 04:29:49 localhost nova_compute[229727]: + sudo kolla_copy_cacerts
Nov 23 04:29:49 localhost nova_compute[229727]: + [[ ! -n '' ]]
Nov 23 04:29:49 localhost nova_compute[229727]: + . kolla_extend_start
Nov 23 04:29:49 localhost nova_compute[229727]: Running command: 'nova-compute'
Nov 23 04:29:49 localhost nova_compute[229727]: + echo 'Running command: '\''nova-compute'\'''
Nov 23 04:29:49 localhost nova_compute[229727]: + umask 0022
Nov 23 04:29:49 localhost nova_compute[229727]: + exec nova-compute
Nov 23 04:29:50 localhost python3.9[229847]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 04:29:51 localhost nova_compute[229727]: 2025-11-23 09:29:51.344 229731 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 23 04:29:51 localhost nova_compute[229727]: 2025-11-23 09:29:51.344 229731 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 23 04:29:51 localhost nova_compute[229727]: 2025-11-23 09:29:51.344 229731 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 23 04:29:51 localhost nova_compute[229727]: 2025-11-23 09:29:51.345 229731 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 23 04:29:51 localhost nova_compute[229727]: 2025-11-23 09:29:51.459 229731 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 04:29:51 localhost nova_compute[229727]: 2025-11-23 09:29:51.479 229731 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 04:29:51 localhost nova_compute[229727]: 2025-11-23 09:29:51.480 229731 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 23 04:29:51 localhost python3.9[229957]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 04:29:51 localhost nova_compute[229727]: 2025-11-23 09:29:51.898 229731 INFO nova.virt.driver [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.023 229731 INFO nova.compute.provider_config [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.076 229731 WARNING nova.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.076 229731 DEBUG oslo_concurrency.lockutils [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.077 229731 DEBUG oslo_concurrency.lockutils [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Acquired lock
"singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.077 229731 DEBUG oslo_concurrency.lockutils [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.078 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.078 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.078 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.078 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.079 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.079 229731 DEBUG oslo_service.service [None 
req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.079 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.079 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.080 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.080 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.080 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.080 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.080 229731 DEBUG oslo_service.service [None 
req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.081 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.081 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.081 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.081 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.081 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.082 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.082 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - 
- - - - -] console_host = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.082 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.082 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.083 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.083 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.083 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.083 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.083 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] default_ephemeral_format = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.084 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.084 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.084 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.084 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.084 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 
04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.085 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.085 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.085 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.085 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.086 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.086 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.086 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] host = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.086 229731 DEBUG 
oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.086 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.087 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.087 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.087 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.087 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.088 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 
09:29:52.088 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.088 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.088 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.088 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.089 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.089 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.089 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.090 229731 DEBUG 
oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.090 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.090 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.090 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.090 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.091 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.091 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.091 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_rotate_interval_type = 
days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.091 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.091 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.092 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.092 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.092 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.092 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] logging_user_identity_format = %(user)s 
%(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.092 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.092 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.093 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.093 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.093 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.093 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.093 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] max_logfile_size_mb = 20 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.094 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.094 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.094 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.094 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.094 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.095 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.095 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] my_block_storage_ip = 192.168.122.107 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.095 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] my_ip = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.095 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.095 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.095 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.096 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.096 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.096 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] osapi_compute_workers = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.096 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.096 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.097 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.097 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.097 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.097 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.097 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 
localhost nova_compute[229727]: 2025-11-23 09:29:52.098 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.098 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.098 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.098 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.098 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.098 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.099 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.099 229731 DEBUG oslo_service.service [None 
req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.099 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.099 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.099 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.100 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.100 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.100 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.100 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] resize_confirm_window = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.101 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.101 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.101 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.101 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.102 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.102 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.102 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.102 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.102 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.102 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.103 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.103 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.103 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.103 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 
04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.103 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.104 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.104 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.104 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.104 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.104 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.104 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.105 229731 DEBUG oslo_service.service 
[None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.105 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.105 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.105 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.105 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.106 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.106 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.106 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] use_rootwrap_daemon = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.106 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.106 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.106 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.107 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.107 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.107 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.107 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 
2025-11-23 09:29:52.107 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.108 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.108 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.108 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.108 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.108 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.109 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost 
nova_compute[229727]: 2025-11-23 09:29:52.109 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.109 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.109 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.109 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.110 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.110 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.110 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.enable_instance_password = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.110 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.110 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.111 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.111 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.111 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.111 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.111 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.max_limit = 1000 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.111 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.112 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.112 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.112 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.112 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.112 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.113 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.113 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.113 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.113 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.113 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.114 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.114 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.114 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.config_prefix = cache.oslo log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.114 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.114 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.115 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.115 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.115 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.115 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.115 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.115 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.116 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.116 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.116 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.116 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.116 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.117 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.117 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.117 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.117 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.117 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.118 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.118 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.118 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.118 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.118 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.119 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.119 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.119 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.119 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.119 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 
09:29:52.119 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.120 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.120 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.120 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.120 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.120 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.121 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.121 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.121 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.121 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.121 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.121 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.122 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.122 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.122 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.122 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.122 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.123 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.123 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.123 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.123 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.123 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.124 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.124 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.124 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.124 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.124 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.124 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.125 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.125 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.125 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.125 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.125 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.126 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.126 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.126 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.126 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.126 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.126 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.126 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.127 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.128 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.128 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.128 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.128 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.128 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.128 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.128 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.129 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.130 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.130 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.130 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.130 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.130 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.130 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.130 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.131 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.131 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.131 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.131 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.131 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.131 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.131 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.132 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.133 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.133 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.133 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.133 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.133 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.133 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.133 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.134 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.134 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.134 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.134 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.134 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.134 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.134 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.135 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.136 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.137 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.137 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.137 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.137 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.137 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.137 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.138 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.139 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.139 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.139 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.139 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.139 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.139 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.139 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.140 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.141 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.141 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.141 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.141 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.141 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -]
image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.141 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.142 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.142 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.142 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.142 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.142 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.142 229731 DEBUG oslo_service.service [None 
req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.142 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.insecure = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.143 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.144 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.144 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.144 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.144 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.144 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 
04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.144 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.144 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 
2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.145 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.146 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.146 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.146 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.146 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.146 
229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.146 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.146 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None 
req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.147 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.148 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.148 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.148 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.148 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] 
barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.148 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.148 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.148 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.149 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.149 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.149 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.149 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] 
barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.149 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.149 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.149 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.150 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.150 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.150 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.150 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.keyfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.150 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.150 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.150 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.151 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.151 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.151 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.151 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost 
nova_compute[229727]: 2025-11-23 09:29:52.151 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.151 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.151 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG 
oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.152 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.153 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.153 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.153 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.153 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] 
keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.153 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.153 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.153 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.connection_uri = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.154 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.155 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.155 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.155 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.155 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.155 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.155 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.155 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.156 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.156 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.156 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.156 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.156 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.156 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.156 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.157 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.157 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.157 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.157 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.157 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.157 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.157 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.158 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.159 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.159 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.159 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.159 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.159 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.159 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.159 229731 WARNING oslo_config.cfg [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 23 04:29:52 localhost nova_compute[229727]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 23 04:29:52 localhost nova_compute[229727]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 23 04:29:52 localhost nova_compute[229727]: and ``live_migration_inbound_addr`` respectively.
Nov 23 04:29:52 localhost nova_compute[229727]: ). Its value may be silently ignored in the future.
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.160 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.160 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.160 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.160 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.160 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.160 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.161 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.161 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.161 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.161 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.161 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.161 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.161 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.162 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.162 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.162 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.162 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.162 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.162 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rbd_secret_uuid = 46550e70-79cb-5f55-bf6d-1204b97e083b log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.162 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.163 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.163 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.163 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.163 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.163 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.163 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.163 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.164 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.164 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.164 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.164 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.164 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.164 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.164 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.165 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.165 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.165 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.165 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.165 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.165 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.165 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.166 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.167 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.167 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.167 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.167 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.167 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.167 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.167 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.168 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.169 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.169 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.169 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.169 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.169 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.169 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.169 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.170 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.171 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.171 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.171 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.171 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.171 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.171 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.171 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.172 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.173 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.173 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.173 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.173 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.173 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.173 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.173 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.password = 
**** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.174 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.service_type = placement log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.175 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.176 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.176 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.176 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.176 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.176 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.176 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.176 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.177 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.177 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.177 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.177 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.177 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.177 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.177 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.178 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.178 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.178 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 
09:29:52.178 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.178 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.178 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.178 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.179 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.179 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.179 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.179 229731 DEBUG oslo_service.service [None 
req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.179 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.179 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.179 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.180 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.180 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.180 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 
2025-11-23 09:29:52.180 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.180 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.180 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.180 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.181 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.182 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.182 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.182 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.182 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.182 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.182 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] 
filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.182 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG 
oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.183 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.184 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.184 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.184 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.184 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 
2025-11-23 09:29:52.184 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.184 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.185 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.185 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.185 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.185 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.185 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost 
nova_compute[229727]: 2025-11-23 09:29:52.185 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.186 
229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.186 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.187 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.187 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.187 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.187 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.187 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.187 229731 DEBUG oslo_service.service 
[None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.188 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.189 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.189 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.189 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.189 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.189 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.189 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.189 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.190 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.190 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.190 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.190 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.190 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.190 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.190 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.191 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.192 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.192 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.192 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.192 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.192 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.192 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.192 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.193 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.193 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.193 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.193 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.193 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.193 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.193 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.194 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.194 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.194 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.194 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.194 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.194 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.194 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.195 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.195 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.195 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.195 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.195 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.195 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.195 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.196 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.196 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.196 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.196 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.196 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.196 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.196 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.197 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.198 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.198 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.198 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.198 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.198 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.198 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.198 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.199 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.200 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.200 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.200 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.200 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.200 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.200 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.201 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.201 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.201 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.201 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.201 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.201 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.201 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.202 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.203 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.203 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.203 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.203 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.203 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.203 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.203 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.204 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.204 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.204 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.204 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.204 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.204 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.204 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.205 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.205 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.205 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.205 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.205 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.205 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.205 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.206 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.206 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.206 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.206 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.206 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.206 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.206 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_notifications.retry = -1
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.207 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] 
oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.208 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.208 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.208 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.208 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.208 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.208 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.208 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.domain_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.209 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.209 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.209 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.209 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.209 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.209 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.209 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.password = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.210 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.210 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.210 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.210 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.210 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.210 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.210 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.service_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.211 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.212 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.213 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.213 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.213 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.213 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.213 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - 
- - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.213 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.213 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.214 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.214 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.214 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.214 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.214 229731 DEBUG oslo_service.service [None 
req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.214 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.214 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.215 
229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.215 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG 
oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.216 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.217 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.217 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.217 229731 DEBUG 
oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.217 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.217 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.217 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.217 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.218 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.218 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.218 229731 DEBUG 
oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.218 229731 DEBUG oslo_service.service [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.219 229731 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.229 229731 INFO nova.virt.node [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Determined node identity dae70d62-10f4-474c-9782-8c926a3641d5 from /var/lib/nova/compute_id#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.229 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.230 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.230 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.230 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Connecting to libvirt: 
qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.240 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.242 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.243 229731 INFO nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Connection event '1' reason 'None'#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.253 229731 DEBUG nova.virt.libvirt.volume.mount [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.264 229731 INFO nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Libvirt host capabilities
[libvirt <capabilities> XML dump elided: the markup was stripped in capture, leaving only bare element values spread across empty continuation lines. Recoverable fields: host UUID 43895caf-e6c2-47af-84a5-6194e901da5c; arch x86_64; CPU model EPYC-Rome-v4, vendor AMD; migration transports tcp and rdma; memory 16116612 KiB with 4029153 pages free (remaining page counters 0); security models selinux (doi 0, base contexts system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (doi 0, +107:+107); guest os_type hvm with wordsize 32, emulator /usr/libexec/qemu-kvm; machine types pc-i440fx-rhel7.6.0 (alias pc), pc-q35-rhel9.8.0 (alias q35), pc-q35-rhel9.6.0, pc-q35-rhel8.6.0, pc-q35-rhel9.4.0, pc-q35-rhel8.5.0, pc-q35-rhel8.3.0, pc-q35-rhel7.6.0, pc-q35-rhel8.4.0, pc-q35-rhel9.2.0, pc-q35-rhel8.2.0, pc-q35-rhel9.0.0, pc-q35-rhel8.0.0, pc-q35-rhel8.1.0. The capabilities dump continues in the following journal entries.]
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: hvm Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: 64 Nov 23 04:29:52 localhost nova_compute[229727]: /usr/libexec/qemu-kvm Nov 23 04:29:52 localhost nova_compute[229727]: pc-i440fx-rhel7.6.0 Nov 23 04:29:52 localhost nova_compute[229727]: pc Nov 23 04:29:52 localhost nova_compute[229727]: pc-q35-rhel9.8.0 Nov 23 04:29:52 localhost nova_compute[229727]: q35 Nov 23 04:29:52 localhost nova_compute[229727]: pc-q35-rhel9.6.0 Nov 23 04:29:52 localhost nova_compute[229727]: pc-q35-rhel8.6.0 Nov 23 04:29:52 localhost nova_compute[229727]: pc-q35-rhel9.4.0 Nov 23 04:29:52 localhost nova_compute[229727]: pc-q35-rhel8.5.0 Nov 23 04:29:52 localhost nova_compute[229727]: pc-q35-rhel8.3.0 Nov 23 04:29:52 localhost nova_compute[229727]: pc-q35-rhel7.6.0 Nov 23 04:29:52 localhost nova_compute[229727]: pc-q35-rhel8.4.0 Nov 23 04:29:52 localhost nova_compute[229727]: pc-q35-rhel9.2.0 Nov 23 04:29:52 localhost nova_compute[229727]: pc-q35-rhel8.2.0 Nov 23 04:29:52 localhost nova_compute[229727]: pc-q35-rhel9.0.0 Nov 23 04:29:52 localhost nova_compute[229727]: pc-q35-rhel8.0.0 Nov 23 04:29:52 localhost nova_compute[229727]: pc-q35-rhel8.1.0 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 
localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: #033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.270 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.291 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: /usr/libexec/qemu-kvm Nov 23 04:29:52 localhost nova_compute[229727]: kvm Nov 23 04:29:52 localhost nova_compute[229727]: pc-i440fx-rhel7.6.0 Nov 23 04:29:52 localhost nova_compute[229727]: i686 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: /usr/share/OVMF/OVMF_CODE.secboot.fd Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: rom Nov 23 04:29:52 localhost nova_compute[229727]: 
pflash Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: yes Nov 23 04:29:52 localhost nova_compute[229727]: no Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: no Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: on Nov 23 04:29:52 localhost nova_compute[229727]: off Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: on Nov 23 04:29:52 localhost nova_compute[229727]: off Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome Nov 23 04:29:52 localhost nova_compute[229727]: AMD Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: 486 Nov 23 04:29:52 localhost nova_compute[229727]: 486-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell-IBRS Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell-noTSX Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell-noTSX-IBRS Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Broadwell-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell-v4 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cascadelake-Server Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cascadelake-Server-noTSX Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cascadelake-Server-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cascadelake-Server-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cascadelake-Server-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cascadelake-Server-v4 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cascadelake-Server-v5 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Conroe Nov 23 04:29:52 localhost nova_compute[229727]: Conroe-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Cooperlake Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cooperlake-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 
localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cooperlake-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Denverton Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Denverton-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Denverton-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Denverton-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Dhyana Nov 23 04:29:52 localhost nova_compute[229727]: Dhyana-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Dhyana-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Genoa Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: 
Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Genoa-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-IBPB Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Milan Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Milan-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Milan-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome-v4 Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-v1 Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-v2 Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-v4 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: GraniteRapids Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
Nov 23 04:29:52 localhost nova_compute[229727]: [garbled multi-line log entry: XML markup lost in capture; recoverable CPU model names listed below]
Nov 23 04:29:52 localhost nova_compute[229727]: supported CPU models: GraniteRapids-v1, GraniteRapids-v2, Haswell, Haswell-IBRS, Haswell-noTSX, Haswell-noTSX-IBRS, Haswell-v1, Haswell-v2, Haswell-v3, Haswell-v4, Icelake-Server, Icelake-Server-noTSX, Icelake-Server-v1, Icelake-Server-v2, Icelake-Server-v3, Icelake-Server-v4, Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5 … [entry truncated at end of capture]
23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Snowridge Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Snowridge-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Snowridge-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Snowridge-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Snowridge-v4 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Westmere Nov 23 04:29:52 localhost nova_compute[229727]: Westmere-IBRS Nov 23 04:29:52 localhost nova_compute[229727]: Westmere-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Westmere-v2 Nov 23 04:29:52 localhost nova_compute[229727]: athlon Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: athlon-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: core2duo Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: core2duo-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 
04:29:52 localhost nova_compute[229727]: coreduo Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: coreduo-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: kvm32 Nov 23 04:29:52 localhost nova_compute[229727]: kvm32-v1 Nov 23 04:29:52 localhost nova_compute[229727]: kvm64 Nov 23 04:29:52 localhost nova_compute[229727]: kvm64-v1 Nov 23 04:29:52 localhost nova_compute[229727]: n270 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: n270-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: pentium Nov 23 04:29:52 localhost nova_compute[229727]: pentium-v1 Nov 23 04:29:52 localhost nova_compute[229727]: pentium2 Nov 23 04:29:52 localhost nova_compute[229727]: pentium2-v1 Nov 23 04:29:52 localhost nova_compute[229727]: pentium3 Nov 23 04:29:52 localhost nova_compute[229727]: pentium3-v1 Nov 23 04:29:52 localhost nova_compute[229727]: phenom Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: phenom-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: qemu32 Nov 23 04:29:52 localhost nova_compute[229727]: qemu32-v1 Nov 23 04:29:52 localhost 
nova_compute[229727]: CPU models (cont.): qemu64 qemu64-v1
Nov 23 04:29:52 localhost nova_compute[229727]: memory backing source types: file anonymous memfd
Nov 23 04:29:52 localhost nova_compute[229727]: disk device types: disk cdrom floppy lun; buses: ide fdc scsi virtio usb sata; models: virtio virtio-transitional virtio-non-transitional
Nov 23 04:29:52 localhost nova_compute[229727]: graphics types: vnc egl-headless dbus
Nov 23 04:29:52 localhost nova_compute[229727]: hostdev mode: subsystem; startup policy: default mandatory requisite optional; subsystem types: usb pci scsi
Nov 23 04:29:52 localhost nova_compute[229727]: rng models: virtio virtio-transitional virtio-non-transitional; backend models: random egd builtin
Nov 23 04:29:52 localhost nova_compute[229727]: filesystem driver types: path handle virtiofs
Nov 23 04:29:52 localhost nova_compute[229727]: tpm models: tpm-tis tpm-crb; backends: emulator external; backend version: 2.0
Nov 23 04:29:52 localhost nova_compute[229727]: redirdev bus: usb; channel types: pty unix
Nov 23 04:29:52 localhost nova_compute[229727]: crypto: qemu builtin; interface backend types: default passt
Nov 23 04:29:52 localhost nova_compute[229727]: panic models: isa hyperv
Nov 23 04:29:52 localhost nova_compute[229727]: character device types: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Nov 23 04:29:52 localhost nova_compute[229727]: hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input; values: 4095 on off off Linux KVM Hv
Nov 23 04:29:52 localhost nova_compute[229727]: launch security: tdx
Nov 23 04:29:52 localhost nova_compute[229727]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.295 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 23 04:29:52 localhost nova_compute[229727]: emulator: /usr/libexec/qemu-kvm
nova_compute[229727]: domain type: kvm; machine: pc-q35-rhel9.8.0; arch: i686
Nov 23 04:29:52 localhost nova_compute[229727]: firmware loader: /usr/share/OVMF/OVMF_CODE.secboot.fd; loader types: rom pflash; readonly: yes no; secure: no
Nov 23 04:29:52 localhost nova_compute[229727]: toggle values (tags not captured): on off; on off
Nov 23 04:29:52 localhost nova_compute[229727]: host CPU model: EPYC-Rome; vendor: AMD
Nov 23 04:29:52 localhost nova_compute[229727]: CPU models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server
Nov 23 04:29:52 localhost python3.9[230067]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 04:29:52 localhost nova_compute[229727]: CPU models (cont.): Cascadelake-Server-noTSX Cascadelake-Server-v1 Cascadelake-Server-v2 Cascadelake-Server-v3 Cascadelake-Server-v4 Cascadelake-Server-v5 Conroe Conroe-v1 Cooperlake Cooperlake-v1 Cooperlake-v2 Denverton Denverton-v1 Denverton-v2 Denverton-v3 Dhyana Dhyana-v1 Dhyana-v2 EPYC EPYC-Genoa EPYC-Genoa-v1
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-IBPB Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Milan Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Milan-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Milan-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 
04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome-v4 Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-v1 Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-v2 Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-v4 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: GraniteRapids Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: GraniteRapids-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 
04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: GraniteRapids-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 
localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-IBRS Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-noTSX Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 
04:29:52 localhost nova_compute[229727]: Haswell-noTSX-IBRS Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-v4 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 
04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-noTSX Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: 
Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 
04:29:52 localhost nova_compute[229727]: Icelake-Server-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-v4 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: 
Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-v5 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-v6 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-v7 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: IvyBridge Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: IvyBridge-IBRS Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: IvyBridge-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: IvyBridge-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: KnightsMill Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: KnightsMill-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
Nov 23 04:29:52 localhost nova_compute[229727]: [libvirt domainCapabilities XML logged by nova; the XML markup was lost in capture and the empty repeats of the syslog prefix are collapsed here. Group labels are inferred from the libvirt domainCapabilities schema; the values themselves are verbatim, in document order.]
Nov 23 04:29:52 localhost nova_compute[229727]: CPU models: Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Nov 23 04:29:52 localhost nova_compute[229727]: memory backing source: file anonymous memfd
Nov 23 04:29:52 localhost nova_compute[229727]: disk device: disk cdrom floppy lun; bus: fdc scsi virtio usb sata; model: virtio virtio-transitional virtio-non-transitional
Nov 23 04:29:52 localhost nova_compute[229727]: graphics: vnc egl-headless dbus
Nov 23 04:29:52 localhost nova_compute[229727]: hostdev mode: subsystem; startupPolicy: default mandatory requisite optional; subsystem type: usb pci scsi
Nov 23 04:29:52 localhost nova_compute[229727]: rng model: virtio virtio-transitional virtio-non-transitional; backend: random egd builtin
Nov 23 04:29:52 localhost nova_compute[229727]: filesystem driver: path handle virtiofs
Nov 23 04:29:52 localhost nova_compute[229727]: tpm model: tpm-tis tpm-crb; backend: emulator external; version: 2.0
Nov 23 04:29:52 localhost nova_compute[229727]: redirdev bus: usb; serial/console: pty unix
Nov 23 04:29:52 localhost nova_compute[229727]: net backend: qemu builtin; interface backend: default passt
Nov 23 04:29:52 localhost nova_compute[229727]: panic model: isa hyperv
Nov 23 04:29:52 localhost nova_compute[229727]: char device type: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Nov 23 04:29:52 localhost nova_compute[229727]: hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input; further values as logged: 4095 on off off "Linux KVM Hv"
Nov 23 04:29:52 localhost nova_compute[229727]: tdx [end of stripped-XML dump] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.327 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.332 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: [XML markup stripped; recoverable values in document order] path /usr/libexec/qemu-kvm; domain kvm; machine pc-i440fx-rhel7.6.0; arch x86_64; os loader /usr/share/OVMF/OVMF_CODE.secboot.fd, type rom pflash, readonly yes no, secure no; cpu migratable on off; migratable on off; host-model: EPYC-Rome, vendor AMD [dump continues]
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: 486 Nov 23 04:29:52 localhost nova_compute[229727]: 486-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell-IBRS Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell-noTSX Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell-noTSX-IBRS Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 
localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Broadwell-v4 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cascadelake-Server Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cascadelake-Server-noTSX Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 
04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cascadelake-Server-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cascadelake-Server-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cascadelake-Server-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cascadelake-Server-v4 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cascadelake-Server-v5 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Conroe Nov 23 04:29:52 localhost nova_compute[229727]: Conroe-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Cooperlake Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cooperlake-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 
localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Cooperlake-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Denverton Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Denverton-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Denverton-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Denverton-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Dhyana Nov 23 04:29:52 localhost nova_compute[229727]: Dhyana-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Dhyana-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Genoa Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Genoa-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: 
Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-IBPB Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Milan Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Milan-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Milan-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-Rome-v4 Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-v1 Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-v2 Nov 23 
04:29:52 localhost nova_compute[229727]: EPYC-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: EPYC-v4 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: GraniteRapids Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 
04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: GraniteRapids-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 
localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: GraniteRapids-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: 
Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-IBRS Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-noTSX Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-noTSX-IBRS Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: 
Nov 23 04:29:52 localhost nova_compute[229727]: [libvirt domain capabilities dump; XML markup lost in log capture, recoverable values follow]
Nov 23 04:29:52 localhost nova_compute[229727]: CPU models: Haswell-v4 Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Nov 23 04:29:52 localhost nova_compute[229727]: memory backing source types: file anonymous memfd
Nov 23 04:29:52 localhost nova_compute[229727]: disk device types: disk cdrom floppy lun
Nov 23 04:29:52 localhost nova_compute[229727]: disk bus types: ide fdc scsi virtio usb sata
Nov 23 04:29:52 localhost nova_compute[229727]: virtio disk models: virtio virtio-transitional virtio-non-transitional
Nov 23 04:29:52 localhost nova_compute[229727]: graphics types: vnc egl-headless dbus
subsystem Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: default Nov 23 04:29:52 localhost nova_compute[229727]: mandatory Nov 23 04:29:52 localhost nova_compute[229727]: requisite Nov 23 04:29:52 localhost nova_compute[229727]: optional Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: usb Nov 23 04:29:52 localhost nova_compute[229727]: pci Nov 23 04:29:52 localhost nova_compute[229727]: scsi Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: virtio Nov 23 04:29:52 localhost nova_compute[229727]: virtio-transitional Nov 23 04:29:52 localhost nova_compute[229727]: virtio-non-transitional Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: random Nov 23 04:29:52 localhost nova_compute[229727]: egd Nov 23 04:29:52 localhost nova_compute[229727]: builtin Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: path Nov 23 04:29:52 localhost nova_compute[229727]: handle Nov 23 04:29:52 localhost nova_compute[229727]: virtiofs Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: tpm-tis Nov 23 04:29:52 localhost nova_compute[229727]: tpm-crb Nov 23 
04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: emulator Nov 23 04:29:52 localhost nova_compute[229727]: external Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: 2.0 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: usb Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: pty Nov 23 04:29:52 localhost nova_compute[229727]: unix Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: qemu Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: builtin Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: default Nov 23 04:29:52 localhost nova_compute[229727]: passt Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: isa Nov 23 04:29:52 localhost nova_compute[229727]: hyperv Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 
04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: null Nov 23 04:29:52 localhost nova_compute[229727]: vc Nov 23 04:29:52 localhost nova_compute[229727]: pty Nov 23 04:29:52 localhost nova_compute[229727]: dev Nov 23 04:29:52 localhost nova_compute[229727]: file Nov 23 04:29:52 localhost nova_compute[229727]: pipe Nov 23 04:29:52 localhost nova_compute[229727]: stdio Nov 23 04:29:52 localhost nova_compute[229727]: udp Nov 23 04:29:52 localhost nova_compute[229727]: tcp Nov 23 04:29:52 localhost nova_compute[229727]: unix Nov 23 04:29:52 localhost nova_compute[229727]: qemu-vdagent Nov 23 04:29:52 localhost nova_compute[229727]: dbus Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: relaxed Nov 23 04:29:52 localhost nova_compute[229727]: vapic Nov 23 04:29:52 localhost nova_compute[229727]: spinlocks Nov 23 04:29:52 localhost nova_compute[229727]: vpindex Nov 23 04:29:52 localhost nova_compute[229727]: runtime Nov 23 04:29:52 localhost nova_compute[229727]: synic Nov 23 04:29:52 localhost nova_compute[229727]: stimer Nov 23 04:29:52 localhost nova_compute[229727]: reset Nov 23 04:29:52 localhost nova_compute[229727]: vendor_id 
Nov 23 04:29:52 localhost nova_compute[229727]:   frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input
Nov 23 04:29:52 localhost nova_compute[229727]:   remaining values (labels lost): 4095, on, off, off, "Linux KVM Hv"; launch security: tdx
Nov 23 04:29:52 localhost nova_compute[229727]:   _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.392 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 23 04:29:52 localhost nova_compute[229727]: [libvirt domain capabilities XML; markup stripped in the journal, recoverable values summarized]
Nov 23 04:29:52 localhost nova_compute[229727]:   path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-q35-rhel9.8.0; arch: x86_64
Nov 23 04:29:52 localhost nova_compute[229727]:   os firmware: efi; firmware images: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd
Nov 23 04:29:52 localhost nova_compute[229727]:   loader types: rom, pflash; readonly: yes, no; secure: yes, no; on, off; on, off
Nov 23 04:29:52 localhost nova_compute[229727]:   host-model CPU: EPYC-Rome, vendor AMD
Nov 23 04:29:52 localhost nova_compute[229727]:   custom CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, GraniteRapids, GraniteRapids-v1,
04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: GraniteRapids-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 
localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-IBRS Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-noTSX Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-noTSX-IBRS Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-v3 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Haswell-v4 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-noTSX Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 
localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-v3 Nov 23 
04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-v4 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 
localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-v5 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-v6 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 
04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Icelake-Server-v7 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 
localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: IvyBridge Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: IvyBridge-IBRS Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: IvyBridge-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: IvyBridge-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: KnightsMill Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: KnightsMill-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 
localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nehalem Nov 23 04:29:52 localhost nova_compute[229727]: Nehalem-IBRS Nov 23 04:29:52 localhost nova_compute[229727]: Nehalem-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nehalem-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Opteron_G1 Nov 23 04:29:52 localhost nova_compute[229727]: Opteron_G1-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Opteron_G2 Nov 23 04:29:52 localhost nova_compute[229727]: Opteron_G2-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Opteron_G3 Nov 23 04:29:52 localhost nova_compute[229727]: Opteron_G3-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Opteron_G4 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Opteron_G4-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Opteron_G5 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Opteron_G5-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Penryn Nov 23 04:29:52 localhost nova_compute[229727]: Penryn-v1 Nov 23 
04:29:52 localhost nova_compute[229727]: SandyBridge Nov 23 04:29:52 localhost nova_compute[229727]: SandyBridge-IBRS Nov 23 04:29:52 localhost nova_compute[229727]: SandyBridge-v1 Nov 23 04:29:52 localhost nova_compute[229727]: SandyBridge-v2 Nov 23 04:29:52 localhost nova_compute[229727]: SapphireRapids Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 
04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: SapphireRapids-v1 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 
localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: SapphireRapids-v2 Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost 
Nov 23 04:29:52 localhost nova_compute[229727]: [libvirt domain capabilities XML mangled during log extraction; recoverable values only: CPU models SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client (plus -IBRS, -noTSX-IBRS, -v1 through -v4), Skylake-Server (plus -IBRS, -noTSX-IBRS, -v1 through -v5), Snowridge (plus -v1 through -v4), Westmere (plus -IBRS, -v1, -v2), athlon, core2duo, coreduo, kvm32, kvm64, n270, pentium, pentium2, pentium3, phenom, qemu32, qemu64; memory backing file/anonymous/memfd; disk devices disk/cdrom/floppy/lun on buses fdc/scsi/virtio/usb/sata with models virtio/virtio-transitional/virtio-non-transitional; graphics vnc/egl-headless/dbus; hostdev subsystem types usb/pci/scsi with modes default/mandatory/requisite/optional; rng backends random/egd/builtin; filesystem drivers path/handle/virtiofs; TPM models tpm-tis/tpm-crb, backends emulator/external, version 2.0; redirdev usb; channel pty/unix; crypto backends qemu/builtin; interface backends default/passt; panic models isa/hyperv; serial/console types null/vc/pty/dev/file/pipe/stdio/udp/tcp/unix/qemu-vdagent/dbus; hyperv features relaxed, vapic, spinlocks (retries 4095), vpindex, runtime, synic, stimer, reset, vendor_id ("Linux KVM Hv"), frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; launch security tdx] Nov 23
04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: Nov 23 04:29:52 localhost nova_compute[229727]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.461 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.462 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.462 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.462 229731 INFO nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Secure Boot support detected#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.465 229731 INFO nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.465 229731 INFO nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] The live_migration_permit_post_copy is set to True 
and post copy live migration is available so auto-converge will not be in use.#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.479 229731 DEBUG nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.522 229731 INFO nova.virt.node [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Determined node identity dae70d62-10f4-474c-9782-8c926a3641d5 from /var/lib/nova/compute_id#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.549 229731 DEBUG nova.compute.manager [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Verified node dae70d62-10f4-474c-9782-8c926a3641d5 matches my host np0005532585.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.596 229731 DEBUG nova.compute.manager [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.601 229731 DEBUG nova.virt.libvirt.vif [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T08:25:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005532585.localdomain',hostname='test',id=2,image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-23T08:25:43Z,launched_on='np0005532585.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005532585.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='1915d3e5d4254231a0517e2dcf35848f',ramdisk_id='',reservation_id='r-i8g0t7xr',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-11-23T08:25:43Z,user_data=None,user_id='7e40ee99e6034be7be796ae12095c154',uuid=355032bc-9946-4f6d-817c-2bfc8694d41d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": 
"1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.602 229731 DEBUG nova.network.os_vif_util [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Converting VIF {"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.603 229731 DEBUG nova.network.os_vif_util [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - 
-] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.604 229731 DEBUG os_vif [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.649 229731 DEBUG ovsdbapp.backend.ovs_idl [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.649 229731 DEBUG ovsdbapp.backend.ovs_idl [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.649 229731 DEBUG ovsdbapp.backend.ovs_idl [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.650 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [None 
req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.650 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.651 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.651 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.652 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.656 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.670 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.671 229731 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.671 229731 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 23 04:29:52 localhost nova_compute[229727]: 2025-11-23 09:29:52.672 229731 INFO oslo.privsep.daemon [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpwwhf7w6z/privsep.sock']#033[00m Nov 23 04:29:53 localhost nova_compute[229727]: 2025-11-23 09:29:53.285 229731 INFO oslo.privsep.daemon [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 23 04:29:53 localhost nova_compute[229727]: 2025-11-23 09:29:53.178 230114 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 23 04:29:53 localhost nova_compute[229727]: 2025-11-23 09:29:53.184 230114 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 23 04:29:53 localhost nova_compute[229727]: 2025-11-23 09:29:53.188 230114 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Nov 23 04:29:53 localhost nova_compute[229727]: 2025-11-23 09:29:53.188 230114 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230114#033[00m Nov 23 04:29:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26889 DF PROTO=TCP SPT=60580 DPT=9102 SEQ=561049474 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFC7600000000001030307) Nov 23 04:29:53 localhost nova_compute[229727]: 2025-11-23 09:29:53.557 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:29:53 localhost nova_compute[229727]: 2025-11-23 09:29:53.557 229731 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3912d14-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:29:53 localhost nova_compute[229727]: 2025-11-23 09:29:53.558 229731 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3912d14-a3, col_values=(('external_ids', {'iface-id': 'd3912d14-a3e0-4df9-b811-f3bd90f44559', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:aa:3b', 'vm-uuid': '355032bc-9946-4f6d-817c-2bfc8694d41d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:29:53 localhost nova_compute[229727]: 2025-11-23 09:29:53.559 229731 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 23 04:29:53 localhost nova_compute[229727]: 2025-11-23 09:29:53.559 229731 INFO os_vif [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3')#033[00m Nov 23 04:29:53 localhost nova_compute[229727]: 2025-11-23 09:29:53.560 229731 DEBUG nova.compute.manager [None 
req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:29:53 localhost nova_compute[229727]: 2025-11-23 09:29:53.564 229731 DEBUG nova.compute.manager [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Nov 23 04:29:53 localhost nova_compute[229727]: 2025-11-23 09:29:53.565 229731 INFO nova.compute.manager [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Nov 23 04:29:53 localhost nova_compute[229727]: 2025-11-23 09:29:53.818 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:29:54 localhost nova_compute[229727]: 2025-11-23 09:29:54.175 229731 INFO nova.service [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updating service version for nova-compute on np0005532585.localdomain from 57 to 66#033[00m Nov 23 04:29:54 localhost nova_compute[229727]: 2025-11-23 09:29:54.206 229731 DEBUG oslo_concurrency.lockutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:29:54 localhost nova_compute[229727]: 2025-11-23 09:29:54.207 229731 DEBUG oslo_concurrency.lockutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:29:54 localhost nova_compute[229727]: 2025-11-23 09:29:54.207 229731 DEBUG oslo_concurrency.lockutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:29:54 localhost nova_compute[229727]: 2025-11-23 09:29:54.208 229731 DEBUG nova.compute.resource_tracker [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:29:54 localhost nova_compute[229727]: 2025-11-23 09:29:54.209 229731 DEBUG oslo_concurrency.processutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:29:54 localhost nova_compute[229727]: 2025-11-23 09:29:54.655 229731 DEBUG oslo_concurrency.processutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:29:54 localhost nova_compute[229727]: 2025-11-23 09:29:54.725 229731 DEBUG nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:29:54 localhost nova_compute[229727]: 2025-11-23 09:29:54.725 229731 DEBUG nova.virt.libvirt.driver 
[None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:29:54 localhost systemd[1]: Started libvirt nodedev daemon. Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.043 229731 WARNING nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.044 229731 DEBUG nova.compute.resource_tracker [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12918MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": 
"0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.044 229731 DEBUG oslo_concurrency.lockutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.044 229731 DEBUG oslo_concurrency.lockutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.200 229731 DEBUG nova.compute.resource_tracker [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': 
{'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.201 229731 DEBUG nova.compute.resource_tracker [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.201 229731 DEBUG nova.compute.resource_tracker [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.214 229731 DEBUG nova.scheduler.client.report [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.227 229731 DEBUG nova.scheduler.client.report [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory 
/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.227 229731 DEBUG nova.compute.provider_tree [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.241 229731 DEBUG nova.scheduler.client.report [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.275 229731 DEBUG nova.scheduler.client.report [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: 
HW_CPU_X86_CLMUL,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_AVX2,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_LAN9118,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.354 229731 DEBUG oslo_concurrency.processutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.817 229731 DEBUG oslo_concurrency.processutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 
in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.823 229731 DEBUG nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Nov 23 04:29:55 localhost nova_compute[229727]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.823 229731 INFO nova.virt.libvirt.host [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] kernel doesn't support AMD SEV#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.825 229731 DEBUG nova.compute.provider_tree [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.826 229731 DEBUG nova.virt.libvirt.driver [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.881 229731 DEBUG nova.scheduler.client.report [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updated inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 with generation 3 in Placement from 
set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.882 229731 DEBUG nova.compute.provider_tree [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updating resource provider dae70d62-10f4-474c-9782-8c926a3641d5 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.882 229731 DEBUG nova.compute.provider_tree [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.936 229731 DEBUG nova.compute.provider_tree [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Updating resource provider dae70d62-10f4-474c-9782-8c926a3641d5 generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Nov 23 04:29:55 localhost 
nova_compute[229727]: 2025-11-23 09:29:55.966 229731 DEBUG nova.compute.resource_tracker [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.966 229731 DEBUG oslo_concurrency.lockutils [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.922s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.967 229731 DEBUG nova.service [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.993 229731 DEBUG nova.service [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Nov 23 04:29:55 localhost nova_compute[229727]: 2025-11-23 09:29:55.994 229731 DEBUG nova.servicegroup.drivers.db [None req-88da1da8-31fd-4226-aea2-8aca8d2d4341 - - - - - -] DB_Driver: join new ServiceGroup member np0005532585.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Nov 23 04:29:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9380 DF PROTO=TCP SPT=52692 DPT=9882 SEQ=3079844263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFD6210000000001030307) Nov 23 04:29:57 localhost python3.9[230384]: 
ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None 
publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Nov 23 04:29:57 localhost systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 121.3 (404 of 333 items), suggesting rotation. Nov 23 04:29:57 localhost systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 23 04:29:57 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 04:29:57 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 04:29:57 localhost nova_compute[229727]: 2025-11-23 09:29:57.700 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:29:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:29:58 localhost systemd[1]: tmp-crun.xqvLMH.mount: Deactivated successfully. 
Nov 23 04:29:58 localhost podman[230428]: 2025-11-23 09:29:58.013805409 +0000 UTC m=+0.069678428 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 23 04:29:58 localhost podman[230428]: 2025-11-23 09:29:58.021546304 +0000 UTC m=+0.077419293 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 23 04:29:58 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:29:58 localhost python3.9[230539]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:29:58 localhost systemd[1]: Stopping nova_compute container... Nov 23 04:29:58 localhost systemd[1]: tmp-crun.ki7ajJ.mount: Deactivated successfully. Nov 23 04:29:58 localhost nova_compute[229727]: 2025-11-23 09:29:58.811 229731 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170#033[00m Nov 23 04:29:58 localhost nova_compute[229727]: 2025-11-23 09:29:58.862 229731 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:29:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39229 DF PROTO=TCP SPT=41864 DPT=9101 SEQ=3850135123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFE06A0000000001030307) Nov 23 04:29:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3606 DF PROTO=TCP SPT=41876 DPT=9105 SEQ=1463888140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFE11B0000000001030307) Nov 23 04:30:00 localhost nova_compute[229727]: 2025-11-23 09:30:00.108 229731 WARNING amqp [-] Received method (60, 30) during closing channel 1. 
This method will be ignored#033[00m Nov 23 04:30:00 localhost nova_compute[229727]: 2025-11-23 09:30:00.111 229731 DEBUG oslo_concurrency.lockutils [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:30:00 localhost nova_compute[229727]: 2025-11-23 09:30:00.112 229731 DEBUG oslo_concurrency.lockutils [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:30:00 localhost nova_compute[229727]: 2025-11-23 09:30:00.112 229731 DEBUG oslo_concurrency.lockutils [None req-f3890f07-384a-4735-a22f-97711fc0210c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:30:00 localhost journal[203731]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, ) Nov 23 04:30:00 localhost journal[203731]: hostname: np0005532585.localdomain Nov 23 04:30:00 localhost journal[203731]: End of file while reading data: Input/output error Nov 23 04:30:00 localhost systemd[1]: libpod-2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295.scope: Deactivated successfully. Nov 23 04:30:00 localhost systemd[1]: libpod-2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295.scope: Consumed 4.843s CPU time. 
Nov 23 04:30:00 localhost podman[230543]: 2025-11-23 09:30:00.510094373 +0000 UTC m=+1.773216252 container died 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible) Nov 23 04:30:00 localhost systemd[1]: tmp-crun.HmzNB2.mount: Deactivated successfully. Nov 23 04:30:00 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295-userdata-shm.mount: Deactivated successfully. 
Nov 23 04:30:00 localhost podman[230543]: 2025-11-23 09:30:00.564371001 +0000 UTC m=+1.827492850 container cleanup 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}) Nov 23 04:30:00 localhost podman[230543]: nova_compute Nov 23 04:30:00 localhost podman[230581]: error opening file `/run/crun/2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295/status`: No such file or directory Nov 23 04:30:00 localhost podman[230570]: 2025-11-23 09:30:00.659056848 +0000 UTC m=+0.066521332 container cleanup 
2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}) Nov 23 04:30:00 localhost podman[230570]: nova_compute Nov 23 04:30:00 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Nov 23 04:30:00 localhost systemd[1]: Stopped nova_compute container. Nov 23 04:30:00 localhost systemd[1]: Starting nova_compute container... Nov 23 04:30:00 localhost systemd[1]: Started libcrun container. 
Nov 23 04:30:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Nov 23 04:30:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Nov 23 04:30:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 23 04:30:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 04:30:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 23 04:30:00 localhost podman[230585]: 2025-11-23 09:30:00.806742314 +0000 UTC m=+0.118150630 container init 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 
'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, config_id=edpm) Nov 23 04:30:00 localhost podman[230585]: 2025-11-23 09:30:00.817076077 +0000 UTC m=+0.128484393 container start 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', 
'/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}) Nov 23 04:30:00 localhost podman[230585]: nova_compute Nov 23 04:30:00 localhost nova_compute[230600]: + sudo -E kolla_set_configs Nov 23 04:30:00 localhost systemd[1]: Started nova_compute container. Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Validating config file Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Copying service configuration files Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Deleting /etc/nova/nova.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /etc/nova/nova.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 
23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Deleting /etc/ceph Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Creating directory /etc/ceph Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /etc/ceph Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 23 
04:30:00 localhost nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Deleting /usr/sbin/iscsiadm Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Writing out command to execute Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 23 04:30:00 localhost nova_compute[230600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 23 04:30:00 localhost nova_compute[230600]: ++ cat 
/run_command Nov 23 04:30:00 localhost nova_compute[230600]: + CMD=nova-compute Nov 23 04:30:00 localhost nova_compute[230600]: + ARGS= Nov 23 04:30:00 localhost nova_compute[230600]: + sudo kolla_copy_cacerts Nov 23 04:30:00 localhost nova_compute[230600]: + [[ ! -n '' ]] Nov 23 04:30:00 localhost nova_compute[230600]: + . kolla_extend_start Nov 23 04:30:00 localhost nova_compute[230600]: + echo 'Running command: '\''nova-compute'\''' Nov 23 04:30:00 localhost nova_compute[230600]: Running command: 'nova-compute' Nov 23 04:30:00 localhost nova_compute[230600]: + umask 0022 Nov 23 04:30:00 localhost nova_compute[230600]: + exec nova-compute Nov 23 04:30:01 localhost python3.9[230721]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None 
healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Nov 23 04:30:01 localhost systemd[1]: Started libpod-conmon-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428.scope. Nov 23 04:30:02 localhost systemd[1]: Started libcrun container. 
Nov 23 04:30:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Nov 23 04:30:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Nov 23 04:30:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 23 04:30:02 localhost podman[230746]: 2025-11-23 09:30:02.026920436 +0000 UTC m=+0.129822704 container init 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, 
config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 04:30:02 localhost podman[230746]: 2025-11-23 09:30:02.037693263 +0000 UTC m=+0.140595501 container start 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:30:02 localhost python3.9[230721]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Applying nova statedir ownership Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Nov 23 04:30:02 localhost nova_compute_init[230766]: 
INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d/ Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d already 42436:42436 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d to system_u:object_r:container_file_t:s0 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d/console.log Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: 
/var/lib/nova/instances/_base/f8def1b80727f8e5cc38a877010a5f81bbb3086d Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-f8def1b80727f8e5cc38a877010a5f81bbb3086d Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Nov 23 04:30:02 
localhost nova_compute_init[230766]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/4143dbbec5b08621aa3c8eb364f8a7d3e97604e18b7ed41c4bab0da11ed561fd Nov 23 04:30:02 localhost nova_compute_init[230766]: INFO:nova_statedir:Nova statedir ownership complete Nov 23 04:30:02 localhost systemd[1]: libpod-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428.scope: Deactivated successfully. 
Nov 23 04:30:02 localhost podman[230767]: 2025-11-23 09:30:02.105356469 +0000 UTC m=+0.050662420 container died 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible) Nov 23 04:30:02 localhost podman[230778]: 2025-11-23 09:30:02.179322166 +0000 UTC m=+0.073820124 container cleanup 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3) Nov 23 04:30:02 localhost systemd[1]: libpod-conmon-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428.scope: Deactivated successfully. Nov 23 04:30:02 localhost systemd[1]: var-lib-containers-storage-overlay-ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209-merged.mount: Deactivated successfully. Nov 23 04:30:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428-userdata-shm.mount: Deactivated successfully. 
Nov 23 04:30:02 localhost nova_compute[230600]: 2025-11-23 09:30:02.571 230604 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 23 04:30:02 localhost nova_compute[230600]: 2025-11-23 09:30:02.572 230604 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 23 04:30:02 localhost nova_compute[230600]: 2025-11-23 09:30:02.572 230604 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 23 04:30:02 localhost nova_compute[230600]: 2025-11-23 09:30:02.572 230604 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Nov 23 04:30:02 localhost nova_compute[230600]: 2025-11-23 09:30:02.688 230604 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:30:02 localhost nova_compute[230600]: 2025-11-23 09:30:02.708 230604 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:30:02 localhost nova_compute[230600]: 2025-11-23 09:30:02.708 230604 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Nov 23 04:30:02 localhost systemd[1]: session-53.scope: Deactivated successfully. Nov 23 04:30:02 localhost systemd[1]: session-53.scope: Consumed 2min 12.019s CPU time. Nov 23 04:30:02 localhost systemd-logind[761]: Session 53 logged out. Waiting for processes to exit. Nov 23 04:30:02 localhost systemd-logind[761]: Removed session 53. 
Nov 23 04:30:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3608 DF PROTO=TCP SPT=41876 DPT=9105 SEQ=1463888140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFED200000000001030307) Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.081 230604 INFO nova.virt.driver [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.197 230604 INFO nova.compute.provider_config [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.203 230604 WARNING nova.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.204 230604 DEBUG oslo_concurrency.lockutils [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.204 230604 DEBUG oslo_concurrency.lockutils [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 
2025-11-23 09:30:03.204 230604 DEBUG oslo_concurrency.lockutils [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.204 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.204 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.205 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.205 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.205 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.205 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ================================================================================ log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.205 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.205 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.205 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.206 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.206 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.206 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.206 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 
04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.206 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.206 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.206 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.207 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.207 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.207 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.207 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] console_host = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost 
nova_compute[230600]: 2025-11-23 09:30:03.207 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.207 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.207 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.208 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.208 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.208 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.208 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.208 230604 DEBUG oslo_service.service [None 
req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.208 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.209 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.209 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.209 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.209 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] enabled_ssl_apis = 
[] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.209 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.209 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.209 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.210 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.210 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.210 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] host = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.210 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.210 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.210 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.211 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.211 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.211 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.211 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.211 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instance_name_template = 
instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.211 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.211 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.212 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.212 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.212 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.212 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.212 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] live_migration_retry_count = 30 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.212 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.213 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.213 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.213 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.213 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.213 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.213 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost 
nova_compute[230600]: 2025-11-23 09:30:03.214 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.214 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.214 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.214 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.214 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.214 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.214 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.215 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.215 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.215 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.215 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.215 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.215 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 
localhost nova_compute[230600]: 2025-11-23 09:30:03.216 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.216 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.216 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.216 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.216 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.216 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.217 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] my_block_storage_ip = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.217 230604 DEBUG 
oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] my_ip = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.217 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.217 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.217 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.217 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.217 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.218 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.218 230604 DEBUG oslo_service.service 
[None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.218 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.218 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.218 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.218 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.219 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.220 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.220 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.220 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.220 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.220 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.220 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.220 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.221 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.222 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.222 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.222 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.222 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.222 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.222 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.223 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.224 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.224 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.224 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.224 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.224 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.224 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.224 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.225 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.226 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.226 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.226 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.226 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.226 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.226 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.226 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.227 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.227 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.227 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.227 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.227 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.227 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.228 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.228 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.228 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.228 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.228 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.228 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.228 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.229 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.230 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.230 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.230 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.230 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.230 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.230 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.231 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.232 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.232 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.232 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.232 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.232 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.232 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.232 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.233 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.233 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.233 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.233 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.233 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.233 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.234 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.234 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.234 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.234 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.234 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.234 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.234 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.235 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.235 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.235 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.235 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.235 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.235 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.236 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.236 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.236 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.236 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.236 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.236 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.236 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.237 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.237 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.237 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.237 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.237 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.237 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.238 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.238 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.238 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.238 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.238 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.238 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -]
compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.239 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.239 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.239 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.239 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.239 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.239 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.239 230604 DEBUG 
oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.240 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.240 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.240 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.240 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.240 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.240 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.240 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.241 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.241 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.241 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.241 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.241 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.241 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.241 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.region_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.242 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.242 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.242 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.242 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.242 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.242 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.242 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.243 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.243 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.243 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.243 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.243 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.243 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.244 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.connection_trace = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.244 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.244 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.244 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.244 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.244 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.244 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.245 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.max_retries = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.245 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.245 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.245 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.245 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.245 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.245 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.246 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] database.sqlite_synchronous = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.246 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.246 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.246 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.246 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.246 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.246 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.247 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.247 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.247 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.247 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.247 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.247 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.247 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.248 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.248 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.248 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.248 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.248 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.248 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.248 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.249 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] devices.enabled_mdev_types = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.249 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.249 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.249 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.249 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.249 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.250 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.250 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.250 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.250 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.250 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.250 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.250 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.endpoint_override = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.251 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.252 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost 
nova_compute[230600]: 2025-11-23 09:30:03.252 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.252 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.252 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.252 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.252 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.252 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.253 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.253 230604 DEBUG oslo_service.service [None 
req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.253 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.253 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.253 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.253 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.253 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.254 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.254 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.254 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.254 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.254 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.254 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.254 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.255 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.255 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.255 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.255 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.255 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.255 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.255 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.256 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.256 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - 
-] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.256 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.256 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.256 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.256 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.257 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.257 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.257 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.257 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.257 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.257 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.257 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.258 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.258 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.258 230604 DEBUG oslo_service.service [None 
req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.258 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.258 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.258 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.259 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.259 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.259 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.259 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.insecure = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.259 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.259 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.259 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.260 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.260 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.260 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.260 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 
04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.260 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.260 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.260 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.261 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.261 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.261 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.261 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 
2025-11-23 09:30:03.261 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.261 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.261 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.262 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.262 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.262 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.262 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.263 
230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.263 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.264 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.264 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.264 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.265 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.265 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.265 230604 DEBUG oslo_service.service [None 
req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.266 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.266 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.266 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.267 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.267 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.267 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.267 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.267 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.267 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.268 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.268 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.268 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.268 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.268 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.268 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.268 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.269 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.269 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.269 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.269 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.269 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.keyfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.269 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.269 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.270 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.270 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.270 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.270 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.270 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost 
nova_compute[230600]: 2025-11-23 09:30:03.270 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.270 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.271 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.271 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.271 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.271 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.271 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.271 230604 DEBUG 
oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.271 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.272 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.273 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.273 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.273 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.273 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.273 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.connection_uri = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.273 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.273 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.274 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.274 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.274 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.274 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.274 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.274 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.274 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.275 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.275 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.275 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.275 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.275 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.gid_maps = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.275 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.276 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.276 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.276 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.276 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.276 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.276 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.iser_use_multipath = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.277 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.278 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.278 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.278 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.278 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.278 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.278 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.278 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.279 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.279 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.279 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.279 230604 WARNING oslo_config.cfg [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Nov 23 04:30:03 localhost nova_compute[230600]: live_migration_uri is deprecated for removal in favor of two other options that Nov 23 04:30:03 localhost nova_compute[230600]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Nov 23 04:30:03 localhost nova_compute[230600]: and ``live_migration_inbound_addr`` respectively. Nov 23 04:30:03 localhost nova_compute[230600]: ). 
Its value may be silently ignored in the future.#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.279 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.280 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.280 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.280 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.280 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.280 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.280 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.280 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.281 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.281 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.281 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.281 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.281 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.281 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.quobyte_client_cfg = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.281 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.282 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.282 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.282 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.282 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rbd_secret_uuid = 46550e70-79cb-5f55-bf6d-1204b97e083b log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.282 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.282 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.283 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.283 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.283 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.283 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.283 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.283 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.283 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.smbfs_mount_options = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.284 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.284 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.284 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.284 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.284 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.284 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.285 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.swtpm_group 
= tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.285 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.285 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.285 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.285 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.285 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.285 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.286 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.volume_clear = zero log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.286 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.286 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.286 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.286 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.286 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.286 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.287 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.vzstorage_mount_perms = 0770 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.287 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.287 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.287 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.287 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.287 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.287 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.288 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.288 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.288 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.288 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.288 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.288 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.289 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.289 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.http_retries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.289 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.289 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.289 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.289 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.289 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.290 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.290 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 
localhost nova_compute[230600]: 2025-11-23 09:30:03.290 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.290 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.290 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.290 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.290 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.291 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.291 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 
09:30:03.291 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.291 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.291 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.291 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.291 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.292 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.292 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.292 230604 
DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.292 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.292 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.292 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.292 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG 
oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.293 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.294 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.294 230604 DEBUG oslo_service.service [None 
req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.294 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.294 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.294 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.294 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.294 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.295 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.295 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.password = 
**** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.295 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.295 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.295 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.295 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.295 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.service_type = placement log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.296 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.297 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.297 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.297 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.297 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.297 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.297 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.297 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.298 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.298 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.298 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.298 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.298 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.298 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.298 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.299 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 
09:30:03.299 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.299 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.299 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.299 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.299 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.300 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.300 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.300 230604 DEBUG oslo_service.service [None 
req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.300 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.300 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.300 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.300 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.301 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.301 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 
2025-11-23 09:30:03.301 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.301 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.301 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.301 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.301 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.302 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.302 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.302 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.302 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.302 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.302 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.302 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.303 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.303 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.303 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.303 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.303 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.303 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.303 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.304 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.304 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.304 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.304 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.304 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.304 230604 DEBUG 
oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.304 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.305 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.305 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.305 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.305 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.305 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 
2025-11-23 09:30:03.305 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.305 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.306 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.306 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.306 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.306 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.306 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost 
nova_compute[230600]: 2025-11-23 09:30:03.306 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.306 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.307 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.307 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.307 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.307 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.307 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.307 
230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.307 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.308 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.308 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.308 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.308 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.308 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.308 230604 DEBUG oslo_service.service 
[None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.308 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.309 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.309 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.309 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.309 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.309 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.309 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 
- - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.309 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.310 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.310 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.310 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.310 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.310 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.310 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 
- - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.311 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.312 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.312 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.312 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.312 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.312 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.312 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.host_ip = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 
04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.313 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.314 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.314 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.314 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.314 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.314 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.314 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 
2025-11-23 09:30:03.314 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.315 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.315 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.315 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.315 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.315 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.315 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 
09:30:03.316 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.316 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.316 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.316 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.316 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.316 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG 
oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.317 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 
localhost nova_compute[230600]: 2025-11-23 09:30:03.318 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.318 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.318 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.318 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.318 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.318 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.318 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.319 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.319 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.319 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.319 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.319 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.319 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.319 230604 DEBUG oslo_service.service [None 
req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.320 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.320 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.320 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.320 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.320 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.320 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.320 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.321 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.321 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.321 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.321 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.321 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.321 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.321 230604 DEBUG oslo_service.service [None 
req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.322 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.322 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.322 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.322 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.322 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.322 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.323 230604 DEBUG 
oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.323 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.323 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.323 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.323 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.323 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.323 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost 
nova_compute[230600]: 2025-11-23 09:30:03.324 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.324 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.324 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.324 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.324 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.324 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.324 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None 
req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.325 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.326 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.326 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.326 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.326 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.327 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost 
nova_compute[230600]: 2025-11-23 09:30:03.327 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.328 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.328 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.328 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.329 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.329 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.329 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.329 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.330 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.330 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.330 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.331 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.331 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.331 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_notifications.retry = -1 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.331 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.331 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.332 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.332 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.332 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.332 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.332 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] 
oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.332 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.332 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.domain_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.333 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.334 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.334 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.334 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.334 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.password = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.334 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.334 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.334 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.service_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.335 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.336 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.336 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.336 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.336 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.336 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.336 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.336 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.337 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.337 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.337 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.337 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.337 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.337 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.337 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.338 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.338 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.338 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - 
- - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.338 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.338 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.338 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.338 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None 
req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.339 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.340 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.340 
230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.340 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.340 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.340 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.340 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.340 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.341 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.341 230604 DEBUG 
oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.341 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.341 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.341 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.341 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.341 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG 
oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.342 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.343 230604 DEBUG 
oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.343 230604 DEBUG oslo_service.service [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.344 230604 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.357 230604 INFO nova.virt.node [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Determined node identity dae70d62-10f4-474c-9782-8c926a3641d5 from /var/lib/nova/compute_id#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.357 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.358 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.358 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.358 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Connecting to libvirt: 
qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.368 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.370 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.371 230604 INFO nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Connection event '1' reason 'None'#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.376 230604 INFO nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Libvirt host capabilities
[libvirt host capabilities XML elided: the element tags were stripped during log extraction, leaving only repeated per-line journald prefixes. Recoverable field values, in order of appearance: host UUID 43895caf-e6c2-47af-84a5-6194e901da5c; arch x86_64; CPU model EPYC-Rome-v4; vendor AMD; migration transports tcp, rdma; memory/page counts 16116612, 4029153, 0, 0; security model selinux, doi 0, base labels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0]
nova_compute[230600]: dac Nov 23 04:30:03 localhost nova_compute[230600]: 0 Nov 23 04:30:03 localhost nova_compute[230600]: +107:+107 Nov 23 04:30:03 localhost nova_compute[230600]: +107:+107 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: hvm Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: 32 Nov 23 04:30:03 localhost nova_compute[230600]: /usr/libexec/qemu-kvm Nov 23 04:30:03 localhost nova_compute[230600]: pc-i440fx-rhel7.6.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel9.8.0 Nov 23 04:30:03 localhost nova_compute[230600]: q35 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel9.6.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel8.6.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel9.4.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel8.5.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel8.3.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel7.6.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel8.4.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel9.2.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel8.2.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel9.0.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel8.0.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel8.1.0 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: hvm Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: 64 Nov 23 04:30:03 localhost nova_compute[230600]: /usr/libexec/qemu-kvm Nov 23 04:30:03 localhost nova_compute[230600]: pc-i440fx-rhel7.6.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel9.8.0 Nov 23 04:30:03 localhost nova_compute[230600]: q35 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel9.6.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel8.6.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel9.4.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel8.5.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel8.3.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel7.6.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel8.4.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel9.2.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel8.2.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel9.0.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel8.0.0 Nov 23 04:30:03 localhost nova_compute[230600]: pc-q35-rhel8.1.0 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.384 230604 DEBUG nova.virt.libvirt.volume.mount [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.385 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.390 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
[domainCapabilities XML logged one element per continuation line; its markup was stripped during extraction. Recoverable values: path /usr/libexec/qemu-kvm, domain kvm, machine pc-q35-rhel9.8.0, arch i686; OS loader value /usr/share/OVMF/OVMF_CODE.secboot.fd, loader types rom and pflash, readonly yes/no, secure on/off; host-model CPU EPYC-Rome, vendor AMD; supported custom CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, ... (list truncated here; continues in the following log lines)]
04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell-IBRS Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell-noTSX Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell-noTSX-IBRS Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 
04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell-v4 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: 
Icelake-Server-noTSX Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-v4 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-v5 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 
04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-v6 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-v7 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: IvyBridge Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: IvyBridge-IBRS Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: IvyBridge-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: IvyBridge-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: KnightsMill Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: KnightsMill-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nehalem Nov 23 04:30:03 localhost nova_compute[230600]: Nehalem-IBRS Nov 23 04:30:03 localhost nova_compute[230600]: Nehalem-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nehalem-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Opteron_G1 Nov 23 04:30:03 localhost nova_compute[230600]: Opteron_G1-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Opteron_G2 Nov 23 04:30:03 localhost nova_compute[230600]: Opteron_G2-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Opteron_G3 Nov 23 04:30:03 localhost nova_compute[230600]: 
Opteron_G3-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Opteron_G4 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Opteron_G4-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Opteron_G5 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Opteron_G5-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Penryn Nov 23 04:30:03 localhost nova_compute[230600]: Penryn-v1 Nov 23 04:30:03 localhost nova_compute[230600]: SandyBridge Nov 23 04:30:03 localhost nova_compute[230600]: SandyBridge-IBRS Nov 23 04:30:03 localhost nova_compute[230600]: SandyBridge-v1 Nov 23 04:30:03 localhost nova_compute[230600]: SandyBridge-v2 Nov 23 04:30:03 localhost nova_compute[230600]: SapphireRapids Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: SapphireRapids-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: SapphireRapids-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: SapphireRapids-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: 
SierraForest Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: SierraForest-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: 
Nov 23 04:30:03 localhost nova_compute[230600]: [libvirt domain capabilities XML logged with element markup stripped; recoverable values, grouped:]
Nov 23 04:30:03 localhost nova_compute[230600]:   CPU models (custom mode): Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Nov 23 04:30:03 localhost nova_compute[230600]:   memory backing source types: file, anonymous, memfd
Nov 23 04:30:03 localhost nova_compute[230600]:   disk device types: disk, cdrom, floppy, lun; buses: fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
Nov 23 04:30:03 localhost nova_compute[230600]:   graphics types: vnc, egl-headless, dbus
Nov 23 04:30:03 localhost nova_compute[230600]:   hostdev: mode subsystem; startup policy: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi; models: virtio, virtio-transitional, virtio-non-transitional
Nov 23 04:30:03 localhost nova_compute[230600]:   rng models: random, egd, builtin
Nov 23 04:30:03 localhost nova_compute[230600]:   filesystem driver types: path, handle, virtiofs
Nov 23 04:30:03 localhost nova_compute[230600]:   tpm models: tpm-tis, tpm-crb; backends: emulator, external; backend version: 2.0
Nov 23 04:30:03 localhost nova_compute[230600]:   redirdev bus: usb; channel types: pty, unix
Nov 23 04:30:03 localhost nova_compute[230600]:   backends: qemu, builtin; interface backends: default, passt
Nov 23 04:30:03 localhost nova_compute[230600]:   panic models: isa, hyperv
Nov 23 04:30:03 localhost nova_compute[230600]:   console/serial types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Nov 23 04:30:03 localhost nova_compute[230600]:   hyperv features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input
Nov 23 04:30:03 localhost nova_compute[230600]:   additional values (enclosing markup lost): 4095, on, off, off, Linux KVM Hv, tdx
Nov 23 04:30:03 localhost nova_compute[230600]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.400 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 23 04:30:03 localhost nova_compute[230600]: [domain capabilities XML for arch=i686, machine_type=pc, markup stripped; recoverable values, grouped:]
Nov 23 04:30:03 localhost nova_compute[230600]:   emulator: /usr/libexec/qemu-kvm; domain type: kvm; machine: pc-i440fx-rhel7.6.0; arch: i686
Nov 23 04:30:03 localhost nova_compute[230600]:   loader: /usr/share/OVMF/OVMF_CODE.secboot.fd; types: rom, pflash; readonly: yes, no; secure: no
Nov 23 04:30:03 localhost nova_compute[230600]:   additional enum values (enclosing markup lost): on, off; on, off
Nov 23 04:30:03 localhost nova_compute[230600]:   host CPU model: EPYC-Rome; vendor: AMD
Nov 23 04:30:03 localhost nova_compute[230600]:   CPU models (custom mode): 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa (list truncated at end of captured section)
04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Genoa-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-IBPB Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Milan Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Milan-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Milan-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Rome Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Rome-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Rome-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Rome-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Rome-v4 Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-v1 Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-v2 Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-v4 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: GraniteRapids Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 
04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: GraniteRapids-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: GraniteRapids-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 
04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell-IBRS Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell-noTSX Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell-noTSX-IBRS Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Haswell-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell-v4 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-noTSX Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 
04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-v4 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-v5 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-v6 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 
04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Icelake-Server-v7 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: IvyBridge Nov 23 
Nov 23 04:30:03 localhost nova_compute[230600]: [libvirt domain capabilities XML continued; markup lost in log capture, recoverable element values follow]
Nov 23 04:30:03 localhost nova_compute[230600]: CPU models: IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Nov 23 04:30:03 localhost nova_compute[230600]: memory backing source types: file anonymous memfd
Nov 23 04:30:03 localhost nova_compute[230600]: disk device types: disk cdrom floppy lun; bus types: ide fdc scsi virtio usb sata; models: virtio virtio-transitional virtio-non-transitional
Nov 23 04:30:03 localhost nova_compute[230600]: graphics types: vnc egl-headless dbus
Nov 23 04:30:03 localhost nova_compute[230600]: hostdev modes: subsystem; startup policies: default mandatory requisite optional; subsystem types: usb pci scsi
Nov 23 04:30:03 localhost nova_compute[230600]: rng models: virtio virtio-transitional virtio-non-transitional; backend models: random egd builtin
Nov 23 04:30:03 localhost nova_compute[230600]: filesystem driver types: path handle virtiofs
Nov 23 04:30:03 localhost nova_compute[230600]: tpm models: tpm-tis tpm-crb; backend models: emulator external; backend version: 2.0
Nov 23 04:30:03 localhost nova_compute[230600]: redirdev bus: usb; channel types: pty unix
Nov 23 04:30:03 localhost nova_compute[230600]: crypto models: qemu; types: builtin
Nov 23 04:30:03 localhost nova_compute[230600]: interface backend types: default passt
Nov 23 04:30:03 localhost nova_compute[230600]: panic models: isa hyperv
Nov 23 04:30:03 localhost nova_compute[230600]: character device types: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Nov 23 04:30:03 localhost nova_compute[230600]: hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input; other values: 4095 on off off "Linux KVM Hv"
Nov 23 04:30:03 localhost nova_compute[230600]: launch security types: tdx
Nov 23 04:30:03 localhost nova_compute[230600]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.417 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.421 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 23 04:30:03 localhost nova_compute[230600]: [libvirt domain capabilities XML; markup lost in log capture, recoverable element values follow]
Nov 23 04:30:03 localhost nova_compute[230600]: path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-q35-rhel9.8.0; arch: x86_64
Nov 23 04:30:03 localhost nova_compute[230600]: os firmware: efi; loader values: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd /usr/share/edk2/ovmf/OVMF_CODE.fd /usr/share/edk2/ovmf/OVMF.amdsev.fd /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd; loader types: rom pflash; readonly: yes no; secure: yes no
Nov 23 04:30:03 localhost nova_compute[230600]: cpu mode values: on off, on off; host-model: EPYC-Rome, vendor AMD
Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23
04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: 486 Nov 23 04:30:03 localhost nova_compute[230600]: 486-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell-IBRS Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell-noTSX Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell-noTSX-IBRS Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: 
Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell-v4 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cascadelake-Server Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
localhost nova_compute[230600]: Cascadelake-Server-noTSX Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cascadelake-Server-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cascadelake-Server-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cascadelake-Server-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cascadelake-Server-v4 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cascadelake-Server-v5 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Conroe Nov 23 04:30:03 localhost nova_compute[230600]: Conroe-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Cooperlake Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cooperlake-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cooperlake-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Denverton Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Denverton-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Denverton-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Denverton-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Dhyana Nov 23 04:30:03 localhost nova_compute[230600]: Dhyana-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Dhyana-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Genoa Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Genoa-v1 
Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-IBPB Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Milan Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Milan-v1 Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Milan-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Rome Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Rome-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Rome-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Rome-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Rome-v4 Nov 23 04:30:03 localhost 
nova_compute[230600]: EPYC-v1 Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-v2 Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-v4 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: GraniteRapids Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 
04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: GraniteRapids-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: GraniteRapids-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Haswell Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: 
Nov 23 04:30:03 localhost nova_compute[230600]: [libvirt host capabilities, supported CPU models] Haswell-IBRS Haswell-noTSX Haswell-noTSX-IBRS Haswell-v1 Haswell-v2 Haswell-v3 Haswell-v4 Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo
Nov 23 04:30:03 localhost nova_compute[230600]: coreduo-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: kvm32 Nov 23 04:30:03 localhost nova_compute[230600]: kvm32-v1 Nov 23 04:30:03 localhost nova_compute[230600]: kvm64 Nov 23 04:30:03 localhost nova_compute[230600]: kvm64-v1 Nov 23 04:30:03 localhost nova_compute[230600]: n270 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: n270-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: pentium Nov 23 04:30:03 localhost nova_compute[230600]: pentium-v1 Nov 23 04:30:03 localhost nova_compute[230600]: pentium2 Nov 23 04:30:03 localhost nova_compute[230600]: pentium2-v1 Nov 23 04:30:03 localhost nova_compute[230600]: pentium3 Nov 23 04:30:03 localhost nova_compute[230600]: pentium3-v1 Nov 23 04:30:03 localhost nova_compute[230600]: phenom Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: phenom-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: qemu32 Nov 23 04:30:03 localhost nova_compute[230600]: qemu32-v1 Nov 23 04:30:03 localhost nova_compute[230600]: qemu64 Nov 23 04:30:03 localhost nova_compute[230600]: qemu64-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 
04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: file Nov 23 04:30:03 localhost nova_compute[230600]: anonymous Nov 23 04:30:03 localhost nova_compute[230600]: memfd Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: disk Nov 23 04:30:03 localhost nova_compute[230600]: cdrom Nov 23 04:30:03 localhost nova_compute[230600]: floppy Nov 23 04:30:03 localhost nova_compute[230600]: lun Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: fdc Nov 23 04:30:03 localhost nova_compute[230600]: scsi Nov 23 04:30:03 localhost nova_compute[230600]: virtio Nov 23 04:30:03 localhost nova_compute[230600]: usb Nov 23 04:30:03 localhost nova_compute[230600]: sata Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: virtio Nov 23 04:30:03 localhost nova_compute[230600]: virtio-transitional Nov 23 04:30:03 localhost nova_compute[230600]: virtio-non-transitional Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: vnc Nov 23 04:30:03 localhost nova_compute[230600]: egl-headless Nov 23 04:30:03 localhost nova_compute[230600]: dbus Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: subsystem Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: default Nov 23 04:30:03 localhost nova_compute[230600]: mandatory Nov 23 04:30:03 localhost nova_compute[230600]: requisite Nov 23 04:30:03 localhost nova_compute[230600]: optional Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: usb Nov 23 04:30:03 localhost nova_compute[230600]: pci Nov 23 04:30:03 localhost nova_compute[230600]: scsi Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: virtio Nov 23 04:30:03 localhost nova_compute[230600]: virtio-transitional Nov 23 04:30:03 localhost nova_compute[230600]: virtio-non-transitional Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: random Nov 23 04:30:03 localhost nova_compute[230600]: egd Nov 23 04:30:03 localhost nova_compute[230600]: builtin Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: path Nov 23 04:30:03 localhost nova_compute[230600]: handle Nov 23 04:30:03 localhost nova_compute[230600]: virtiofs Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: tpm-tis Nov 23 04:30:03 localhost 
nova_compute[230600]: tpm-crb Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: emulator Nov 23 04:30:03 localhost nova_compute[230600]: external Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: 2.0 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: usb Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: pty Nov 23 04:30:03 localhost nova_compute[230600]: unix Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: qemu Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: builtin Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: default Nov 23 04:30:03 localhost nova_compute[230600]: passt Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: isa Nov 23 04:30:03 localhost nova_compute[230600]: hyperv Nov 23 04:30:03 
localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: null Nov 23 04:30:03 localhost nova_compute[230600]: vc Nov 23 04:30:03 localhost nova_compute[230600]: pty Nov 23 04:30:03 localhost nova_compute[230600]: dev Nov 23 04:30:03 localhost nova_compute[230600]: file Nov 23 04:30:03 localhost nova_compute[230600]: pipe Nov 23 04:30:03 localhost nova_compute[230600]: stdio Nov 23 04:30:03 localhost nova_compute[230600]: udp Nov 23 04:30:03 localhost nova_compute[230600]: tcp Nov 23 04:30:03 localhost nova_compute[230600]: unix Nov 23 04:30:03 localhost nova_compute[230600]: qemu-vdagent Nov 23 04:30:03 localhost nova_compute[230600]: dbus Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: relaxed Nov 23 04:30:03 localhost nova_compute[230600]: vapic Nov 23 04:30:03 localhost nova_compute[230600]: spinlocks Nov 23 04:30:03 localhost nova_compute[230600]: vpindex Nov 23 04:30:03 localhost nova_compute[230600]: runtime Nov 23 04:30:03 localhost nova_compute[230600]: synic Nov 23 04:30:03 localhost nova_compute[230600]: stimer Nov 23 04:30:03 localhost nova_compute[230600]: reset Nov 23 04:30:03 
localhost nova_compute[230600]: vendor_id Nov 23 04:30:03 localhost nova_compute[230600]: frequencies Nov 23 04:30:03 localhost nova_compute[230600]: reenlightenment Nov 23 04:30:03 localhost nova_compute[230600]: tlbflush Nov 23 04:30:03 localhost nova_compute[230600]: ipi Nov 23 04:30:03 localhost nova_compute[230600]: avic Nov 23 04:30:03 localhost nova_compute[230600]: emsr_bitmap Nov 23 04:30:03 localhost nova_compute[230600]: xmm_input Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: 4095 Nov 23 04:30:03 localhost nova_compute[230600]: on Nov 23 04:30:03 localhost nova_compute[230600]: off Nov 23 04:30:03 localhost nova_compute[230600]: off Nov 23 04:30:03 localhost nova_compute[230600]: Linux KVM Hv Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: tdx Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.467 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: /usr/libexec/qemu-kvm Nov 23 04:30:03 localhost nova_compute[230600]: kvm Nov 23 04:30:03 localhost nova_compute[230600]: pc-i440fx-rhel7.6.0 Nov 23 04:30:03 localhost nova_compute[230600]: x86_64 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: /usr/share/OVMF/OVMF_CODE.secboot.fd Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: rom Nov 23 04:30:03 localhost nova_compute[230600]: pflash Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: yes Nov 23 04:30:03 localhost nova_compute[230600]: no Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: no Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: on Nov 23 04:30:03 localhost nova_compute[230600]: off Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: on Nov 23 04:30:03 localhost nova_compute[230600]: off Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Rome Nov 23 04:30:03 localhost nova_compute[230600]: AMD Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 
04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: 486 Nov 23 04:30:03 localhost nova_compute[230600]: 486-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell-IBRS Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell-noTSX Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell-noTSX-IBRS Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Broadwell-v4 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cascadelake-Server Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 
04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cascadelake-Server-noTSX Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cascadelake-Server-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Cascadelake-Server-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cascadelake-Server-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cascadelake-Server-v4 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cascadelake-Server-v5 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Conroe Nov 23 04:30:03 localhost nova_compute[230600]: Conroe-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Cooperlake Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cooperlake-v1 Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Cooperlake-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Denverton Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Denverton-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Denverton-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Denverton-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Dhyana Nov 23 04:30:03 localhost nova_compute[230600]: Dhyana-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Dhyana-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: EPYC Nov 23 04:30:03 localhost nova_compute[230600]: EPYC-Genoa Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
Nov 23 04:30:03 localhost nova_compute[230600]: [garbled multi-line CPU-model capability listing; markup stripped, repeated log prefixes removed. Recoverable model names, in original order: EPYC-Genoa-v1, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, Haswell, Haswell-IBRS, Haswell-noTSX, Haswell-noTSX-IBRS, Haswell-v1, Haswell-v2, Haswell-v3, Haswell-v4, Icelake-Server, Icelake-Server-noTSX, Icelake-Server-v1, Icelake-Server-v2, Icelake-Server-v3, Icelake-Server-v4, Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS]
nova_compute[230600]: Skylake-Client-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Skylake-Client-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Skylake-Client-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Skylake-Client-v4 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Skylake-Server Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Skylake-Server-IBRS Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Skylake-Server-noTSX-IBRS Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Skylake-Server-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Skylake-Server-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Skylake-Server-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Skylake-Server-v4 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 
23 04:30:03 localhost nova_compute[230600]: Skylake-Server-v5 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Snowridge Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Snowridge-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Snowridge-v2 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Snowridge-v3 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Snowridge-v4 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Westmere Nov 23 04:30:03 localhost nova_compute[230600]: Westmere-IBRS Nov 23 04:30:03 localhost nova_compute[230600]: Westmere-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Westmere-v2 Nov 23 04:30:03 localhost nova_compute[230600]: athlon Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: athlon-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: core2duo Nov 23 
04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: core2duo-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: coreduo Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: coreduo-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: kvm32 Nov 23 04:30:03 localhost nova_compute[230600]: kvm32-v1 Nov 23 04:30:03 localhost nova_compute[230600]: kvm64 Nov 23 04:30:03 localhost nova_compute[230600]: kvm64-v1 Nov 23 04:30:03 localhost nova_compute[230600]: n270 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: n270-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: pentium Nov 23 04:30:03 localhost nova_compute[230600]: pentium-v1 Nov 23 04:30:03 localhost nova_compute[230600]: pentium2 Nov 23 04:30:03 localhost nova_compute[230600]: pentium2-v1 Nov 23 04:30:03 localhost nova_compute[230600]: pentium3 Nov 23 04:30:03 localhost nova_compute[230600]: pentium3-v1 Nov 23 04:30:03 localhost nova_compute[230600]: phenom Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost 
nova_compute[230600]: phenom-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: qemu32 Nov 23 04:30:03 localhost nova_compute[230600]: qemu32-v1 Nov 23 04:30:03 localhost nova_compute[230600]: qemu64 Nov 23 04:30:03 localhost nova_compute[230600]: qemu64-v1 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: file Nov 23 04:30:03 localhost nova_compute[230600]: anonymous Nov 23 04:30:03 localhost nova_compute[230600]: memfd Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: disk Nov 23 04:30:03 localhost nova_compute[230600]: cdrom Nov 23 04:30:03 localhost nova_compute[230600]: floppy Nov 23 04:30:03 localhost nova_compute[230600]: lun Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: ide Nov 23 04:30:03 localhost nova_compute[230600]: fdc Nov 23 04:30:03 localhost nova_compute[230600]: scsi Nov 23 04:30:03 localhost nova_compute[230600]: virtio Nov 23 04:30:03 localhost nova_compute[230600]: usb Nov 23 04:30:03 localhost nova_compute[230600]: sata Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: virtio Nov 23 04:30:03 localhost nova_compute[230600]: virtio-transitional Nov 23 04:30:03 localhost nova_compute[230600]: virtio-non-transitional Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: vnc Nov 23 04:30:03 localhost nova_compute[230600]: egl-headless Nov 23 04:30:03 localhost nova_compute[230600]: dbus Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: subsystem Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: default Nov 23 04:30:03 localhost nova_compute[230600]: mandatory Nov 23 04:30:03 localhost nova_compute[230600]: requisite Nov 23 04:30:03 localhost nova_compute[230600]: optional Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: usb Nov 23 04:30:03 localhost nova_compute[230600]: pci Nov 23 04:30:03 localhost nova_compute[230600]: scsi Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: virtio Nov 23 04:30:03 localhost nova_compute[230600]: virtio-transitional Nov 23 04:30:03 localhost nova_compute[230600]: virtio-non-transitional Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: random Nov 23 04:30:03 localhost nova_compute[230600]: egd Nov 23 04:30:03 localhost nova_compute[230600]: builtin Nov 23 04:30:03 localhost nova_compute[230600]: 
Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: path Nov 23 04:30:03 localhost nova_compute[230600]: handle Nov 23 04:30:03 localhost nova_compute[230600]: virtiofs Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: tpm-tis Nov 23 04:30:03 localhost nova_compute[230600]: tpm-crb Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: emulator Nov 23 04:30:03 localhost nova_compute[230600]: external Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: 2.0 Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: usb Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: pty Nov 23 04:30:03 localhost nova_compute[230600]: unix Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: qemu Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: builtin Nov 23 04:30:03 localhost 
nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: default Nov 23 04:30:03 localhost nova_compute[230600]: passt Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: isa Nov 23 04:30:03 localhost nova_compute[230600]: hyperv Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: null Nov 23 04:30:03 localhost nova_compute[230600]: vc Nov 23 04:30:03 localhost nova_compute[230600]: pty Nov 23 04:30:03 localhost nova_compute[230600]: dev Nov 23 04:30:03 localhost nova_compute[230600]: file Nov 23 04:30:03 localhost nova_compute[230600]: pipe Nov 23 04:30:03 localhost nova_compute[230600]: stdio Nov 23 04:30:03 localhost nova_compute[230600]: udp Nov 23 04:30:03 localhost nova_compute[230600]: tcp Nov 23 04:30:03 localhost nova_compute[230600]: unix Nov 23 04:30:03 localhost nova_compute[230600]: qemu-vdagent Nov 23 04:30:03 localhost nova_compute[230600]: dbus Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 
localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: relaxed Nov 23 04:30:03 localhost nova_compute[230600]: vapic Nov 23 04:30:03 localhost nova_compute[230600]: spinlocks Nov 23 04:30:03 localhost nova_compute[230600]: vpindex Nov 23 04:30:03 localhost nova_compute[230600]: runtime Nov 23 04:30:03 localhost nova_compute[230600]: synic Nov 23 04:30:03 localhost nova_compute[230600]: stimer Nov 23 04:30:03 localhost nova_compute[230600]: reset Nov 23 04:30:03 localhost nova_compute[230600]: vendor_id Nov 23 04:30:03 localhost nova_compute[230600]: frequencies Nov 23 04:30:03 localhost nova_compute[230600]: reenlightenment Nov 23 04:30:03 localhost nova_compute[230600]: tlbflush Nov 23 04:30:03 localhost nova_compute[230600]: ipi Nov 23 04:30:03 localhost nova_compute[230600]: avic Nov 23 04:30:03 localhost nova_compute[230600]: emsr_bitmap Nov 23 04:30:03 localhost nova_compute[230600]: xmm_input Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: 4095 Nov 23 04:30:03 localhost nova_compute[230600]: on Nov 23 04:30:03 localhost nova_compute[230600]: off Nov 23 04:30:03 localhost nova_compute[230600]: off Nov 23 04:30:03 localhost nova_compute[230600]: Linux KVM Hv Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: tdx Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: Nov 23 04:30:03 localhost nova_compute[230600]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 23 04:30:03 
localhost nova_compute[230600]: 2025-11-23 09:30:03.557 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.558 230604 INFO nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Secure Boot support detected#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.560 230604 INFO nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.561 230604 INFO nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.576 230604 DEBUG nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.599 230604 INFO nova.virt.node [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Determined node identity dae70d62-10f4-474c-9782-8c926a3641d5 from /var/lib/nova/compute_id#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.619 230604 DEBUG nova.compute.manager [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Verified node dae70d62-10f4-474c-9782-8c926a3641d5 matches my host np0005532585.localdomain _check_for_host_rename 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.646 230604 DEBUG nova.compute.manager [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.650 230604 DEBUG nova.virt.libvirt.vif [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T08:25:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005532585.localdomain',hostname='test',id=2,image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-23T08:25:43Z,launched_on='np0005532585.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005532585.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='1915d3e5d4254231a0517e2dcf35848f',ramdisk_id='',reservation_id='r-i8g0t7xr',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-11-23T08:25:43Z,user_data=None,user_id='7e40ee99e6034be7be796ae12095c154',uuid=355032bc-9946-4f6d-817c-2bfc8694d41d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": 
"d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.651 230604 DEBUG nova.network.os_vif_util [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Converting VIF {"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.652 230604 DEBUG nova.network.os_vif_util [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.653 230604 DEBUG os_vif [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.745 230604 DEBUG ovsdbapp.backend.ovs_idl [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.745 230604 DEBUG ovsdbapp.backend.ovs_idl [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - 
- - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.745 230604 DEBUG ovsdbapp.backend.ovs_idl [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.745 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.745 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.745 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.746 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.747 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.748 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.759 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.759 230604 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.759 230604 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.760 230604 INFO oslo.privsep.daemon [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp05yuyu3y/privsep.sock']#033[00m Nov 23 04:30:03 localhost nova_compute[230600]: 2025-11-23 09:30:03.864 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.331 230604 INFO oslo.privsep.daemon [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.230 230850 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 23 04:30:04 localhost 
nova_compute[230600]: 2025-11-23 09:30:04.233 230850 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.235 230850 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.235 230850 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230850#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.615 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.615 230604 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3912d14-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.616 230604 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3912d14-a3, col_values=(('external_ids', {'iface-id': 'd3912d14-a3e0-4df9-b811-f3bd90f44559', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:aa:3b', 'vm-uuid': '355032bc-9946-4f6d-817c-2bfc8694d41d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.617 230604 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.617 230604 INFO os_vif [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] 
Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3')#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.618 230604 DEBUG nova.compute.manager [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.621 230604 DEBUG nova.compute.manager [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.622 230604 INFO nova.compute.manager [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.708 230604 DEBUG oslo_concurrency.lockutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.709 230604 DEBUG oslo_concurrency.lockutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 
04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.709 230604 DEBUG oslo_concurrency.lockutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.710 230604 DEBUG nova.compute.resource_tracker [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:30:04 localhost nova_compute[230600]: 2025-11-23 09:30:04.711 230604 DEBUG oslo_concurrency.processutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.138 230604 DEBUG oslo_concurrency.processutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.210 230604 DEBUG nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.211 230604 DEBUG nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] skipping disk for 
instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.420 230604 WARNING nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.422 230604 DEBUG nova.compute.resource_tracker [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12952MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", 
"address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.422 230604 DEBUG oslo_concurrency.lockutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.423 230604 DEBUG oslo_concurrency.lockutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.584 230604 DEBUG nova.compute.resource_tracker [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.585 230604 DEBUG nova.compute.resource_tracker [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.585 230604 DEBUG nova.compute.resource_tracker [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.701 230604 DEBUG nova.scheduler.client.report [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.723 230604 DEBUG nova.scheduler.client.report [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 23 
04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.724 230604 DEBUG nova.compute.provider_tree [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.737 230604 DEBUG nova.scheduler.client.report [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.756 230604 DEBUG nova.scheduler.client.report [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: 
HW_CPU_X86_F16C,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 23 04:30:05 localhost nova_compute[230600]: 2025-11-23 09:30:05.793 230604 DEBUG oslo_concurrency.processutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:30:06 localhost nova_compute[230600]: 2025-11-23 09:30:06.246 230604 DEBUG oslo_concurrency.processutils [None 
req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:30:06 localhost nova_compute[230600]: 2025-11-23 09:30:06.250 230604 DEBUG nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Nov 23 04:30:06 localhost nova_compute[230600]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Nov 23 04:30:06 localhost nova_compute[230600]: 2025-11-23 09:30:06.250 230604 INFO nova.virt.libvirt.host [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] kernel doesn't support AMD SEV#033[00m Nov 23 04:30:06 localhost nova_compute[230600]: 2025-11-23 09:30:06.251 230604 DEBUG nova.compute.provider_tree [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:30:06 localhost nova_compute[230600]: 2025-11-23 09:30:06.251 230604 DEBUG nova.virt.libvirt.driver [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 23 04:30:06 localhost nova_compute[230600]: 2025-11-23 09:30:06.272 230604 DEBUG nova.scheduler.client.report [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 
1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:30:06 localhost nova_compute[230600]: 2025-11-23 09:30:06.294 230604 DEBUG nova.compute.resource_tracker [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:30:06 localhost nova_compute[230600]: 2025-11-23 09:30:06.294 230604 DEBUG oslo_concurrency.lockutils [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:30:06 localhost nova_compute[230600]: 2025-11-23 09:30:06.294 230604 DEBUG nova.service [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Nov 23 04:30:06 localhost nova_compute[230600]: 2025-11-23 09:30:06.315 230604 DEBUG nova.service [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Nov 23 04:30:06 localhost nova_compute[230600]: 2025-11-23 09:30:06.316 230604 DEBUG nova.servicegroup.drivers.db [None req-58b917db-5360-4d7d-b7f3-2ca75c21a22b - - - - - -] DB_Driver: join new ServiceGroup member np0005532585.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Nov 23 04:30:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40931 DF PROTO=TCP SPT=47280 DPT=9100 SEQ=2376654661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2CFFA210000000001030307) Nov 23 04:30:08 localhost sshd[230898]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:30:08 localhost nova_compute[230600]: 2025-11-23 09:30:08.798 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:08 localhost systemd-logind[761]: New session 55 of user zuul. Nov 23 04:30:08 localhost nova_compute[230600]: 2025-11-23 09:30:08.867 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:08 localhost systemd[1]: Started Session 55 of User zuul. Nov 23 04:30:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:30:09.240 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:30:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:30:09.241 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:30:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:30:09.242 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:30:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5701 DF PROTO=TCP SPT=42040 DPT=9100 SEQ=722781209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D005600000000001030307) Nov 23 04:30:09 localhost python3.9[231009]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:30:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58460 DF PROTO=TCP SPT=53278 DPT=9882 SEQ=4002580767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D00F960000000001030307) Nov 23 04:30:11 localhost python3.9[231123]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 04:30:12 localhost systemd[1]: Reloading. Nov 23 04:30:12 localhost systemd-sysv-generator[231146]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:30:12 localhost systemd-rc-local-generator[231143]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 23 04:30:12 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:12 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:12 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:12 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:30:12 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:12 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:12 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:12 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:13 localhost python3.9[231267]: ansible-ansible.builtin.service_facts Invoked Nov 23 04:30:13 localhost network[231284]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 23 04:30:13 localhost network[231285]: 'network-scripts' will be removed from distribution in near future. Nov 23 04:30:13 localhost network[231286]: It is advised to switch to 'NetworkManager' instead for network management. Nov 23 04:30:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. 
Nov 23 04:30:13 localhost nova_compute[230600]: 2025-11-23 09:30:13.833 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:13 localhost nova_compute[230600]: 2025-11-23 09:30:13.869 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:13 localhost podman[231292]: 2025-11-23 09:30:13.902552163 +0000 UTC m=+0.060109132 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 04:30:13 localhost podman[231292]: 2025-11-23 09:30:13.932743378 +0000 UTC m=+0.090300337 
container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 04:30:14 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:30:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58462 DF PROTO=TCP SPT=53278 DPT=9882 SEQ=4002580767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D01BA10000000001030307) Nov 23 04:30:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Nov 23 04:30:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26891 DF PROTO=TCP SPT=60580 DPT=9102 SEQ=561049474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D028200000000001030307) Nov 23 04:30:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:30:18 localhost podman[231548]: 2025-11-23 09:30:18.826817153 +0000 UTC m=+0.079753632 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible) Nov 23 04:30:18 localhost podman[231548]: 2025-11-23 09:30:18.863857004 +0000 UTC m=+0.116793443 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 23 04:30:18 localhost nova_compute[230600]: 2025-11-23 09:30:18.867 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:18 localhost nova_compute[230600]: 2025-11-23 09:30:18.870 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:18 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:30:19 localhost python3.9[231549]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:30:19 localhost python3.9[231678]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:19 localhost systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 76.3 (254 of 333 items), suggesting rotation. 
Nov 23 04:30:19 localhost systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 23 04:30:19 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 04:30:20 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 04:30:20 localhost python3.9[231789]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:21 localhost python3.9[231899]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:30:22 localhost python3.9[232009]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 23 04:30:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=42161 DF PROTO=TCP SPT=44684 DPT=9102 SEQ=3396325974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D03CA00000000001030307) Nov 23 04:30:23 localhost python3.9[232119]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 04:30:23 localhost systemd[1]: Reloading. Nov 23 04:30:23 localhost nova_compute[230600]: 2025-11-23 09:30:23.872 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:23 localhost nova_compute[230600]: 2025-11-23 09:30:23.874 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:23 localhost nova_compute[230600]: 2025-11-23 09:30:23.874 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:30:23 localhost nova_compute[230600]: 2025-11-23 09:30:23.874 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:23 localhost nova_compute[230600]: 2025-11-23 09:30:23.904 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:23 localhost nova_compute[230600]: 2025-11-23 09:30:23.905 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:23 localhost systemd-sysv-generator[232147]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:30:23 localhost systemd-rc-local-generator[232143]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:25 localhost python3.9[232265]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:30:27 localhost python3.9[232376]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 
owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:30:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58464 DF PROTO=TCP SPT=53278 DPT=9882 SEQ=4002580767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D04C210000000001030307) Nov 23 04:30:27 localhost python3.9[232484]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:30:28 localhost python3.9[232594]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:30:28 localhost nova_compute[230600]: 2025-11-23 09:30:28.906 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:28 localhost nova_compute[230600]: 2025-11-23 09:30:28.908 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:28 localhost nova_compute[230600]: 2025-11-23 09:30:28.908 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:30:28 localhost nova_compute[230600]: 2025-11-23 09:30:28.908 230604 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:28 localhost nova_compute[230600]: 2025-11-23 09:30:28.938 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:28 localhost nova_compute[230600]: 2025-11-23 09:30:28.938 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:30:28 localhost python3.9[232680]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890227.9986708-360-127162955015351/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a1c197ac7c699777a1adad471c9d81e692c62960 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:30:29 localhost podman[232681]: 2025-11-23 09:30:29.054827944 +0000 UTC m=+0.085109082 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 23 04:30:29 localhost podman[232681]: 2025-11-23 09:30:29.091147892 +0000 UTC m=+0.121429050 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 04:30:29 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:30:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65406 DF PROTO=TCP SPT=59422 DPT=9101 SEQ=981976127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0559A0000000001030307) Nov 23 04:30:29 localhost python3.9[232810]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None Nov 23 04:30:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4137 DF PROTO=TCP SPT=43594 DPT=9105 SEQ=4180956252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0564C0000000001030307) Nov 23 04:30:30 localhost python3.9[232920]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None Nov 23 04:30:31 localhost python3.9[233031]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Nov 23 04:30:32 localhost python3.9[233147]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532585.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None 
role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Nov 23 04:30:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65408 DF PROTO=TCP SPT=59422 DPT=9101 SEQ=981976127 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D061A10000000001030307) Nov 23 04:30:33 localhost nova_compute[230600]: 2025-11-23 09:30:33.940 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:33 localhost nova_compute[230600]: 2025-11-23 09:30:33.942 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:33 localhost nova_compute[230600]: 2025-11-23 09:30:33.942 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:30:33 localhost nova_compute[230600]: 2025-11-23 09:30:33.942 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:33 localhost nova_compute[230600]: 2025-11-23 09:30:33.977 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:33 localhost nova_compute[230600]: 2025-11-23 09:30:33.978 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:34 localhost python3.9[233263]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True 
get_selinux_context=False Nov 23 04:30:34 localhost python3.9[233349]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763890233.6412175-564-264951459368103/.source.conf _original_basename=ceilometer.conf follow=False checksum=950edd520595720a58ffe786d84e54d033109e91 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:35 localhost python3.9[233457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:30:35 localhost python3.9[233543]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763890234.9572551-564-163618447786307/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31313 DF PROTO=TCP SPT=51714 DPT=9100 SEQ=3062298622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D06E200000000001030307) Nov 23 04:30:36 localhost python3.9[233651]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True 
get_attributes=True get_selinux_context=False Nov 23 04:30:36 localhost python3.9[233737]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763890236.0000565-564-114090329836794/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:38 localhost python3.9[233845]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:30:38 localhost python3.9[233953]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:30:38 localhost nova_compute[230600]: 2025-11-23 09:30:38.979 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:38 localhost nova_compute[230600]: 2025-11-23 09:30:38.982 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:38 localhost nova_compute[230600]: 2025-11-23 09:30:38.983 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:30:38 localhost nova_compute[230600]: 2025-11-23 09:30:38.983 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:39 localhost nova_compute[230600]: 2025-11-23 09:30:39.037 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:39 localhost nova_compute[230600]: 2025-11-23 09:30:39.038 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30841 DF PROTO=TCP SPT=52784 DPT=9100 SEQ=1354231002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D07AA10000000001030307) Nov 23 04:30:40 localhost python3.9[234061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:30:41 localhost python3.9[234147]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890240.1224487-741-223280229895433/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:41 localhost python3.9[234255]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True 
get_selinux_context=False Nov 23 04:30:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7668 DF PROTO=TCP SPT=48684 DPT=9882 SEQ=2527218470 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D084C70000000001030307) Nov 23 04:30:42 localhost python3.9[234310]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:42 localhost python3.9[234418]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:30:43 localhost python3.9[234504]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890242.2063427-741-17228611610544/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=d15068604cf730dd6e7b88a19d62f57d3a39f94f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:43 localhost python3.9[234612]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml 
follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:30:44 localhost nova_compute[230600]: 2025-11-23 09:30:44.038 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:44 localhost nova_compute[230600]: 2025-11-23 09:30:44.040 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:44 localhost nova_compute[230600]: 2025-11-23 09:30:44.040 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:30:44 localhost nova_compute[230600]: 2025-11-23 09:30:44.041 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:44 localhost nova_compute[230600]: 2025-11-23 09:30:44.058 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:44 localhost nova_compute[230600]: 2025-11-23 09:30:44.058 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:44 localhost python3.9[234698]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890243.2827628-741-199279989865214/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None 
seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:44 localhost python3.9[234806]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:30:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:30:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7670 DF PROTO=TCP SPT=48684 DPT=9882 SEQ=2527218470 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D090E00000000001030307) Nov 23 04:30:45 localhost podman[234823]: 2025-11-23 09:30:45.01550288 +0000 UTC m=+0.070616555 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:30:45 localhost podman[234823]: 2025-11-23 09:30:45.079183393 +0000 UTC m=+0.134297048 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:30:45 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:30:45 localhost python3.9[234917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890244.4157846-741-226513149588775/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:45 localhost python3.9[235061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:30:46 localhost python3.9[235170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890245.5103056-741-104431102843691/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=7e5ab36b7368c1d4a00810e02af11a7f7d7c84e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:46 localhost python3.9[235287]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:30:47 localhost python3.9[235391]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890246.5791655-741-163880738108669/.source.yaml follow=False 
_original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:48 localhost python3.9[235499]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:30:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:30:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7671 DF PROTO=TCP SPT=48684 DPT=9882 SEQ=2527218470 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0A0A00000000001030307) Nov 23 04:30:49 localhost podman[235586]: 2025-11-23 09:30:49.017158344 +0000 UTC m=+0.075152048 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 04:30:49 localhost podman[235586]: 2025-11-23 09:30:49.04801641 +0000 UTC m=+0.106010094 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 23 04:30:49 localhost nova_compute[230600]: 2025-11-23 09:30:49.059 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:49 localhost nova_compute[230600]: 2025-11-23 09:30:49.061 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:49 localhost nova_compute[230600]: 2025-11-23 09:30:49.062 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:30:49 localhost nova_compute[230600]: 2025-11-23 09:30:49.062 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:49 localhost python3.9[235585]: ansible-ansible.legacy.copy Invoked with 
dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890247.550197-741-98228040881482/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=0e4ea521b0035bea70b7a804346a5c89364dcbc3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:49 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:30:49 localhost nova_compute[230600]: 2025-11-23 09:30:49.088 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:49 localhost nova_compute[230600]: 2025-11-23 09:30:49.089 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:49 localhost python3.9[235711]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:30:50 localhost nova_compute[230600]: 2025-11-23 09:30:50.319 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:30:50 localhost nova_compute[230600]: 2025-11-23 09:30:50.344 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Triggering sync for uuid 355032bc-9946-4f6d-817c-2bfc8694d41d _sync_power_states 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Nov 23 04:30:50 localhost nova_compute[230600]: 2025-11-23 09:30:50.345 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:30:50 localhost nova_compute[230600]: 2025-11-23 09:30:50.345 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:30:50 localhost nova_compute[230600]: 2025-11-23 09:30:50.346 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:30:50 localhost nova_compute[230600]: 2025-11-23 09:30:50.432 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:30:50 localhost python3.9[235797]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890249.1735647-741-121942688886372/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 
checksum=b056dcaaba7624b93826bb95ee9e82f81bde6c72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:51 localhost python3.9[235905]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:30:51 localhost python3.9[235991]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890250.7956047-741-49126990267938/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=885ccc6f5edd8803cb385bdda5648d0b3017b4e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:30:52 localhost python3.9[236099]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:30:52 localhost python3.9[236185]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890251.8765295-741-134149799912637/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None 
attributes=None Nov 23 04:30:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59342 DF PROTO=TCP SPT=41054 DPT=9102 SEQ=3168736183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0B1E10000000001030307) Nov 23 04:30:53 localhost python3.9[236295]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:30:54 localhost nova_compute[230600]: 2025-11-23 09:30:54.090 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:54 localhost nova_compute[230600]: 2025-11-23 09:30:54.092 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:54 localhost nova_compute[230600]: 2025-11-23 09:30:54.093 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:30:54 localhost nova_compute[230600]: 2025-11-23 09:30:54.093 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:54 localhost nova_compute[230600]: 2025-11-23 09:30:54.122 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:54 localhost nova_compute[230600]: 2025-11-23 09:30:54.123 230604 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:54 localhost python3.9[236405]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:30:54 localhost systemd[1]: Reloading. Nov 23 04:30:54 localhost systemd-sysv-generator[236437]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:30:54 localhost systemd-rc-local-generator[236431]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:30:54 localhost systemd[1]: Listening on Podman API Socket. Nov 23 04:30:55 localhost python3.9[236555]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:30:56 localhost python3.9[236643]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890255.2438393-1257-51817638897714/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 23 04:30:56 localhost python3.9[236698]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:30:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=7672 DF PROTO=TCP SPT=48684 DPT=9882 SEQ=2527218470 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0C0200000000001030307) Nov 23 04:30:57 localhost python3.9[236786]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890255.2438393-1257-51817638897714/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 23 04:30:58 localhost python3.9[236896]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False Nov 23 04:30:59 localhost python3.9[237006]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 23 04:30:59 localhost nova_compute[230600]: 2025-11-23 09:30:59.123 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:59 localhost nova_compute[230600]: 2025-11-23 09:30:59.125 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:30:59 localhost nova_compute[230600]: 2025-11-23 09:30:59.125 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:30:59 localhost nova_compute[230600]: 2025-11-23 09:30:59.125 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:59 
localhost nova_compute[230600]: 2025-11-23 09:30:59.164 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:30:59 localhost nova_compute[230600]: 2025-11-23 09:30:59.164 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:30:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34932 DF PROTO=TCP SPT=53614 DPT=9101 SEQ=2178214657 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0CACA0000000001030307) Nov 23 04:30:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:30:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37116 DF PROTO=TCP SPT=51694 DPT=9105 SEQ=2741287519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0CB7B0000000001030307) Nov 23 04:31:00 localhost systemd[1]: tmp-crun.Q8NkQD.mount: Deactivated successfully. 
Nov 23 04:31:00 localhost podman[237095]: 2025-11-23 09:31:00.035425374 +0000 UTC m=+0.085091281 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 04:31:00 localhost podman[237095]: 2025-11-23 09:31:00.043634424 +0000 UTC m=+0.093300321 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 04:31:00 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:31:01 localhost python3[237134]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False Nov 23 04:31:01 localhost python3[237134]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5",#012 "Digest": "sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-21T06:23:50.144134741Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 505196287,#012 "VirtualSize": 505196287,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/0ef5cc2b89c8a41c643bbf27e239c40ba42d2785cdc67bc5e5d4e7b894568a96/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/3e0acb8482e457c209ba092ee475b7db4ab9004896d495d3f7437d3c12bacbd3/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",#012 "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",#012 "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",#012 "sha256:4ff7b15b3989ce3486d1ee120e82ba5b4acb5e4ad1d931e92c8d8e0851a32a6a",#012 "sha256:847ae301d478780c04ade872e138a0bd4b67a423f03bd51e3a177105d1684cb3"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-11-18T01:56:49.795434035Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:49.795512415Z",#012 "created_by": 
"/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251118\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:52.547242013Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-21T06:10:01.947310748Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947327778Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947358359Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947372589Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94738527Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94739397Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:02.324930938Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:36.349393468Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 
'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 Nov 23 04:31:01 localhost podman[237185]: 2025-11-23 09:31:01.377054291 +0000 UTC m=+0.081485808 container remove 6f17d877a16d33c5c53afe27745e454c98f8aab1068263cd17e32f69fc39c5b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1bd1f352f264f24512a1a2440e47a1f5'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 23 04:31:01 localhost python3[237134]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute Nov 23 04:31:01 localhost podman[237200]: Nov 23 04:31:01 localhost podman[237200]: 2025-11-23 09:31:01.476334281 +0000 UTC m=+0.080980742 container create db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, 
container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2) Nov 23 04:31:01 localhost podman[237200]: 2025-11-23 09:31:01.436407778 +0000 UTC m=+0.041054289 image pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified Nov 23 04:31:01 localhost python3[237134]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 
'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start Nov 23 04:31:02 localhost python3.9[237347]: ansible-ansible.builtin.stat Invoked with 
path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:31:02 localhost nova_compute[230600]: 2025-11-23 09:31:02.803 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:31:02 localhost nova_compute[230600]: 2025-11-23 09:31:02.804 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:31:02 localhost nova_compute[230600]: 2025-11-23 09:31:02.804 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:31:02 localhost nova_compute[230600]: 2025-11-23 09:31:02.805 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:31:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34934 DF PROTO=TCP SPT=53614 DPT=9101 SEQ=2178214657 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0D6E00000000001030307) Nov 23 04:31:03 localhost python3.9[237459]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:31:04 localhost nova_compute[230600]: 2025-11-23 09:31:04.165 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:31:04 localhost nova_compute[230600]: 2025-11-23 09:31:04.167 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:31:04 localhost nova_compute[230600]: 2025-11-23 09:31:04.167 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:31:04 localhost nova_compute[230600]: 2025-11-23 09:31:04.168 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:31:04 localhost nova_compute[230600]: 2025-11-23 09:31:04.205 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:31:04 localhost nova_compute[230600]: 2025-11-23 09:31:04.206 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:31:04 localhost python3.9[237568]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890263.6718848-1449-268053809274375/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None 
local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:31:04 localhost nova_compute[230600]: 2025-11-23 09:31:04.583 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:31:04 localhost nova_compute[230600]: 2025-11-23 09:31:04.583 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:31:04 localhost nova_compute[230600]: 2025-11-23 09:31:04.584 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:31:04 localhost nova_compute[230600]: 2025-11-23 09:31:04.584 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:31:05 localhost python3.9[237623]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 04:31:05 localhost systemd[1]: Reloading. Nov 23 04:31:05 localhost systemd-sysv-generator[237647]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:31:05 localhost systemd-rc-local-generator[237644]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:31:05 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:05 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:05 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:05 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:31:05 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:05 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:05 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:05 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.414 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 
4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.436 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.436 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.437 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.437 230604 DEBUG oslo_service.periodic_task [None 
req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.437 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.438 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.438 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.438 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.439 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.439 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.455 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.455 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.456 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.456 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 
09:31:05.457 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.908 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:31:05 localhost python3.9[237734]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.967 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:31:05 localhost nova_compute[230600]: 2025-11-23 09:31:05.967 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:31:06 localhost nova_compute[230600]: 2025-11-23 09:31:06.188 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:31:06 localhost nova_compute[230600]: 2025-11-23 09:31:06.190 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12918MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:31:06 localhost nova_compute[230600]: 2025-11-23 09:31:06.191 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:31:06 localhost nova_compute[230600]: 2025-11-23 09:31:06.191 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:31:06 localhost nova_compute[230600]: 2025-11-23 09:31:06.256 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:31:06 localhost nova_compute[230600]: 2025-11-23 09:31:06.257 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:31:06 localhost nova_compute[230600]: 2025-11-23 09:31:06.257 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:31:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5704 DF PROTO=TCP SPT=42040 DPT=9100 SEQ=722781209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0E4200000000001030307) Nov 23 04:31:06 localhost nova_compute[230600]: 2025-11-23 09:31:06.300 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:31:06 localhost nova_compute[230600]: 2025-11-23 09:31:06.757 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:31:06 localhost nova_compute[230600]: 2025-11-23 09:31:06.765 230604 
DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:31:06 localhost nova_compute[230600]: 2025-11-23 09:31:06.779 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:31:06 localhost nova_compute[230600]: 2025-11-23 09:31:06.783 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:31:06 localhost nova_compute[230600]: 2025-11-23 09:31:06.783 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:31:07 localhost systemd[1]: Reloading. Nov 23 04:31:07 localhost systemd-rc-local-generator[237784]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 23 04:31:07 localhost systemd-sysv-generator[237792]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:31:07 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:07 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:07 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:07 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:31:07 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:07 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:07 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:07 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:07 localhost systemd[1]: Starting ceilometer_agent_compute container... Nov 23 04:31:07 localhost systemd[1]: Started libcrun container. 
Nov 23 04:31:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fd45e98ab54ef5b4508a5484d3cb4f2d74df282e366e84858b9d46d19d79743/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff) Nov 23 04:31:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fd45e98ab54ef5b4508a5484d3cb4f2d74df282e366e84858b9d46d19d79743/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff) Nov 23 04:31:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:31:07 localhost podman[237799]: 2025-11-23 09:31:07.515093676 +0000 UTC m=+0.134545216 container init db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2) Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: + sudo -E kolla_set_configs Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: sudo: unable to send audit message: Operation not permitted Nov 23 04:31:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:31:07 localhost podman[237799]: 2025-11-23 09:31:07.560832403 +0000 UTC m=+0.180283943 container start db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 23 04:31:07 localhost podman[237799]: ceilometer_agent_compute Nov 23 04:31:07 localhost systemd[1]: Started ceilometer_agent_compute container. Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: INFO:__main__:Validating config file Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: INFO:__main__:Copying service configuration files Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml Nov 23 04:31:07 localhost 
ceilometer_agent_compute[237815]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: INFO:__main__:Writing out command to execute Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: ++ cat /run_command Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: + ARGS= Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: + sudo kolla_copy_cacerts Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: sudo: unable to send audit message: Operation not permitted Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: + [[ ! -n '' ]] Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: + . 
kolla_extend_start Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\''' Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: + umask 0022 Nov 23 04:31:07 localhost ceilometer_agent_compute[237815]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout Nov 23 04:31:07 localhost podman[237824]: 2025-11-23 09:31:07.646165631 +0000 UTC m=+0.081002702 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm) Nov 23 04:31:07 localhost podman[237824]: 2025-11-23 09:31:07.678280557 +0000 UTC m=+0.113117648 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 04:31:07 localhost podman[237824]: unhealthy Nov 23 04:31:07 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:31:07 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Failed with result 'exit-code'. Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.333 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.333 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.333 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.333 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.333 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Nov 23 04:31:08 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.333 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 
'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.334 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] host = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 
04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.335 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.336 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.337 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.338 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] 
tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.339 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.340 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.340 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.340 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.343 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.343 2 DEBUG cotyledon.oslo_config_glue [-] 
compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.343 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.343 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.343 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.343 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.344 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] 
monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.345 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG 
cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.346 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.347 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG 
cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.348 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.349 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.350 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name 
= None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG 
cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.351 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG 
cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.352 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.353 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.353 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.369 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']]. 
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.370 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d]. Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.371 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']]. Nov 23 04:31:08 localhost python3.9[237954]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:31:08 localhost systemd[1]: Stopping ceilometer_agent_compute container... Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.476 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Nov 23 04:31:08 localhost systemd[1]: tmp-crun.jsnnYj.mount: Deactivated successfully. 
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.527 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.544 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 
2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.545 12 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.546 12 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.546 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.546 12 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.546 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 
'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.546 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.546 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] host = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 
09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.547 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.548 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.549 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.550 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.551 12 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.552 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.553 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.554 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.555 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.556 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.557 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.558 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.559 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.560 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.561 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.562 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.563 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.564 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.565 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.566 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.567 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-]
oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.568 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] 
oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.569 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.570 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.574 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.582 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.628 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.628 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.629 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12] Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.908 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET 
http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}ac70b06cd618b02a69e86ac9618a72b930eb6965a99ca4a3b2aa53408954f371" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.985 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Sun, 23 Nov 2025 09:31:08 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-cb876c5f-4670-4d11-adf5-6d047f5427a1 x-openstack-request-id: req-cb876c5f-4670-4d11-adf5-6d047f5427a1 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.985 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "8c32de12-b44b-4285-8afc-2a1d7f236d32", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.986 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-cb876c5f-4670-4d11-adf5-6d047f5427a1 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Nov 23 04:31:08 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:08.988 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET 
http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}ac70b06cd618b02a69e86ac9618a72b930eb6965a99ca4a3b2aa53408954f371" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.017 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Sun, 23 Nov 2025 09:31:08 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-cdd5758b-0163-4591-9b16-c98144903c4f x-openstack-request-id: req-cdd5758b-0163-4591-9b16-c98144903c4f _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.018 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "8c32de12-b44b-4285-8afc-2a1d7f236d32", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.018 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32 used request id req-cdd5758b-0163-4591-9b16-c98144903c4f request 
/usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.019 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.020 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.045 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 50900000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7f4e9a21-ef42-4d47-b853-1cd87ec3777f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50900000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:31:09.020415', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '2498e990-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.222336239, 'message_signature': 'b9cdb5db665f49bbbf44ac8c3837df11419845f5f901373de6beca57b483a6d1'}]}, 'timestamp': '2025-11-23 09:31:09.046249', '_unique_id': '578470e99f1f4f34a29ddd6e0e3300ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 
2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 
04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.053 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.057 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 23 04:31:09 
localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.060 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 355032bc-9946-4f6d-817c-2bfc8694d41d / tapd3912d14-a3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.061 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '218535e6-5649-4cb2-99fd-6a30185a1edc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.057324', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '249b51bc-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': 'c1b1d27da66b313221b48ad408261b6d8802d0331edd22b8f299c1b9233fd649'}]}, 'timestamp': '2025-11-23 09:31:09.061857', '_unique_id': '756471c9fa964d2bb5e73717f4132aac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.062 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.064 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.078 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.078 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f7682290-17ad-4c80-a6b4-4c2d85e8d209', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.064495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '249de6a2-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.242312071, 'message_signature': 'd2810731127dae29db35cb6cd5268b14bf0e1a57e91049328073afb8296da247'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.064495', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '249dfbd8-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.242312071, 'message_signature': '169a5f6cc346ce09b3e3684e85b760cc6c52d84641c7d2f35059e0136a08aafa'}]}, 'timestamp': '2025-11-23 09:31:09.079269', '_unique_id': '823944c1465e42e48d36f337b4fe754c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 
09:31:09.080 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 
09:31:09.080 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.080 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.081 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.082 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 
09:31:09.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96b60b8b-c2a2-47b4-9a35-79e94512b088', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.081816', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '249e7374-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.242312071, 'message_signature': 'af11f5d88a60c144b4b3e126b0617dcd940010d924676453e1d48e45661624ba'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.081816', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '249e8670-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.242312071, 'message_signature': 'fd5204d1da74ebef498d0bd3ddf5c28895eb4c06b49c49e69a8735715690e020'}]}, 'timestamp': '2025-11-23 09:31:09.082804', '_unique_id': 'ea688908b5764be495b6746b97f76d00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.083 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.085 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.085 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.085 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: []
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.086 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 8782 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7b7ff94-075e-44b7-81f8-9c7ee099289c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8782, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.086242', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '249f1ff4-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': 'e978ec3fde2c7473ce6afc3bf44b82c9380b7b53b70bdced8bd4023980afdd58'}]}, 'timestamp': '2025-11-23 09:31:09.086770', '_unique_id': '4a5b544a72ae4a70a634c59a5a8d20db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.087 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.089 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f099576-e657-4a7f-bc6f-fef248c18d2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.089210', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '249f9376-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': 'cccce40001e07a8b87b4851f247f32461a000ce126b6e7002dd0db4c80e4422b'}]}, 'timestamp': '2025-11-23 09:31:09.089730', '_unique_id': 'ebd891b88f5a44ed883a051ea7824aca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.090 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.092 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5cd1ec1-127a-48cc-995a-628769d1348e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.092113', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '24a0040a-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': '4aafd6b3db2673a6885b1264d171064f2aa404c495d56ffa72e6f1476a3d7ccf'}]}, 'timestamp': '2025-11-23 09:31:09.092634', '_unique_id': 'ee013968de36464783f05f72c6756ce0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23
04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.093 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.095 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.133 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.134 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8fc483f3-4fc9-4396-9daf-c415176d404e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.095301', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24a65b8e-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '44243623ff2b7fb6295f2bbf5c0387e7f27118dc8e511b1500593088e88320d1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.095301', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24a67128-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': 'f15878028debb9cd33d1822ba21d79ac11bc1c384dd20e438f01f2e1270317f4'}]}, 'timestamp': '2025-11-23 09:31:09.134715', '_unique_id': 'b3a858d364ce489491d7e5a16d9bd06f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.136 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.137 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.138 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a9c8f44e-41a2-45f0-a4d1-c9787b71e9fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.138049', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '24a70840-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': 'e9223232e5e615820ee279bcd95f0c6c4652f39cfe17c6e90148e19888df66e7'}]}, 'timestamp': '2025-11-23 09:31:09.138626', '_unique_id': '5de347a5e6fc412780f2654ceaa95ec7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 
localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.139 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.140 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.141 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.141 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9f6080d-3a0d-4e4a-9144-d40b32422e51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.141061', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24a77e10-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': 'e5e595e3b7d158103008740632aa91d35e8d589aba66fd892de3d8da650edc18'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.141061', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24a7927e-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': 'ea98e7057c89352e3164089fa15030cb319218c8035e576bf7bb7f24705a5584'}]}, 'timestamp': '2025-11-23 09:31:09.142121', '_unique_id': 'b51153059efb44bc828ddd425552f819'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 
2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.143 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.144 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.144 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7128677a-fd39-4e49-a988-2263210eb894', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:31:09.144516', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '24a80376-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.222336239, 'message_signature': '64b494babb94227c551e9d341841b3e6a7b158b6fc225082b93c3cf07bf5ccea'}]}, 'timestamp': '2025-11-23 09:31:09.145031', '_unique_id': '43631d88dd6746e1ab13c6f0b3ea589e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 
09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.146 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.147 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 165450591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.147 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 35057587 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '88ec5748-917d-4240-a69f-aa9362ccf5e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165450591, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.147295', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24a8720c-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '729efd81428729aa8b80f8f5a8cf56b21d9e2fa532b714493778a7eae2fbbf0d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35057587, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.147295', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24a88526-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '78ff174cd10d1db703c64fe8ac1373e63be2c1ff63a898b7dd10a7f40b258e49'}]}, 'timestamp': '2025-11-23 09:31:09.148319', '_unique_id': '2eb1e3b71942448b971a934cefb12068'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.149 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.150 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.150 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0f627099-2422-4a4f-b893-5c787bfd9098', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.150638', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '24a8f236-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': '6fc93ac0abf2bc804ea5689ddde04fc73b67610013f7b7d3039215c95e6dfae6'}]}, 'timestamp': '2025-11-23 09:31:09.151150', '_unique_id': 'bf951a26d5094dc99af65441dad65b0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 
2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 
localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.152 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.153 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.153 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2441eeff-50c1-460b-ba50-e1cc25088034', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.153449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24a95fb4-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '825a821d8a907ae9f21acce873de2d69527e9fdb1c513808f621c05f3a04cca5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.153449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24a973e6-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '8f3afac8594924b3687b23d5b5b2a76ba2bf8fa0869741dc23b0063614b65f2c'}]}, 'timestamp': '2025-11-23 09:31:09.154471', '_unique_id': 'e983098db8f44b459d2dcd66edad8eb1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 
2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.155 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.157 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e64fbdd7-200f-4df4-8164-d4c1e83b7f84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.157048', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '24a9edbc-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': '7e00dacb34b2158af772e47a59011159c7374c6b162ec35491d69f37c94929c1'}]}, 'timestamp': '2025-11-23 09:31:09.157594', '_unique_id': '8ad31f193bb841f0a30b0a890caa7829'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.158 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.159 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd22bef6d-c85f-48b1-a91d-41c7af80a2c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.159934', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '24aa5db0-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': 'a98524b19f26d413b691c287d47958bca7349d673b27552d9aedbe85de31e55c'}]}, 'timestamp': '2025-11-23 09:31:09.160427', '_unique_id': '281edaf4c0d841e8b5f02266b96a084f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 
localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.161 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.162 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.162 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1347736452 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.163 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 205057051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8525dc5a-575c-4f54-b413-bd9d336a0c81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1347736452, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.162739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24aacc1e-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '730be7c8da68a290580906bbb9e398e66a4948acf251f589cd20368385f493c4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205057051, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.162739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24aadd80-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': 'c8b3750a5bebf6c1b00b3ff7542763844f0012871f41b88d2c38efdd571eea75'}]}, 'timestamp': '2025-11-23 09:31:09.163693', '_unique_id': '2fc222ad4ccc42e686bca401bb79f883'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 
09:31:09.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 
2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.164 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.166 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.166 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.166 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.166 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.167 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9115f1a2-1bcf-49f9-9444-38a2886c8e0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.166989', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '24ab7196-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': '4f660c1915cd3ca021983258fc0fef6cc7875fbfd9228384b15652898a132d53'}]}, 'timestamp': '2025-11-23 09:31:09.167492', '_unique_id': 'f017e5c44f6647e2ac0bea0d31c593a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 
localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.168 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.169 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.170 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1eaf0d7a-0ca0-4a72-848f-28a6d66bade8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:09.170039', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '24abe838-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.235123224, 'message_signature': '8cd7b07b24d579f6ff5075f4b4aa661833e9ab0b81b2e1cc22af1cd0a28632be'}]}, 'timestamp': '2025-11-23 09:31:09.170525', '_unique_id': '2cebef03a1e6489ca64b1c586197157e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 
09:31:09.171 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 
localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.171 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.172 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.172 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6bef5078-486a-400d-aca7-3917f1f7b2b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.172582', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24ac47ce-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '78eb6a15a7fe52aea356ceeff0c612fa8a82881d104a91eb4892ccb03c879f3e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.172582', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24ac52fa-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.273138876, 'message_signature': '44bfcc62cabef0026b8c59c4e48574554a0267e8f7a989529886e893283557f5'}]}, 'timestamp': '2025-11-23 09:31:09.173150', '_unique_id': '6fe1a339b59343c8bc639af994fa0b2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.173 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.174 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.174 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.174 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.174 
12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.175 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.175 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.175 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.175 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.175 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd3523519-57f5-46d5-ad83-89dc9e42a83f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:09.175416', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '24acb754-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.242312071, 'message_signature': '5be757c87304cac83bc448ce3db90441f0a4632f651b3e36cab6eebcb493064a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:09.175416', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '24acc208-c84f-11f0-87eb-fa163e72a351', 'monotonic_time': 10212.242312071, 'message_signature': '3b6afa3ff0c4e6367fe54d0e2879ea2b58118aa5255374144807c436b51846ef'}]}, 'timestamp': '2025-11-23 09:31:09.176011', '_unique_id': '6cb7a21b1884453bafed89f2120574b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 
09:31:09.176 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:09 localhost 
ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 
09:31:09.176 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.176 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:09 localhost journal[203731]: End of file while reading data: Input/output error Nov 23 04:31:09 localhost journal[203731]: End of file while reading data: Input/output error Nov 23 04:31:09 localhost ceilometer_agent_compute[237815]: 2025-11-23 09:31:09.190 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320 Nov 23 04:31:09 localhost nova_compute[230600]: 2025-11-23 09:31:09.207 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:31:09 localhost nova_compute[230600]: 2025-11-23 09:31:09.209 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:31:09 localhost 
nova_compute[230600]: 2025-11-23 09:31:09.210 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:31:09 localhost nova_compute[230600]: 2025-11-23 09:31:09.210 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:31:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63975 DF PROTO=TCP SPT=51130 DPT=9100 SEQ=477593843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0EFA00000000001030307) Nov 23 04:31:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:31:09.241 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:31:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:31:09.241 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:31:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:31:09.243 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:31:09 localhost nova_compute[230600]: 2025-11-23 09:31:09.248 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 
23 04:31:09 localhost nova_compute[230600]: 2025-11-23 09:31:09.249 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:31:09 localhost systemd[1]: libpod-db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.scope: Deactivated successfully. Nov 23 04:31:09 localhost podman[237961]: 2025-11-23 09:31:09.334793721 +0000 UTC m=+0.897766511 container died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm) Nov 23 04:31:09 localhost systemd[1]: libpod-db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.scope: Consumed 1.303s CPU time. Nov 23 04:31:09 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.timer: Deactivated successfully. Nov 23 04:31:09 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:31:09 localhost systemd[1]: var-lib-containers-storage-overlay-6fd45e98ab54ef5b4508a5484d3cb4f2d74df282e366e84858b9d46d19d79743-merged.mount: Deactivated successfully. Nov 23 04:31:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44-userdata-shm.mount: Deactivated successfully. 
Nov 23 04:31:09 localhost podman[237961]: 2025-11-23 09:31:09.452420411 +0000 UTC m=+1.015393211 container cleanup db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3) Nov 23 04:31:09 localhost podman[237961]: ceilometer_agent_compute Nov 23 04:31:09 localhost podman[237992]: 2025-11-23 09:31:09.597849229 +0000 UTC m=+0.108684188 
container cleanup db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2) Nov 23 04:31:09 localhost podman[237992]: ceilometer_agent_compute Nov 23 04:31:09 localhost systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully. Nov 23 04:31:09 localhost systemd[1]: Stopped ceilometer_agent_compute container. 
Nov 23 04:31:09 localhost systemd[1]: Starting ceilometer_agent_compute container... Nov 23 04:31:09 localhost systemd[1]: Started libcrun container. Nov 23 04:31:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fd45e98ab54ef5b4508a5484d3cb4f2d74df282e366e84858b9d46d19d79743/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff) Nov 23 04:31:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fd45e98ab54ef5b4508a5484d3cb4f2d74df282e366e84858b9d46d19d79743/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff) Nov 23 04:31:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:31:09 localhost podman[238004]: 2025-11-23 09:31:09.758374235 +0000 UTC m=+0.134158563 container init db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: + sudo -E kolla_set_configs Nov 23 04:31:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: sudo: unable to send audit message: Operation not permitted Nov 23 04:31:09 localhost podman[238004]: 2025-11-23 09:31:09.799521147 +0000 UTC m=+0.175305485 container start db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 23 04:31:09 localhost podman[238004]: ceilometer_agent_compute Nov 23 04:31:09 localhost systemd[1]: Started ceilometer_agent_compute container. 
Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Validating config file Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Copying service configuration files Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Copying 
/var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: INFO:__main__:Writing out command to execute Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: ++ cat /run_command Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: + ARGS= Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: + sudo kolla_copy_cacerts Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: sudo: unable to send audit message: Operation not permitted Nov 23 04:31:09 localhost podman[238026]: 2025-11-23 09:31:09.885590718 +0000 UTC m=+0.082184610 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: + [[ ! -n '' ]] Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: + . kolla_extend_start Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\''' Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: + umask 0022 Nov 23 04:31:09 localhost ceilometer_agent_compute[238018]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout Nov 23 04:31:09 localhost podman[238026]: 2025-11-23 09:31:09.915115092 +0000 UTC m=+0.111709014 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 
'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:31:09 localhost podman[238026]: unhealthy Nov 23 04:31:09 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Main process exited, code=exited, status=1/FAILURE Nov 23 04:31:09 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Failed with result 'exit-code'. 
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.592 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.592 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.592 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.592 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.593 2 DEBUG 
cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] host = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.594 2 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] 
logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.595 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.596 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.597 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG 
cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.598 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.599 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] 
monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.600 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.601 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 
23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.602 2 DEBUG cotyledon.oslo_config_glue 
[-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG 
cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.603 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.604 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.605 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.606 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.624 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']]. Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.625 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d]. Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.626 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']]. 
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.643 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.769 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.769 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.769 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.769 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.769 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.769 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.769 12 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 
'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.770 12 DEBUG cotyledon.oslo_config_glue [-] host = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.771 12 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.772 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.773 12 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.774 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.775 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.776 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.777 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.778 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.779 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.780 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.781 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.782 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.783 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.784 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.785 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:31:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.786 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG 
cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.787 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.790 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64 Nov 23 04:31:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:10.798 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.139 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}4943fe3d2d0f6bc86f045500bb55c0a9758cf8f280687dae1712f3a359331ec1" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.190 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Sun, 23 Nov 2025 09:31:11 GMT Keep-Alive: timeout=5, max=100 
OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-f6956fff-fab4-465a-abff-146d12a09e04 x-openstack-request-id: req-f6956fff-fab4-465a-abff-146d12a09e04 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.190 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "8c32de12-b44b-4285-8afc-2a1d7f236d32", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.190 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-f6956fff-fab4-465a-abff-146d12a09e04 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.192 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}4943fe3d2d0f6bc86f045500bb55c0a9758cf8f280687dae1712f3a359331ec1" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.205 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Sun, 23 Nov 2025 09:31:11 GMT Keep-Alive: timeout=5, 
max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-278583a1-3626-4d3a-aac2-3600da13fd29 x-openstack-request-id: req-278583a1-3626-4d3a-aac2-3600da13fd29 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.205 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "8c32de12-b44b-4285-8afc-2a1d7f236d32", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.205 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/8c32de12-b44b-4285-8afc-2a1d7f236d32 used request id req-278583a1-3626-4d3a-aac2-3600da13fd29 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.207 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 
'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.241 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.242 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0d752762-388a-4476-9fe6-6a513a7d6d7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.207942', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25e81bfe-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': '792906acc776b15dea53ed642914e5c0b50eb02c4fe38c66bfb2e45884c30fff'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.207942', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25e836e8-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': 'ee1902992f5adc0102d1f06a03db6048390b1a05a5ba9fe2ced02116ebba5990'}]}, 'timestamp': '2025-11-23 09:31:11.243486', '_unique_id': '868f428b2a8b4b8a82eb4f06a639b112'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.252 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.269 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.269 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd42c8a70-a277-46f6-8199-ab7df061edf0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.256174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25ec364e-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.433970328, 'message_signature': '19ef297c3c5fae48f6031d62226604b290e60d01193dbacd1453c67d0c5e6d97'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': 
'2025-11-23T09:31:11.256174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25ec49e0-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.433970328, 'message_signature': 'a6b05d07da2de1ad04ffacc00239b25d2f5b7198ea83cae09ef205abedddd8d2'}]}, 'timestamp': '2025-11-23 09:31:11.270157', '_unique_id': '4cd8393cee904d09b410cd68725160c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.271 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.272 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.272 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.273 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f57af3f8-1f00-484c-839e-ef4f5b3169fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.272841', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25ecc5d2-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': 'cb0433bb4b666e924724de00b4508eb294a34f28cebdd9edbd65aee0c77ce451'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': 
'1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.272841', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25ecd5f4-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': '87c42c7e59b4ff8092491a035fe3e1c449015a85fe3525e5e02dde5281871d04'}]}, 'timestamp': '2025-11-23 09:31:11.273730', '_unique_id': '6f4cdd4c79964b07b6024b80698cd5f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.274 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.276 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.279 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 355032bc-9946-4f6d-817c-2bfc8694d41d / tapd3912d14-a3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.279 12 
DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 8782 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b9b19d5-a20c-4805-9a73-c9928b9b39f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8782, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.276138', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25edc536-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': 
'ec6bb8b43f732d6cb4ccade4178446970367fc55e27b28735a7d798c6929c7ce'}]}, 'timestamp': '2025-11-23 09:31:11.279926', '_unique_id': '9293d911f8794825a2a84628e196043f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.280 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.282 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.282 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.282 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '96968f39-306e-41eb-a9d0-3b66b2f61027', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.282255', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25ee33f4-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.433970328, 'message_signature': '2284765210958643908a6580dbf48a3015e3f84258935f7828b3f08857bbc9e1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.282255', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25ee45b0-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.433970328, 'message_signature': '24113c3485b26f4b2fdbd94a6b5d364a94357e6612e63fd94181043721da0d26'}]}, 'timestamp': '2025-11-23 09:31:11.283183', '_unique_id': 'd703ccc50ffc48e2bb561fc135e355bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23
09:31:11.284 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:31:11 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23
09:31:11.284 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.284 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.285 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.285 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.286 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: []
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.286 12 INFO
ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.286 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1df11bfd-d474-419b-a9b1-5510637f7112', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.286736', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None},
'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25eee4a2-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': '841f864d5672964a0da071ef3c6d00a9f3067cd16cd65d56c3e63eeb29f1a1ba'}]}, 'timestamp': '2025-11-23 09:31:11.287278', '_unique_id': '5ad1ed05ee814ef7994c14a801412048'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging self._connection =
self._establish_connection()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23
09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line
450, in _reraise_as_library_errors
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.288 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.289 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.289 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '88e0df29-e9c3-4313-b948-3c3984fac4ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.289545', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25ef50cc-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': 'aabb3e636f1cec66aecf5e0d62d74e8bb77758ed027ab349b92869ec678f7014'}]}, 'timestamp': '2025-11-23 09:31:11.290051', '_unique_id': '3ed680710dae4338901e945e0f489403'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:11 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.290 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.292 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.292 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.292 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.292 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.292 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c6caa6b8-a8fe-43e2-8f87-2192c62470ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.292777', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25efd038-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': '5668a565b3d5124a3b584e004b9065f2a6a2bd0ec3fdaf59995a28f63871390c'}]}, 'timestamp': '2025-11-23 09:31:11.293277', '_unique_id': 'bcd1cc1dcaf046f59f739318efb80a15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.294 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.295 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.295 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.295 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.295 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.296 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd448daf8-297a-450c-b4ad-c59165fc5d6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.295997', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25f04cfc-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': 'bf396501856c65a621c0adffba0246641a96684dadc9d24c2e10651372157763'}]}, 'timestamp': '2025-11-23 09:31:11.296472', '_unique_id': '87aecc968c46486993381819e76a03ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.297 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.298 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.317 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 50920000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'daed470a-8efc-41f2-b4f1-7f44b09ca862', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50920000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:31:11.298737', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '25f38ce6-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.494408789, 'message_signature': 'd56263cba6d3b88705bb463b6c1a6c324b6f7d14ef0914a95ff0d3d603474753'}]}, 'timestamp': '2025-11-23 09:31:11.317774', '_unique_id': '0d36c66607be43ee8c2a286f48baea17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.318 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.320 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.320 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.320 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7979a024-fb01-4deb-a40b-a19568c79610', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.320379', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25f4057c-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': '78c5a77f933751756f4eedf5fda598b95364869382a9e2064ef0c57959f0540a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.320379', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25f4176a-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': 'e74580ff965d57ac3c9be3443336bcfb76cf92c596aa679787ea6e7d7daa1e8f'}]}, 'timestamp': '2025-11-23 09:31:11.321276', '_unique_id': 'ea390a3a7cfc49d4932ba0000f829b3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.322 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.323 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.323 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c49c4d1-5bad-4fe8-abe0-8148c6010cad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.323535', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25f480a6-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': 'bf1fb8fab02ad2cfd213b9424f49b93b996248c29918dc9807b61dca3d78fbcf'}]}, 'timestamp': '2025-11-23 09:31:11.324034', '_unique_id': 'b04abe0dec9145159ef9d1ca756a50ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:31:11 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.324 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.326 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.326 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1347736452 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.326 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 205057051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b200d2f-599f-49be-9cf8-8f49aa6f4bdc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1347736452, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.326183', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25f4e7b2-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': '2c2b094e932177539f3278789f83d16b667fe1cda4a6e6803fbcdbe165df4627'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205057051, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.326183', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25f4f7ca-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': 'd0f04e11fe1dbb73762c61e5e4391b8fdfa093f5ce2d885fe6556d7098b752f7'}]}, 'timestamp': '2025-11-23 09:31:11.327068', '_unique_id': 'b62d86c537c84c0ab9fb14ffb9aaa461'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:31:11.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.328 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.329 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.329 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.329 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.329 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.329 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 165450591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.330 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 35057587 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '72586825-1252-425a-b056-68b381651c56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165450591, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.329803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25f57678-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': '9d50ec48ce5ebb9620cb9dd84a7583b83bcdb2aad993c851bbcf5f86cf5f0046'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35057587, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.329803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25f58834-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': '5e347d7a21698e2acfd5efa1b1dcfdcbe5038df26b2b4323ec9e59226e4a4a6a'}]}, 'timestamp': '2025-11-23 09:31:11.330718', '_unique_id': '26a7e8b199394657b900005e0586b0fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.331 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.332 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.333 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9d18161d-6d89-41ea-9392-48e8d56eca28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.333013', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25f5f2f6-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': 'f5d53cc6d7455c0ac0c5e975470e442d0b93648bbbc20f304632d47aa1d9aa68'}]}, 'timestamp': '2025-11-23 09:31:11.333484', '_unique_id': 'e1f034a9ebd0431aaed53fee141b84a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.334 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.335 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.335 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.336 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af0cb322-a430-4fc5-a2fd-387640978536', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.335667', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25f65bba-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.433970328, 'message_signature': '2bd7ff7d1eb07d10a217937a53717a956a01bbe5cbcb895f72848f944013bbf1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.335667', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25f66c36-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.433970328, 'message_signature': 'ec4b3b886ed880ce5b0eab1f31c6bc1673f1e856e36733abb26021c1a469daa7'}]}, 'timestamp': '2025-11-23 09:31:11.336556', '_unique_id': '574b6669eb6a424189be3d03fe557455'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.337 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.338 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.338 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '558f20e1-51fe-417d-8ed9-5dfaad043dc2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.338750', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25f6d61c-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': 'a65aba3040e9165b3672b36f6ba86c871c81cb507e0641948bc70d25c984692c'}]}, 'timestamp': '2025-11-23 09:31:11.339411', '_unique_id': '7253ab9b47c0435b858666a5a9ee281a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.340 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.341 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.342 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.342 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '73d6358e-456d-4e08-b2a0-51f4e6015ea0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:31:11.341992', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25f753b2-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': 'dbe98f266ead22e858af9be52c6e06abfec6167e331da65eea9adfd39fbf905d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:31:11.341992', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '25f76442-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.385676201, 'message_signature': '8ff575bc49a5bb613c762618603fb4a78976ecfc60e7a4ca20294a49afabac3a'}]}, 'timestamp': '2025-11-23 09:31:11.342931', '_unique_id': '2a6b8c74152d4e928a1da46d7cbf6ed6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.343 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.345 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.345 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f3bc89d1-c744-4cc9-8ca9-b6a81caa77f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.345239', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25f7d04e-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': 'a16befeed15889d875ee29b95f498b82a34d16c82a444230427d9f3a4398f434'}]}, 'timestamp': '2025-11-23 09:31:11.345699', '_unique_id': '53f3a97f857f498592ae1801360c8427'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.346 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.347 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.347 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b84d08e-40cb-4947-b668-35a3ed83a6b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:31:11.347641', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '25f82a9e-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.494408789, 
'message_signature': 'a21386dfdc3ff2a74c3c057c06f5fcc15efe4c17663a3d2229f0b9940436f637'}]}, 'timestamp': '2025-11-23 09:31:11.347932', '_unique_id': '3383384e18834abe90462bef2b74adbc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.348 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.349 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.349 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1772a36d-f81e-4a40-b1ed-ff604ffb6e87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:31:11.349468', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '25f8721a-c84f-11f0-bde4-fa163e72a351', 'monotonic_time': 10214.453899409, 'message_signature': '720d7a3f90fd0c296b88e3010b0ee23f07ff6b9c556f9fc1306b08fbe5f4b722'}]}, 'timestamp': '2025-11-23 09:31:11.349753', '_unique_id': 'bb0187783ce8424fa3e90eef2299e7b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:31:11 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:31:11 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:31:11.350 12 ERROR oslo_messaging.notify.messaging Nov 23 04:31:11 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24354 DF PROTO=TCP SPT=50984 DPT=9882 SEQ=2932324281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D0F9F60000000001030307) Nov 23 04:31:12 localhost python3.9[238165]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:31:12 localhost python3.9[238253]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890271.752434-1545-15539360041088/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 23 04:31:14 localhost python3.9[238363]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False Nov 23 04:31:14 localhost nova_compute[230600]: 2025-11-23 09:31:14.248 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:31:14 localhost python3.9[238473]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 23 04:31:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24356 DF PROTO=TCP SPT=50984 DPT=9882 SEQ=2932324281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2D105E00000000001030307) Nov 23 04:31:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:31:16 localhost podman[238491]: 2025-11-23 09:31:16.052960667 +0000 UTC m=+0.107341663 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Nov 23 04:31:16 localhost podman[238491]: 2025-11-23 09:31:16.09453645 +0000 UTC m=+0.148917396 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base 
Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 04:31:16 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:31:16 localhost python3[238608]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False Nov 23 04:31:16 localhost podman[238643]: Nov 23 04:31:16 localhost podman[238643]: 2025-11-23 09:31:16.837665148 +0000 UTC m=+0.078589733 container create a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors ) Nov 23 04:31:16 localhost podman[238643]: 2025-11-23 09:31:16.797767158 +0000 UTC m=+0.038691773 image pull 
quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c Nov 23 04:31:16 localhost python3[238608]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.disable-exporter-metrics --collector.systemd 
--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl Nov 23 04:31:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59344 DF PROTO=TCP SPT=41054 DPT=9102 SEQ=3168736183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D112200000000001030307) Nov 23 04:31:18 localhost python3.9[238792]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:31:19 localhost nova_compute[230600]: 2025-11-23 09:31:19.253 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:31:19 localhost nova_compute[230600]: 2025-11-23 09:31:19.254 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:31:19 localhost nova_compute[230600]: 2025-11-23 09:31:19.255 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:31:19 localhost nova_compute[230600]: 2025-11-23 09:31:19.255 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:31:19 localhost nova_compute[230600]: 2025-11-23 09:31:19.283 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:31:19 localhost nova_compute[230600]: 2025-11-23 09:31:19.284 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:31:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:31:19 localhost systemd[1]: tmp-crun.4TdPZz.mount: Deactivated successfully. Nov 23 04:31:19 localhost podman[238904]: 2025-11-23 09:31:19.524343999 +0000 UTC m=+0.097027160 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent) Nov 23 04:31:19 localhost podman[238904]: 2025-11-23 09:31:19.562773483 +0000 UTC m=+0.135456624 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent) Nov 23 04:31:19 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:31:19 localhost python3.9[238905]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:31:20 localhost python3.9[239031]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890279.6884835-1704-280128566221415/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:31:20 localhost python3.9[239086]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 04:31:20 localhost systemd[1]: Reloading. Nov 23 04:31:20 localhost systemd-rc-local-generator[239107]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:31:20 localhost systemd-sysv-generator[239113]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:31:21 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:21 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:21 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:21 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:31:21 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:21 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:21 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:21 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:21 localhost python3.9[239176]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:31:21 localhost systemd[1]: Reloading. Nov 23 04:31:21 localhost systemd-sysv-generator[239205]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:31:21 localhost systemd-rc-local-generator[239202]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:31:22 localhost systemd[1]: Starting node_exporter container... Nov 23 04:31:22 localhost systemd[1]: Started libcrun container. Nov 23 04:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:31:22 localhost podman[239216]: 2025-11-23 09:31:22.355258309 +0000 UTC m=+0.132545594 container init a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.373Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)" Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.373Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)" Nov 23 04:31:22 localhost node_exporter[239229]: 
ts=2025-11-23T09:31:22.373Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required." Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.374Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$ Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.374Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.374Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.374Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice) Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/) Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$ Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:110 level=info msg="Enabled collectors" Nov 23 04:31:22 localhost 
node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=arp Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=bcache Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=bonding Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=btrfs Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=conntrack Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=cpu Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=cpufreq Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=diskstats Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=edac Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=fibrechannel Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=filefd Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=filesystem Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=infiniband Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=ipvs Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=loadavg Nov 23 04:31:22 
localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=mdadm Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=meminfo Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=netclass Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=netdev Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=netstat Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=nfs Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=nfsd Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=nvme Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=schedstat Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=sockstat Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.375Z caller=node_exporter.go:117 level=info collector=softnet Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=node_exporter.go:117 level=info collector=systemd Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=node_exporter.go:117 level=info collector=tapestats Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=node_exporter.go:117 level=info collector=udp_queues Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=node_exporter.go:117 level=info collector=vmstat Nov 23 
04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=node_exporter.go:117 level=info collector=xfs Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=node_exporter.go:117 level=info collector=zfs Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100 Nov 23 04:31:22 localhost node_exporter[239229]: ts=2025-11-23T09:31:22.376Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100 Nov 23 04:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:31:22 localhost podman[239216]: 2025-11-23 09:31:22.383475623 +0000 UTC m=+0.160762898 container start a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck 
node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 04:31:22 localhost podman[239216]: node_exporter Nov 23 04:31:22 localhost systemd[1]: Started node_exporter container. Nov 23 04:31:22 localhost podman[239239]: 2025-11-23 09:31:22.475459154 +0000 UTC m=+0.085391746 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 04:31:22 localhost podman[239239]: 2025-11-23 09:31:22.489223945 +0000 UTC m=+0.099156537 container exec_died 
a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:31:22 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:31:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36411 DF PROTO=TCP SPT=35588 DPT=9102 SEQ=2246745688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D127200000000001030307) Nov 23 04:31:24 localhost python3.9[239371]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:31:24 localhost systemd[1]: Stopping node_exporter container... Nov 23 04:31:24 localhost systemd[1]: tmp-crun.SUbz43.mount: Deactivated successfully. Nov 23 04:31:24 localhost podman[239375]: 2025-11-23 09:31:24.283250223 +0000 UTC m=+0.058388961 container died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck 
node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:31:24 localhost systemd[1]: libpod-a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.scope: Deactivated successfully. Nov 23 04:31:24 localhost nova_compute[230600]: 2025-11-23 09:31:24.285 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:31:24 localhost nova_compute[230600]: 2025-11-23 09:31:24.287 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:31:24 localhost nova_compute[230600]: 2025-11-23 09:31:24.288 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:31:24 localhost nova_compute[230600]: 2025-11-23 09:31:24.288 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:31:24 localhost nova_compute[230600]: 2025-11-23 09:31:24.314 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:31:24 localhost nova_compute[230600]: 2025-11-23 09:31:24.315 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:31:24 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.timer: Deactivated successfully. Nov 23 04:31:24 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:31:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9-userdata-shm.mount: Deactivated successfully. Nov 23 04:31:24 localhost systemd[1]: var-lib-containers-storage-overlay-550e0a847b911bdcbe7aa7c90ef7335ef7fd8eecced200d056453fa3475177c6-merged.mount: Deactivated successfully. Nov 23 04:31:24 localhost podman[239375]: 2025-11-23 09:31:24.416771726 +0000 UTC m=+0.191910434 container cleanup a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 04:31:24 localhost podman[239375]: node_exporter Nov 23 04:31:24 localhost systemd[1]: edpm_node_exporter.service: 
Main process exited, code=exited, status=2/INVALIDARGUMENT Nov 23 04:31:24 localhost podman[239402]: 2025-11-23 09:31:24.513507815 +0000 UTC m=+0.066044589 container cleanup a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 04:31:24 localhost podman[239402]: node_exporter Nov 23 04:31:24 localhost systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'. Nov 23 04:31:24 localhost systemd[1]: Stopped node_exporter container. Nov 23 04:31:24 localhost systemd[1]: Starting node_exporter container... Nov 23 04:31:24 localhost systemd[1]: Started libcrun container. 
Nov 23 04:31:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:31:24 localhost podman[239412]: 2025-11-23 09:31:24.67712506 +0000 UTC m=+0.131660405 container init a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 04:38:11 localhost nova_compute[230600]: 2025-11-23 09:38:11.145 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 
04:38:11 localhost nova_compute[230600]: 2025-11-23 09:38:11.147 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:38:11 localhost podman[258343]: 2025-11-23 09:38:11.159408909 +0000 UTC m=+0.129154341 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, build-date=2025-09-24T08:57:55, vcs-type=git, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, release=553, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.) 
Nov 23 04:38:11 localhost nova_compute[230600]: 2025-11-23 09:38:11.216 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:38:11 localhost nova_compute[230600]: 2025-11-23 09:38:11.217 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:38:11 localhost podman[258343]: 2025-11-23 09:38:11.26825875 +0000 UTC m=+0.238004162 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, architecture=x86_64, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, release=553, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
build-date=2025-09-24T08:57:55) Nov 23 04:38:11 localhost rsyslogd[760]: imjournal: 6901 messages lost due to rate-limiting (20000 allowed within 600 seconds) Nov 23 04:38:11 localhost nova_compute[230600]: 2025-11-23 09:38:11.430 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:38:11 localhost nova_compute[230600]: 2025-11-23 09:38:11.432 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12194MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:38:11 localhost nova_compute[230600]: 2025-11-23 09:38:11.433 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:38:11 localhost nova_compute[230600]: 2025-11-23 09:38:11.433 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:38:11 localhost nova_compute[230600]: 2025-11-23 09:38:11.490 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:38:11 localhost nova_compute[230600]: 2025-11-23 09:38:11.490 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:38:11 localhost nova_compute[230600]: 2025-11-23 09:38:11.491 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:38:11 localhost nova_compute[230600]: 2025-11-23 09:38:11.531 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:38:11 localhost podman[240668]: time="2025-11-23T09:38:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:38:11 localhost podman[240668]: @ - - [23/Nov/2025:09:38:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146307 "" "Go-http-client/1.1" Nov 23 04:38:11 localhost podman[240668]: @ - - [23/Nov/2025:09:38:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16781 "" "Go-http-client/1.1" Nov 23 04:38:12 localhost nova_compute[230600]: 2025-11-23 09:38:12.025 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df 
--format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:38:12 localhost nova_compute[230600]: 2025-11-23 09:38:12.035 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:38:12 localhost nova_compute[230600]: 2025-11-23 09:38:12.046 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:38:12 localhost nova_compute[230600]: 2025-11-23 09:38:12.048 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:38:12 localhost nova_compute[230600]: 2025-11-23 09:38:12.048 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 
23 04:38:12 localhost python3.9[258611]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:38:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:38:12 localhost podman[258630]: 2025-11-23 09:38:12.941814962 +0000 UTC m=+0.068271016 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 23 04:38:12 localhost podman[258630]: 2025-11-23 09:38:12.97472523 +0000 UTC 
m=+0.101181244 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 04:38:12 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:38:13 localhost nova_compute[230600]: 2025-11-23 09:38:13.125 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:38:13 localhost nova_compute[230600]: 2025-11-23 09:38:13.126 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:38:13 localhost nova_compute[230600]: 2025-11-23 09:38:13.157 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:38:13 localhost nova_compute[230600]: 2025-11-23 09:38:13.157 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:38:13 localhost python3.9[258740]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890692.355187-498-158147993971947/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=1067e04911e84d9dc262158a63dd8e464b0e5dfd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:38:13 localhost nova_compute[230600]: 2025-11-23 
09:38:13.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:38:13 localhost nova_compute[230600]: 2025-11-23 09:38:13.717 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:38:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:38:14 localhost podman[258796]: 2025-11-23 09:38:14.017990322 +0000 UTC m=+0.075523611 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 23 04:38:14 localhost podman[258796]: 2025-11-23 09:38:14.035295377 +0000 UTC m=+0.092828626 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, name=ubi9-minimal, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41) Nov 23 04:38:14 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:38:14 localhost python3.9[258868]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:38:14 localhost python3.9[258954]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890693.5041487-543-89770355955367/.source follow=False _original_basename=haproxy.j2 checksum=e4288860049c1baef23f6e1bb6c6f91acb5432e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:38:15 localhost python3.9[259062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:38:15 localhost python3.9[259148]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890695.0738232-543-211759360281348/.source follow=False _original_basename=dnsmasq.j2 checksum=efc19f376a79c40570368e9c2b979cde746f1ea8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:38:16 localhost nova_compute[230600]: 2025-11-23 09:38:16.150 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:38:16 localhost nova_compute[230600]: 2025-11-23 09:38:16.152 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms 
timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:38:16 localhost nova_compute[230600]: 2025-11-23 09:38:16.152 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:38:16 localhost nova_compute[230600]: 2025-11-23 09:38:16.152 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:38:16 localhost nova_compute[230600]: 2025-11-23 09:38:16.202 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:38:16 localhost nova_compute[230600]: 2025-11-23 09:38:16.203 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:38:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49020 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1859452193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D773DB0000000001030307) Nov 23 04:38:16 localhost python3.9[259256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:38:17 localhost python3.9[259311]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:38:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:38:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49021 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1859452193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D777E00000000001030307) Nov 23 04:38:17 localhost podman[259312]: 2025-11-23 09:38:17.350078747 +0000 UTC m=+0.076962574 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 04:38:17 localhost podman[259312]: 2025-11-23 09:38:17.360229052 +0000 UTC m=+0.087112949 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 23 04:38:17 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:38:17 localhost python3.9[259437]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:38:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21261 DF PROTO=TCP SPT=60898 DPT=9102 SEQ=166416658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D77A200000000001030307) Nov 23 04:38:18 localhost python3.9[259523]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890697.382037-630-187159174412963/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:38:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49022 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1859452193 
ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D77FE00000000001030307) Nov 23 04:38:19 localhost python3.9[259631]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:38:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:38:20 localhost podman[259724]: 2025-11-23 09:38:20.021839293 +0000 UTC m=+0.073577769 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 04:38:20 localhost podman[259724]: 2025-11-23 09:38:20.05853604 +0000 UTC m=+0.110274496 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:38:20 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:38:20 localhost python3.9[259754]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:38:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42265 DF PROTO=TCP SPT=59762 DPT=9102 SEQ=4153560362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D784200000000001030307) Nov 23 04:38:20 localhost python3.9[259876]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:38:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. 
Nov 23 04:38:21 localhost podman[259934]: 2025-11-23 09:38:21.178335301 +0000 UTC m=+0.078701279 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 23 04:38:21 localhost podman[259934]: 2025-11-23 09:38:21.192220661 +0000 UTC m=+0.092586619 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 23 04:38:21 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:38:21 localhost nova_compute[230600]: 2025-11-23 09:38:21.244 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:38:21 localhost nova_compute[230600]: 2025-11-23 09:38:21.246 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:38:21 localhost nova_compute[230600]: 2025-11-23 09:38:21.246 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:38:21 localhost nova_compute[230600]: 2025-11-23 09:38:21.247 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:38:21 localhost nova_compute[230600]: 2025-11-23 09:38:21.248 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:38:21 localhost nova_compute[230600]: 2025-11-23 09:38:21.249 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:38:21 localhost nova_compute[230600]: 2025-11-23 09:38:21.252 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:38:21 localhost python3.9[259933]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None 
modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:38:21 localhost python3.9[260063]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:38:22 localhost python3.9[260120]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:38:23 localhost python3.9[260230]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:38:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49023 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1859452193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D78FA00000000001030307) Nov 23 04:38:23 localhost python3.9[260340]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:38:24 localhost 
python3.9[260397]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:38:25 localhost python3.9[260507]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:38:26 localhost nova_compute[230600]: 2025-11-23 09:38:26.278 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:38:26 localhost nova_compute[230600]: 2025-11-23 09:38:26.279 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:38:26 localhost nova_compute[230600]: 2025-11-23 09:38:26.279 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5026 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:38:26 localhost nova_compute[230600]: 2025-11-23 09:38:26.279 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:38:26 localhost nova_compute[230600]: 2025-11-23 09:38:26.281 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:38:26 localhost nova_compute[230600]: 2025-11-23 09:38:26.282 230604 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:38:26 localhost nova_compute[230600]: 2025-11-23 09:38:26.284 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:38:26 localhost python3.9[260564]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:38:27 localhost python3.9[260674]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:38:27 localhost systemd[1]: Reloading. Nov 23 04:38:27 localhost systemd-rc-local-generator[260701]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:38:27 localhost systemd-sysv-generator[260705]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:38:27 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:27 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:27 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:27 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:38:27 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:27 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:27 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:27 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:28 localhost python3.9[260822]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:38:28 localhost python3.9[260879]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:38:29 localhost python3.9[260989]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:38:29 localhost python3.9[261046]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:38:29 localhost openstack_network_exporter[242668]: ERROR 09:38:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:38:29 localhost openstack_network_exporter[242668]: ERROR 09:38:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:38:29 localhost openstack_network_exporter[242668]: ERROR 09:38:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:38:29 localhost openstack_network_exporter[242668]: ERROR 09:38:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:38:29 localhost openstack_network_exporter[242668]: Nov 23 04:38:29 localhost openstack_network_exporter[242668]: ERROR 09:38:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:38:29 localhost openstack_network_exporter[242668]: Nov 23 04:38:30 localhost python3.9[261156]: 
ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:38:30 localhost systemd[1]: Reloading. Nov 23 04:38:30 localhost systemd-rc-local-generator[261183]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:38:30 localhost systemd-sysv-generator[261186]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:38:30 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:30 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:30 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:30 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:38:30 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 04:38:30 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:38:30 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:38:30 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:38:31 localhost systemd[1]: Starting Create netns directory...
Nov 23 04:38:31 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 04:38:31 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 04:38:31 localhost systemd[1]: Finished Create netns directory.
Nov 23 04:38:31 localhost nova_compute[230600]: 2025-11-23 09:38:31.313 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:38:31 localhost nova_compute[230600]: 2025-11-23 09:38:31.315 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:38:31 localhost nova_compute[230600]: 2025-11-23 09:38:31.315 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 23 04:38:31 localhost nova_compute[230600]: 2025-11-23 09:38:31.315 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:38:31 localhost nova_compute[230600]: 2025-11-23 09:38:31.316 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:38:31 localhost nova_compute[230600]: 2025-11-23 09:38:31.319 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:38:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49024 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1859452193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D7B0200000000001030307)
Nov 23 04:38:31 localhost python3.9[261308]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:38:32 localhost python3.9[261418]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:38:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 04:38:33 localhost podman[261473]: 2025-11-23 09:38:33.045963225 +0000 UTC m=+0.093949940 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 23 04:38:33 localhost podman[261473]: 2025-11-23 09:38:33.060294909 +0000 UTC m=+0.108281664 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 23 04:38:33 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 04:38:33 localhost python3.9[261525]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763890712.177706-1074-151086875573007/.source.json _original_basename=.gi1o4vau follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:38:33 localhost python3.9[261635]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:38:36 localhost python3.9[261943]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False
Nov 23 04:38:36 localhost nova_compute[230600]: 2025-11-23 09:38:36.352 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:38:36 localhost nova_compute[230600]: 2025-11-23 09:38:36.353 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:38:36 localhost nova_compute[230600]: 2025-11-23 09:38:36.354 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 23 04:38:36 localhost nova_compute[230600]: 2025-11-23 09:38:36.354 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:38:36 localhost nova_compute[230600]: 2025-11-23 09:38:36.355 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:38:36 localhost nova_compute[230600]: 2025-11-23 09:38:36.357 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:38:37 localhost python3.9[262053]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 04:38:38 localhost python3.9[262163]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 23 04:38:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 04:38:39 localhost podman[262209]: 2025-11-23 09:38:39.032543213 +0000 UTC m=+0.081802915 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Nov 23 04:38:39 localhost podman[262209]: 2025-11-23 09:38:39.070396525 +0000 UTC m=+0.119656287 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 23 04:38:39 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 04:38:41 localhost nova_compute[230600]: 2025-11-23 09:38:41.358 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:38:41 localhost nova_compute[230600]: 2025-11-23 09:38:41.359 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:38:41 localhost nova_compute[230600]: 2025-11-23 09:38:41.359 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 23 04:38:41 localhost nova_compute[230600]: 2025-11-23 09:38:41.359 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:38:41 localhost nova_compute[230600]: 2025-11-23 09:38:41.382 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:38:41 localhost nova_compute[230600]: 2025-11-23 09:38:41.383 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:38:41 localhost podman[240668]: time="2025-11-23T09:38:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 04:38:41 localhost podman[240668]: @ - - [23/Nov/2025:09:38:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146307 "" "Go-http-client/1.1"
Nov 23 04:38:41 localhost podman[240668]: @ - - [23/Nov/2025:09:38:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16777 "" "Go-http-client/1.1"
Nov 23 04:38:43 localhost python3[262324]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 04:38:43 localhost podman[262361]:
Nov 23 04:38:43 localhost podman[262361]: 2025-11-23 09:38:43.409647623 +0000 UTC m=+0.075481178 container create a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=neutron_dhcp, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, tcib_managed=true)
Nov 23 04:38:43 localhost podman[262361]: 2025-11-23 09:38:43.37822843 +0000 UTC m=+0.044062005 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 04:38:43 localhost python3[262324]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 04:38:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 04:38:44 localhost podman[262474]: 2025-11-23 09:38:44.041167291 +0000 UTC m=+0.086094467 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 04:38:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 04:38:44 localhost podman[262474]: 2025-11-23 09:38:44.089303162 +0000 UTC m=+0.134230368 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 23 04:38:44 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 04:38:44 localhost podman[262534]: 2025-11-23 09:38:44.164441529 +0000 UTC m=+0.078216783 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, release=1755695350, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 23 04:38:44 localhost podman[262534]: 2025-11-23 09:38:44.177231786 +0000 UTC m=+0.091007020 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 23 04:38:44 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 04:38:44 localhost python3.9[262533]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 04:38:45 localhost python3.9[262664]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:38:45 localhost python3.9[262719]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 04:38:46 localhost python3.9[262828]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890725.5548944-1338-227240907472375/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:38:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12128 DF PROTO=TCP SPT=47714 DPT=9102 SEQ=1337110571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D7E90B0000000001030307)
Nov 23 04:38:46 localhost nova_compute[230600]: 2025-11-23 09:38:46.382 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:38:46 localhost python3.9[262883]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 04:38:46 localhost systemd[1]: Reloading.
Nov 23 04:38:46 localhost systemd-rc-local-generator[262904]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 04:38:46 localhost systemd-sysv-generator[262908]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 04:38:46 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:38:46 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 04:38:46 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:38:46 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:38:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 04:38:46 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 04:38:46 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:38:46 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:38:46 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:38:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12129 DF PROTO=TCP SPT=47714 DPT=9102 SEQ=1337110571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D7ED200000000001030307)
Nov 23 04:38:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 04:38:47 localhost podman[262974]: 2025-11-23 09:38:47.582100636 +0000 UTC m=+0.086284094 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 04:38:47 localhost podman[262974]: 2025-11-23 09:38:47.615196741 +0000 UTC m=+0.119380159 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 23 04:38:47 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 04:38:47 localhost python3.9[262973]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:38:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49025 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1859452193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D7F0210000000001030307) Nov 23 04:38:48 localhost systemd[1]: Reloading. Nov 23 04:38:48 localhost systemd-rc-local-generator[263018]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:38:48 localhost systemd-sysv-generator[263023]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:38:49 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:49 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:49 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:49 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:38:49 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:49 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:49 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:49 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:38:49 localhost systemd[1]: Starting neutron_dhcp_agent container... Nov 23 04:38:49 localhost systemd[1]: Started libcrun container. Nov 23 04:38:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f689f0d89e86dd0cee7a591d316899d1c1d60a1e8f6d378c0e77ba2c60798109/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Nov 23 04:38:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f689f0d89e86dd0cee7a591d316899d1c1d60a1e8f6d378c0e77ba2c60798109/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:38:49 localhost podman[263032]: 2025-11-23 09:38:49.319439291 +0000 UTC m=+0.107857911 container init a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_id=neutron_dhcp, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:38:49 localhost podman[263032]: 2025-11-23 09:38:49.330440982 +0000 UTC m=+0.118859602 container start a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 23 04:38:49 localhost podman[263032]: neutron_dhcp_agent Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: + sudo -E kolla_set_configs Nov 23 04:38:49 localhost systemd[1]: Started neutron_dhcp_agent container. Nov 23 04:38:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12130 DF PROTO=TCP SPT=47714 DPT=9102 SEQ=1337110571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D7F5200000000001030307) Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Validating config file Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Copying service configuration files Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission 
for /etc/neutron/rootwrap.conf Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Writing out command to execute Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/external Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934 Nov 23 04:38:49 
localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/c8b37597a29e61f341a0e3f5416437aac1a5cd21cb3a407dd674c7a7a1ff41da Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/bcac49fc-c589-475a-91a8-00a0ba9c2b33.pid.haproxy Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/bcac49fc-c589-475a-91a8-00a0ba9c2b33.conf Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: ++ cat /run_command Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: + CMD=/usr/bin/neutron-dhcp-agent Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: + ARGS= Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: + sudo kolla_copy_cacerts Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: + [[ ! -n '' ]] Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: + . kolla_extend_start Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: Running command: '/usr/bin/neutron-dhcp-agent' Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: + umask 0022 Nov 23 04:38:49 localhost neutron_dhcp_agent[263045]: + exec /usr/bin/neutron-dhcp-agent Nov 23 04:38:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21262 DF PROTO=TCP SPT=60898 DPT=9102 SEQ=166416658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D7F8200000000001030307) Nov 23 04:38:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:38:50 localhost python3.9[263168]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:38:50 localhost systemd[1]: Stopping neutron_dhcp_agent container... Nov 23 04:38:50 localhost podman[263170]: 2025-11-23 09:38:50.306737289 +0000 UTC m=+0.102947180 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:38:50 localhost podman[263170]: 2025-11-23 09:38:50.318625937 +0000 UTC m=+0.114835878 container exec_died 
a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:38:50 localhost systemd[1]: libpod-a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8.scope: Deactivated successfully. Nov 23 04:38:50 localhost systemd[1]: libpod-a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8.scope: Consumed 1.009s CPU time. 
Nov 23 04:38:50 localhost podman[263184]: 2025-11-23 09:38:50.352118584 +0000 UTC m=+0.076864652 container died a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 04:38:50 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:38:50 localhost podman[263184]: 2025-11-23 09:38:50.453924737 +0000 UTC m=+0.178670805 container cleanup a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=neutron_dhcp, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:38:50 localhost podman[263184]: neutron_dhcp_agent Nov 23 04:38:50 localhost podman[263237]: error opening file `/run/crun/a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8/status`: No such file or directory Nov 23 04:38:50 localhost podman[263226]: 2025-11-23 
09:38:50.558079642 +0000 UTC m=+0.069925346 container cleanup a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=neutron_dhcp_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3) Nov 23 04:38:50 localhost podman[263226]: neutron_dhcp_agent Nov 23 04:38:50 localhost systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully. Nov 23 04:38:50 localhost systemd[1]: Stopped neutron_dhcp_agent container. Nov 23 04:38:50 localhost systemd[1]: Starting neutron_dhcp_agent container... 
Nov 23 04:38:50 localhost systemd[1]: Started libcrun container. Nov 23 04:38:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f689f0d89e86dd0cee7a591d316899d1c1d60a1e8f6d378c0e77ba2c60798109/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Nov 23 04:38:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f689f0d89e86dd0cee7a591d316899d1c1d60a1e8f6d378c0e77ba2c60798109/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:38:50 localhost podman[263239]: 2025-11-23 09:38:50.697219212 +0000 UTC m=+0.107761648 container init a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=neutron_dhcp, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:38:50 localhost podman[263239]: 2025-11-23 09:38:50.706383205 +0000 UTC m=+0.116925641 container start a819bf5bbc0de8d34b1427aeebf447054d4107ce296e9b308148f4591d1accf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'e951eecc131f635fede2f85083d62ee4ac6e8aeec62c68cdfc2585440b13fd8d'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=neutron_dhcp_agent) Nov 23 04:38:50 localhost 
podman[263239]: neutron_dhcp_agent Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: + sudo -E kolla_set_configs Nov 23 04:38:50 localhost systemd[1]: Started neutron_dhcp_agent container. Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Validating config file Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Copying service configuration files Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Writing out command to execute Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/external Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: 
INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934 Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/c8b37597a29e61f341a0e3f5416437aac1a5cd21cb3a407dd674c7a7a1ff41da Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/bcac49fc-c589-475a-91a8-00a0ba9c2b33.pid.haproxy Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/bcac49fc-c589-475a-91a8-00a0ba9c2b33.conf Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: ++ cat /run_command Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: + CMD=/usr/bin/neutron-dhcp-agent Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: + ARGS= Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: + sudo kolla_copy_cacerts Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: + [[ ! 
-n '' ]] Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: + . kolla_extend_start Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: Running command: '/usr/bin/neutron-dhcp-agent' Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: + umask 0022 Nov 23 04:38:50 localhost neutron_dhcp_agent[263254]: + exec /usr/bin/neutron-dhcp-agent Nov 23 04:38:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:38:51 localhost podman[263286]: 2025-11-23 09:38:51.373436654 +0000 UTC m=+0.080644308 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 23 04:38:51 localhost podman[263286]: 2025-11-23 09:38:51.384008162 +0000 UTC m=+0.091215796 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Nov 23 04:38:51 localhost nova_compute[230600]: 2025-11-23 09:38:51.384 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:38:51 localhost nova_compute[230600]: 2025-11-23 09:38:51.386 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:38:51 localhost nova_compute[230600]: 2025-11-23 09:38:51.386 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:38:51 localhost nova_compute[230600]: 2025-11-23 09:38:51.386 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:38:51 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:38:51 localhost nova_compute[230600]: 2025-11-23 09:38:51.412 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:38:51 localhost nova_compute[230600]: 2025-11-23 09:38:51.412 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:38:51 localhost systemd[1]: session-58.scope: Deactivated successfully. Nov 23 04:38:51 localhost systemd[1]: session-58.scope: Consumed 33.945s CPU time. 
Nov 23 04:38:51 localhost systemd-logind[761]: Session 58 logged out. Waiting for processes to exit. Nov 23 04:38:51 localhost systemd-logind[761]: Removed session 58. Nov 23 04:38:52 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:38:52.020 263258 INFO neutron.common.config [-] Logging enabled!#033[00m Nov 23 04:38:52 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:38:52.020 263258 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43#033[00m Nov 23 04:38:52 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:38:52.407 263258 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Nov 23 04:38:52 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:38:52.843 263258 INFO neutron.agent.dhcp.agent [None req-bae6fc77-6091-481a-9d57-864431681128 - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 23 04:38:52 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:38:52.844 263258 INFO neutron.agent.dhcp.agent [None req-bae6fc77-6091-481a-9d57-864431681128 - - - - - -] Synchronizing state complete#033[00m Nov 23 04:38:52 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:38:52.884 263258 INFO neutron.agent.dhcp.agent [None req-bae6fc77-6091-481a-9d57-864431681128 - - - - - -] DHCP agent started#033[00m Nov 23 04:38:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12131 DF PROTO=TCP SPT=47714 DPT=9102 SEQ=1337110571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D804E10000000001030307) Nov 23 04:38:53 localhost nova_compute[230600]: 2025-11-23 09:38:53.661 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:38:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:38:53.661 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: 
SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:38:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:38:53.663 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 04:38:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:38:53.663 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:38:56 localhost nova_compute[230600]: 2025-11-23 09:38:56.449 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:38:59 localhost openstack_network_exporter[242668]: ERROR 09:38:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:38:59 localhost openstack_network_exporter[242668]: ERROR 09:38:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:38:59 localhost openstack_network_exporter[242668]: ERROR 09:38:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 
23 04:39:00 localhost openstack_network_exporter[242668]: ERROR 09:38:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:39:00 localhost openstack_network_exporter[242668]: Nov 23 04:39:00 localhost openstack_network_exporter[242668]: ERROR 09:39:00 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:39:00 localhost openstack_network_exporter[242668]: Nov 23 04:39:01 localhost nova_compute[230600]: 2025-11-23 09:39:01.486 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:01 localhost nova_compute[230600]: 2025-11-23 09:39:01.488 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:01 localhost nova_compute[230600]: 2025-11-23 09:39:01.488 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:39:01 localhost nova_compute[230600]: 2025-11-23 09:39:01.488 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:01 localhost nova_compute[230600]: 2025-11-23 09:39:01.490 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:39:01 localhost nova_compute[230600]: 2025-11-23 09:39:01.490 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:01 localhost nova_compute[230600]: 2025-11-23 09:39:01.493 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:39:01 
localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12132 DF PROTO=TCP SPT=47714 DPT=9102 SEQ=1337110571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D826200000000001030307) Nov 23 04:39:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:39:04 localhost podman[263305]: 2025-11-23 09:39:04.04701546 +0000 UTC m=+0.103871668 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:39:04 localhost podman[263305]: 2025-11-23 09:39:04.061260722 +0000 UTC m=+0.118116940 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 
9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 23 04:39:04 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:39:06 localhost nova_compute[230600]: 2025-11-23 09:39:06.525 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:06 localhost nova_compute[230600]: 2025-11-23 09:39:06.527 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:06 localhost nova_compute[230600]: 2025-11-23 09:39:06.527 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:39:06 localhost nova_compute[230600]: 2025-11-23 09:39:06.527 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:06 localhost nova_compute[230600]: 2025-11-23 09:39:06.528 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:39:06 localhost nova_compute[230600]: 2025-11-23 09:39:06.529 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:06 localhost nova_compute[230600]: 2025-11-23 09:39:06.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m 
Nov 23 04:39:07 localhost nova_compute[230600]: 2025-11-23 09:39:07.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:39:07 localhost nova_compute[230600]: 2025-11-23 09:39:07.739 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:39:07 localhost nova_compute[230600]: 2025-11-23 09:39:07.739 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:39:07 localhost nova_compute[230600]: 2025-11-23 09:39:07.740 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:39:07 localhost nova_compute[230600]: 2025-11-23 09:39:07.740 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:39:07 localhost nova_compute[230600]: 2025-11-23 09:39:07.740 230604 DEBUG oslo_concurrency.processutils [None 
req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:39:08 localhost nova_compute[230600]: 2025-11-23 09:39:08.167 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:39:08 localhost nova_compute[230600]: 2025-11-23 09:39:08.229 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:39:08 localhost nova_compute[230600]: 2025-11-23 09:39:08.229 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:39:08 localhost nova_compute[230600]: 2025-11-23 09:39:08.451 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:39:08 localhost nova_compute[230600]: 2025-11-23 09:39:08.453 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12131MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:39:08 localhost nova_compute[230600]: 2025-11-23 09:39:08.453 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:39:08 localhost nova_compute[230600]: 2025-11-23 09:39:08.454 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:39:08 localhost nova_compute[230600]: 2025-11-23 09:39:08.545 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:39:08 localhost nova_compute[230600]: 2025-11-23 09:39:08.545 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:39:08 localhost nova_compute[230600]: 2025-11-23 09:39:08.545 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:39:08 localhost nova_compute[230600]: 2025-11-23 09:39:08.586 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:39:09 localhost nova_compute[230600]: 2025-11-23 09:39:09.007 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:39:09 localhost nova_compute[230600]: 2025-11-23 09:39:09.013 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:39:09 localhost nova_compute[230600]: 
2025-11-23 09:39:09.028 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:39:09 localhost nova_compute[230600]: 2025-11-23 09:39:09.030 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:39:09 localhost nova_compute[230600]: 2025-11-23 09:39:09.030 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:39:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:39:09.250 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:39:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:39:09.251 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 
0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:39:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:39:09.252 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:39:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:39:10 localhost systemd[1]: tmp-crun.XaNqJA.mount: Deactivated successfully. Nov 23 04:39:10 localhost podman[263368]: 2025-11-23 09:39:10.026578589 +0000 UTC m=+0.083442325 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:39:10 localhost podman[263368]: 2025-11-23 09:39:10.039341374 +0000 UTC m=+0.096205100 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:39:10 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. 
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.802 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.803 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.817 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.818 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e0878f08-4222-4816-b4bb-d9d20b6659ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.803666', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43c17d04-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10693.981297738, 'message_signature': '4962218054170a022a62943110517082a2f7b853f7c80a93b7cf61e4424540c4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.803666', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43c1929e-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10693.981297738, 'message_signature': 'beb77bb0339223dca0baf979821f49b8ecd4aa9f1426218bace13103bc750937'}]}, 'timestamp': '2025-11-23 09:39:10.818596', '_unique_id': '64253285b2a74dadb1a8568f709155a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:39:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:39:10.820 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:39:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:39:10.820 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.820 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.821 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.821 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.822 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:39:10.823 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44b7c9b4-0261-404c-a1a9-08237a0e1d11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.821547', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43c217be-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10693.981297738, 'message_signature': '5a7eb301175b68256ef30bee7de54e2bf37266f2db4d81662f201e3af481663d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.821547', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43c22998-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10693.981297738, 'message_signature': '6a347af00373fdb936e7a020fbd5ab113ad4bb6a10d8d68811f4b92137563332'}]}, 'timestamp': '2025-11-23 09:39:10.822474', '_unique_id': 'de4a7bf0be17439e93f314477b64aadc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.823 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.824 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.828 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7a0c0eb5-8e35-41e3-b9ba-230c81e1ada0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.824778', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43c31f42-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': 'a62cba49d08af3e1c673e6eeebe18f4af8afaf1254b7e580e7edbbccf7bd170a'}]}, 'timestamp': '2025-11-23 09:39:10.828796', '_unique_id': 'a943ba3ff28e4080919379e8b8ee1ede'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.829 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.830 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.831 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.831 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.831 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df2a77d4-12a9-416d-9abf-02f02436d7ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.831235', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43c39292-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10693.981297738, 'message_signature': 'a7a5cf900839f6c62cf8c9d6205921c2a7d22b26ece3afe929415e495a6bb01a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.831235', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43c3a6f6-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10693.981297738, 'message_signature': 'c9dd2ebc7da0e4b18a6d301910ee9eefe4aa0dd45d762ebdda5853bcf2c91a79'}]}, 'timestamp': '2025-11-23 09:39:10.832260', '_unique_id': '79cd07a1a12b4ff2a19ec30bfabf987f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.833 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.834 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.834 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3848c7b-5e0d-4f50-a6d2-a285949e06e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.834608', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43c41668-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': '5ade14051555de8f046039e6964670c4da7871decfca98c03014e24c4a3ec73b'}]}, 'timestamp': '2025-11-23 09:39:10.835146', '_unique_id': 'e4d5f01a08374dffb386e64699eef793'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.836 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.837 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.872 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f50a8d52-c7ff-42bf-b829-6d3d4e8032f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.837383', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43c9ed36-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'cc49b6461dd594ca63eb64244a5f3c389654e0915378482ba838282e41f23006'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.837383', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43c9feb6-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': '89ef3c60056aa6d83c39b1c95a2d80ac1df5cbda5e676834aa7c433c18116742'}]}, 'timestamp': '2025-11-23 09:39:10.873775', '_unique_id': 'a1e9eaaf356d4b95bd1dc2dc72130b6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]:
2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.874 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.876 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e1ceecf-8180-4b83-9c32-9a1910f9754b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.876154', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43ca6ce8-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': '36600c616d87bc8f9baa24efa79d5ef684355d8f3e1759aeefd8d57be5dc8563'}]}, 'timestamp': '2025-11-23 09:39:10.876623', '_unique_id': 'd9c9c26ece3d4438bb9036057ee17c84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:39:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:39:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:39:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.877 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.878 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.878 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6e0d3a44-dcdd-43be-ad8c-a245153fb8ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.878754', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43cad318-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'c493fa6a6332b55a95945b77c6a1c8635b687485499ba974a55e3b41e1b4f18d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.878754', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43cae330-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'a02546345b2c3e65dd8592c55eba3a6159a1e826b082139eb36fbff3451082cc'}]}, 'timestamp': '2025-11-23 09:39:10.879617', '_unique_id': '96062e6e82a34797b56eaedae9574a26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:39:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.880 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.881 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.881 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d70cb48-1605-4123-827c-be36a75b3707', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.881783', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43cb49ba-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': 'd317eea9db6a0ad2663087a992bfc8faf3c945e227dd186b7f80be398bf3f1b7'}]}, 'timestamp': '2025-11-23 09:39:10.882271', '_unique_id': 'de235eb9297646809b1a00a2b6873b68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.883 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.884 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.884 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 9228 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04f126ba-a43a-4f25-8bda-edfd74973a53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9228, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.884432', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43cbafd6-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': 'dc9db59f33dc5a54907f6ece44f2890accbbb3fd0dcdac97abb5d1b11518f355'}]}, 'timestamp': '2025-11-23 09:39:10.884887', '_unique_id': '4d8ffd54efc745c187c27ea00aac1fed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.885 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.887 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.887 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26aedab3-7212-49d5-a9a3-3337cf36e89c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.887144', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43cc1a0c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': 'b8fba024a510342b770d8f97eda3a87a07f7b6672b5ef99601b379116419d83b'}]}, 'timestamp': '2025-11-23 09:39:10.887607', '_unique_id': 'cb185879020943ea82d8d17edf2f65e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:39:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.888 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.889 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.889 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd6a0b5f-6deb-4409-93f5-b62fcdd40a85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.889954', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43cc87f8-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': '94d7c571098ef1c9620017a72e2cfd3419aa99fb78ad2750a0b74bfe40659fbd'}]}, 'timestamp': '2025-11-23 09:39:10.890418', '_unique_id': 'ccdecf10cd164a09b5987ffee4b06872'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:39:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.891 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.892 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.892 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1347736452 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.893 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 205057051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9f15e928-1c31-4a4b-b18d-29c18e50feba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1347736452, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.892682', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43ccf2f6-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'b772589ab329dbdedbe46eddb49221cc322e3e392f6f4e304f67a64bf2bb2d36'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205057051, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.892682', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43cd0340-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': '9056fcb66f5d3b63bb9f46dd0995de456392bf0563ffb6d9f9b8769432560f1b'}]}, 'timestamp': '2025-11-23 09:39:10.893545', '_unique_id': '7ff9512b81ff45d08af09d4f104b141d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:39:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.894 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.895 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 54830000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50b4e12c-df95-4d91-bbb6-693fdecb636c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54830000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:39:10.895671', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '43d072a0-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.09307574, 'message_signature': '6dadfd81686c8c62c2a3ff1befe2434bb6b1459a8039604a2b8974cc5e8ee926'}]}, 'timestamp': '2025-11-23 09:39:10.916179', '_unique_id': 'd7ae50739e9b450eb80b49408ff56bf7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.917 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1db215b-eedf-4eeb-94fb-adca9d0407f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.918369', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43d0dd8a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': 'f204cf10c2fc131203f6a77e663037374c1609cd3a7255031a40616b5f6a3dec'}]}, 'timestamp': '2025-11-23 09:39:10.918822', '_unique_id': '30b5b067247a462a8e00e51cb23e610b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.919 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ae677ab-50c9-4a8f-aae2-841c24afc1c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:39:10.921122', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '43d14c84-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.09307574, 'message_signature': 'bbab5a85930dc69ba38606c852ce3d9364f5f2ea78f6d24ea3d672aeebe56b14'}]}, 'timestamp': '2025-11-23 09:39:10.921690', '_unique_id': '28aa91434eb4440a91b29040f433eeb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:39:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 04:39:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 165450591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 35057587 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1adf894-1b5a-4267-a7a3-d368c6170681', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165450591, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.924071', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43d1bf98-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'a5b3f1f476529e45cd836d8a07ba54fb2f4a6a2898fa3628da6bb4d6233f892d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35057587, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.924071', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43d1d582-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': '16798d94994d709bd80ce2cb222070dfc215dfdb286d127c164860cf5129de6e'}]}, 'timestamp': '2025-11-23 09:39:10.925164', '_unique_id': '288100af3478401eaf827c4f1c2161cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:39:10.926 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.927 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14e0beb7-c443-4f99-8d0c-f186ab448fbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.927599', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 
'tapd3912d14-a3'}, 'message_id': '43d24c88-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': 'dff636da534bfd06993639e6a348b83f61eacf0c25bdda03189af805804c60d9'}]}, 'timestamp': '2025-11-23 09:39:10.928325', '_unique_id': '1dd77085c3fe4459b3d8d1b71412d8e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.929 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.930 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.930 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8b8765c-7437-4a30-a4ee-21550d83526a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.931119', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43d2d2ca-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': '68a39d5b98271526dbaaad47a511a4de496c028b5c3e70da19076d697785c3a0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.931119', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43d2ea62-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'a6de9e5bfcabdfb5976d3fce9dbf4a64cb181cfffff8a529926a26a26c7d0a78'}]}, 'timestamp': '2025-11-23 09:39:10.932245', '_unique_id': '0805cc5e36b7403382226af8150f6426'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.933 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.934 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a0ad76c-b9c7-4e13-9a62-302295db2db2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 446, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:39:10.934535', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '43d35880-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.002412282, 'message_signature': 'c99ae2a34b6194d52a1d5c67793e41b90e9a710d13dd85a6b3a83a27772f6d2b'}]}, 'timestamp': '2025-11-23 09:39:10.935157', '_unique_id': '55e577e029d343699be8a1e030358bc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.936 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.937 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.937 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.938 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07bb8dcd-d345-482c-bef4-ec63977ad005', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:39:10.937483', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43d3cb62-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'fb139db06d6eb05f9a6571c2afbf7a6c8d6d6c272bb21c04a3749b7227de3967'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:39:10.937483', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43d3e03e-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10694.015016132, 'message_signature': 'd54e0dac08c50ea554494ba5aea9891ab83def2a88301078b0bee464a0e2012f'}]}, 'timestamp': '2025-11-23 09:39:10.938547', '_unique_id': 'd1d52ea509d345aab6dfdda56d7730b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:39:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:39:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:39:10.939 12 ERROR oslo_messaging.notify.messaging Nov 23 04:39:11 localhost nova_compute[230600]: 2025-11-23 09:39:11.031 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:39:11 localhost nova_compute[230600]: 2025-11-23 09:39:11.032 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:39:11 localhost nova_compute[230600]: 2025-11-23 09:39:11.032 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:39:11 localhost nova_compute[230600]: 2025-11-23 09:39:11.032 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:39:11 localhost nova_compute[230600]: 2025-11-23 09:39:11.529 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:11 localhost nova_compute[230600]: 2025-11-23 09:39:11.532 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:11 localhost nova_compute[230600]: 2025-11-23 09:39:11.532 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:39:11 localhost nova_compute[230600]: 2025-11-23 09:39:11.533 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:11 localhost nova_compute[230600]: 2025-11-23 09:39:11.565 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:39:11 localhost nova_compute[230600]: 2025-11-23 09:39:11.566 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:11 localhost nova_compute[230600]: 2025-11-23 09:39:11.709 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:39:11 localhost nova_compute[230600]: 2025-11-23 09:39:11.710 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:39:11 localhost nova_compute[230600]: 2025-11-23 09:39:11.710 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:39:11 localhost nova_compute[230600]: 2025-11-23 09:39:11.711 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:39:11 localhost podman[240668]: time="2025-11-23T09:39:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:39:11 localhost podman[240668]: @ - - [23/Nov/2025:09:39:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 23 04:39:11 localhost podman[240668]: @ - - [23/Nov/2025:09:39:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17223 "" "Go-http-client/1.1" Nov 23 04:39:12 localhost nova_compute[230600]: 2025-11-23 09:39:12.788 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] 
update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:39:12 localhost nova_compute[230600]: 2025-11-23 09:39:12.809 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:39:12 localhost nova_compute[230600]: 2025-11-23 09:39:12.809 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:39:12 localhost nova_compute[230600]: 2025-11-23 09:39:12.810 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:39:12 localhost nova_compute[230600]: 2025-11-23 09:39:12.810 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:39:12 localhost sshd[263400]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:39:13 localhost nova_compute[230600]: 2025-11-23 09:39:13.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:39:13 localhost nova_compute[230600]: 2025-11-23 09:39:13.718 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - 
- - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:39:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:39:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:39:14 localhost podman[263479]: 2025-11-23 09:39:14.478478446 +0000 UTC m=+0.085592262 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 04:39:14 localhost podman[263480]: 2025-11-23 
09:39:14.523271643 +0000 UTC m=+0.128485770 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, config_id=edpm, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 23 04:39:14 localhost podman[263480]: 2025-11-23 09:39:14.540243939 +0000 UTC m=+0.145458136 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., config_id=edpm, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 23 04:39:14 localhost podman[263479]: 2025-11-23 09:39:14.546328907 +0000 UTC m=+0.153442713 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 23 04:39:14 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:39:14 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:39:14 localhost nova_compute[230600]: 2025-11-23 09:39:14.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:39:14 localhost nova_compute[230600]: 2025-11-23 09:39:14.717 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:39:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23316 DF PROTO=TCP SPT=49426 DPT=9102 SEQ=2961114951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D85E3B0000000001030307) Nov 23 04:39:16 localhost nova_compute[230600]: 2025-11-23 09:39:16.566 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:16 localhost nova_compute[230600]: 2025-11-23 09:39:16.568 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:16 localhost nova_compute[230600]: 2025-11-23 09:39:16.569 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:39:16 localhost nova_compute[230600]: 2025-11-23 09:39:16.569 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:16 localhost nova_compute[230600]: 2025-11-23 09:39:16.604 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:39:16 localhost nova_compute[230600]: 2025-11-23 09:39:16.605 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23317 DF PROTO=TCP SPT=49426 DPT=9102 SEQ=2961114951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D862600000000001030307) Nov 23 04:39:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:39:18 localhost podman[263522]: 2025-11-23 09:39:18.027052677 +0000 UTC m=+0.079651279 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 04:39:18 localhost podman[263522]: 2025-11-23 09:39:18.035713205 +0000 UTC m=+0.088311797 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:39:18 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:39:18 localhost sshd[263540]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:39:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12133 DF PROTO=TCP SPT=47714 DPT=9102 SEQ=1337110571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D866200000000001030307) Nov 23 04:39:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23318 DF PROTO=TCP SPT=49426 DPT=9102 SEQ=2961114951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D86A610000000001030307) Nov 23 04:39:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49026 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1859452193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D86E200000000001030307) Nov 23 04:39:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:39:21 localhost podman[263542]: 2025-11-23 09:39:21.006187331 +0000 UTC m=+0.058160563 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:39:21 localhost podman[263542]: 2025-11-23 09:39:21.036670545 +0000 UTC m=+0.088643777 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:39:21 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:39:21 localhost nova_compute[230600]: 2025-11-23 09:39:21.605 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:21 localhost nova_compute[230600]: 2025-11-23 09:39:21.607 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:21 localhost nova_compute[230600]: 2025-11-23 09:39:21.608 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:39:21 localhost nova_compute[230600]: 2025-11-23 09:39:21.608 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:21 localhost nova_compute[230600]: 2025-11-23 09:39:21.642 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:39:21 localhost nova_compute[230600]: 2025-11-23 09:39:21.642 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:21 localhost nova_compute[230600]: 2025-11-23 09:39:21.644 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:39:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. 
Nov 23 04:39:22 localhost podman[263563]: 2025-11-23 09:39:22.026513111 +0000 UTC m=+0.084898121 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3) Nov 23 04:39:22 localhost podman[263563]: 2025-11-23 09:39:22.042885988 +0000 UTC m=+0.101270998 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 23 04:39:22 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:39:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23319 DF PROTO=TCP SPT=49426 DPT=9102 SEQ=2961114951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D87A210000000001030307) Nov 23 04:39:23 localhost ovn_controller[154788]: 2025-11-23T09:39:23Z|00048|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory Nov 23 04:39:26 localhost nova_compute[230600]: 2025-11-23 09:39:26.668 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:26 localhost nova_compute[230600]: 2025-11-23 09:39:26.670 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:39:26 localhost nova_compute[230600]: 2025-11-23 09:39:26.670 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5025 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:39:26 localhost nova_compute[230600]: 2025-11-23 09:39:26.671 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:26 localhost nova_compute[230600]: 2025-11-23 09:39:26.671 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:26 localhost nova_compute[230600]: 2025-11-23 09:39:26.674 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:39:29 localhost openstack_network_exporter[242668]: ERROR 09:39:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for 
the ovs db server Nov 23 04:39:29 localhost openstack_network_exporter[242668]: ERROR 09:39:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:39:29 localhost openstack_network_exporter[242668]: ERROR 09:39:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:39:29 localhost openstack_network_exporter[242668]: ERROR 09:39:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:39:29 localhost openstack_network_exporter[242668]: Nov 23 04:39:29 localhost openstack_network_exporter[242668]: ERROR 09:39:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:39:29 localhost openstack_network_exporter[242668]: Nov 23 04:39:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23320 DF PROTO=TCP SPT=49426 DPT=9102 SEQ=2961114951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D89A210000000001030307) Nov 23 04:39:31 localhost nova_compute[230600]: 2025-11-23 09:39:31.675 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:31 localhost nova_compute[230600]: 2025-11-23 09:39:31.678 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:31 localhost nova_compute[230600]: 2025-11-23 09:39:31.678 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:39:31 localhost nova_compute[230600]: 2025-11-23 09:39:31.679 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:31 localhost nova_compute[230600]: 2025-11-23 09:39:31.712 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:39:31 localhost nova_compute[230600]: 2025-11-23 09:39:31.713 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:39:35 localhost podman[263581]: 2025-11-23 09:39:35.014708332 +0000 UTC m=+0.074726256 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Nov 23 04:39:35 localhost podman[263581]: 2025-11-23 09:39:35.025475846 +0000 UTC m=+0.085493770 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 04:39:35 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:39:36 localhost nova_compute[230600]: 2025-11-23 09:39:36.715 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:36 localhost nova_compute[230600]: 2025-11-23 09:39:36.718 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:36 localhost nova_compute[230600]: 2025-11-23 09:39:36.718 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:39:36 localhost nova_compute[230600]: 2025-11-23 09:39:36.719 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:36 localhost nova_compute[230600]: 2025-11-23 09:39:36.755 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:39:36 localhost nova_compute[230600]: 2025-11-23 09:39:36.757 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:39:41 localhost podman[263600]: 2025-11-23 09:39:41.030929345 +0000 UTC m=+0.079763011 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:39:41 localhost podman[263600]: 2025-11-23 09:39:41.042731991 +0000 UTC m=+0.091565657 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': 
'/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:39:41 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:39:41 localhost nova_compute[230600]: 2025-11-23 09:39:41.758 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:41 localhost nova_compute[230600]: 2025-11-23 09:39:41.760 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:41 localhost nova_compute[230600]: 2025-11-23 09:39:41.760 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:39:41 localhost nova_compute[230600]: 2025-11-23 09:39:41.761 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:41 localhost nova_compute[230600]: 2025-11-23 09:39:41.790 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:39:41 localhost nova_compute[230600]: 2025-11-23 09:39:41.791 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:41 localhost podman[240668]: time="2025-11-23T09:39:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:39:41 localhost podman[240668]: @ - - [23/Nov/2025:09:39:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false 
HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 23 04:39:41 localhost podman[240668]: @ - - [23/Nov/2025:09:39:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17226 "" "Go-http-client/1.1" Nov 23 04:39:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:39:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:39:45 localhost podman[263624]: 2025-11-23 09:39:45.028961747 +0000 UTC m=+0.083850188 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, 
tcib_managed=true) Nov 23 04:39:45 localhost podman[263625]: 2025-11-23 09:39:45.077964824 +0000 UTC m=+0.130102719 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image 
that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9) Nov 23 04:39:45 localhost podman[263624]: 2025-11-23 09:39:45.091763142 +0000 UTC m=+0.146651583 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 04:39:45 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:39:45 localhost podman[263625]: 2025-11-23 09:39:45.114780604 +0000 UTC m=+0.166918469 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, distribution-scope=public, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_id=edpm, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 23 04:39:45 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
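The podman healthcheck entries above all carry the same `name=…, health_status=…` key/value metadata. A minimal sketch (not part of the log) for pulling those pairs out of lines like these; the key names `name=` and `health_status=` are taken directly from the entries themselves:

```python
import re

# Extract (container_name, health_status) pairs from podman journal entries
# shaped like the health_status lines above. The regex relies on the
# "(image=..., name=..., health_status=..." ordering visible in this log.
HEALTH_RE = re.compile(
    r"container health_status \S+ \(image=\S+, name=(\w+), health_status=(\w+)"
)

def health_events(lines):
    """Return (container_name, health_status) for each health_status entry."""
    return [m.groups() for line in lines if (m := HEALTH_RE.search(line))]

# Sample built from the multipathd entry above (label metadata shortened).
sample = (
    "Nov 23 04:39:22 localhost podman[263563]: 2025-11-23 09:39:22.026513111 "
    "+0000 UTC m=+0.084898121 container health_status "
    "072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 "
    "(image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, "
    "name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS)"
)
print(health_events([sample]))  # [('multipathd', 'healthy')]
```

Fed the whole journal, this surfaces any container whose status flips away from `healthy` without scrolling through the repeated `config_data` blobs.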
Nov 23 04:39:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6694 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3734280524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D8D36B0000000001030307) Nov 23 04:39:46 localhost nova_compute[230600]: 2025-11-23 09:39:46.792 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:46 localhost nova_compute[230600]: 2025-11-23 09:39:46.794 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:46 localhost nova_compute[230600]: 2025-11-23 09:39:46.794 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:39:46 localhost nova_compute[230600]: 2025-11-23 09:39:46.794 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:46 localhost nova_compute[230600]: 2025-11-23 09:39:46.821 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:39:46 localhost nova_compute[230600]: 2025-11-23 09:39:46.822 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6695 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3734280524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2D8D7610000000001030307) Nov 23 04:39:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23321 DF PROTO=TCP SPT=49426 DPT=9102 SEQ=2961114951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D8DA200000000001030307) Nov 23 04:39:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:39:49 localhost podman[263669]: 2025-11-23 09:39:49.009798026 +0000 UTC m=+0.072580048 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:39:49 localhost podman[263669]: 2025-11-23 09:39:49.020337282 +0000 UTC m=+0.083119304 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118) Nov 23 04:39:49 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:39:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6696 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3734280524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D8DF600000000001030307) Nov 23 04:39:49 localhost sshd[263687]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:39:49 localhost systemd-logind[761]: New session 59 of user zuul. Nov 23 04:39:49 localhost systemd[1]: Started Session 59 of User zuul. 
Nov 23 04:39:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12134 DF PROTO=TCP SPT=47714 DPT=9102 SEQ=1337110571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D8E4200000000001030307) Nov 23 04:39:51 localhost python3.9[263798]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:39:51 localhost nova_compute[230600]: 2025-11-23 09:39:51.823 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:39:52 localhost podman[263820]: 2025-11-23 09:39:52.028256034 +0000 UTC m=+0.085108594 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', 
'--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 04:39:52 localhost podman[263820]: 2025-11-23 09:39:52.063314672 +0000 UTC m=+0.120167262 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:39:52 localhost systemd[1]: 
a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 04:39:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:39:52 localhost systemd[1]: tmp-crun.fT4Klo.mount: Deactivated successfully. Nov 23 04:39:52 localhost podman[263844]: 2025-11-23 09:39:52.188534499 +0000 UTC m=+0.085347791 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 23 04:39:52 localhost podman[263844]: 2025-11-23 09:39:52.204284793 +0000 UTC m=+0.101098085 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
config_id=multipathd) Nov 23 04:39:52 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:39:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6697 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3734280524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D8EF210000000001030307) Nov 23 04:39:53 localhost python3.9[263953]: ansible-ansible.builtin.service_facts Invoked Nov 23 04:39:53 localhost network[263970]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 23 04:39:53 localhost network[263971]: 'network-scripts' will be removed from distribution in near future. Nov 23 04:39:53 localhost network[263972]: It is advised to switch to 'NetworkManager' instead for network management. Nov 23 04:39:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:39:56 localhost nova_compute[230600]: 2025-11-23 09:39:56.826 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:56 localhost nova_compute[230600]: 2025-11-23 09:39:56.828 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:39:56 localhost nova_compute[230600]: 2025-11-23 09:39:56.828 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:39:56 localhost nova_compute[230600]: 2025-11-23 09:39:56.829 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:56 localhost nova_compute[230600]: 2025-11-23 09:39:56.874 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:39:56 localhost nova_compute[230600]: 2025-11-23 09:39:56.875 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:39:59 localhost openstack_network_exporter[242668]: ERROR 09:39:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:39:59 localhost openstack_network_exporter[242668]: ERROR 09:39:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:39:59 localhost openstack_network_exporter[242668]: ERROR 09:39:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:39:59 localhost openstack_network_exporter[242668]: ERROR 09:39:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify 
an existing datapath Nov 23 04:39:59 localhost openstack_network_exporter[242668]: Nov 23 04:39:59 localhost openstack_network_exporter[242668]: ERROR 09:39:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:39:59 localhost openstack_network_exporter[242668]: Nov 23 04:40:00 localhost python3.9[264206]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 23 04:40:01 localhost python3.9[264269]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 23 04:40:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6698 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3734280524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D910210000000001030307) Nov 23 04:40:01 localhost nova_compute[230600]: 2025-11-23 09:40:01.876 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:40:01 localhost nova_compute[230600]: 2025-11-23 09:40:01.879 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:40:01 localhost nova_compute[230600]: 2025-11-23 09:40:01.879 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 
5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:40:01 localhost nova_compute[230600]: 2025-11-23 09:40:01.879 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:40:01 localhost nova_compute[230600]: 2025-11-23 09:40:01.909 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:40:01 localhost nova_compute[230600]: 2025-11-23 09:40:01.910 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:40:02 localhost nova_compute[230600]: 2025-11-23 09:40:02.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:40:02 localhost nova_compute[230600]: 2025-11-23 09:40:02.717 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 23 04:40:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 04:40:06 localhost podman[264327]: 2025-11-23 09:40:06.030977991 +0000 UTC m=+0.088551122 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 23 04:40:06 localhost podman[264327]: 2025-11-23 09:40:06.039354033 +0000 UTC m=+0.096927124 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 04:40:06 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 04:40:06 localhost python3.9[264401]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:40:06 localhost nova_compute[230600]: 2025-11-23 09:40:06.735 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:40:06 localhost nova_compute[230600]: 2025-11-23 09:40:06.736 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:40:06 localhost nova_compute[230600]: 2025-11-23 09:40:06.736 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 23 04:40:06 localhost nova_compute[230600]: 2025-11-23 09:40:06.749 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 23 04:40:06 localhost nova_compute[230600]: 2025-11-23 09:40:06.910 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:40:06 localhost nova_compute[230600]: 2025-11-23 09:40:06.912 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:40:06 localhost nova_compute[230600]: 2025-11-23 09:40:06.913 230604 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:40:06 localhost nova_compute[230600]: 2025-11-23 09:40:06.913 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:40:06 localhost nova_compute[230600]: 2025-11-23 09:40:06.943 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:40:06 localhost nova_compute[230600]: 2025-11-23 09:40:06.944 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:40:07 localhost python3.9[264511]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:40:09 localhost python3.9[264622]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:40:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:40:09.251 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:40:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:40:09.252 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:40:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:40:09.253 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:40:09 localhost nova_compute[230600]: 2025-11-23 09:40:09.730 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:40:09 localhost nova_compute[230600]: 2025-11-23 09:40:09.730 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:40:09 localhost nova_compute[230600]: 2025-11-23 09:40:09.768 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:40:09 localhost nova_compute[230600]: 2025-11-23 09:40:09.768 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:40:09 localhost nova_compute[230600]: 2025-11-23 09:40:09.769 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:40:09 localhost nova_compute[230600]: 2025-11-23 09:40:09.769 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:40:09 localhost nova_compute[230600]: 2025-11-23 09:40:09.770 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:40:10 localhost python3.9[264754]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:40:10 localhost nova_compute[230600]: 2025-11-23 09:40:10.185 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:40:10 localhost nova_compute[230600]: 2025-11-23 09:40:10.298 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for 
instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:40:10 localhost nova_compute[230600]: 2025-11-23 09:40:10.299 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:40:10 localhost nova_compute[230600]: 2025-11-23 09:40:10.530 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:40:10 localhost nova_compute[230600]: 2025-11-23 09:40:10.532 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12150MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", 
"address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:40:10 localhost nova_compute[230600]: 2025-11-23 09:40:10.533 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:40:10 localhost nova_compute[230600]: 2025-11-23 09:40:10.533 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:40:10 localhost nova_compute[230600]: 2025-11-23 09:40:10.774 230604 DEBUG nova.compute.resource_tracker [None 
req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:40:10 localhost nova_compute[230600]: 2025-11-23 09:40:10.775 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:40:10 localhost nova_compute[230600]: 2025-11-23 09:40:10.775 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:40:10 localhost nova_compute[230600]: 2025-11-23 09:40:10.840 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:40:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 04:40:11 localhost nova_compute[230600]: 2025-11-23 09:40:11.306 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:40:11 localhost nova_compute[230600]: 2025-11-23 09:40:11.314 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:40:11 localhost podman[264886]: 2025-11-23 09:40:11.325362967 +0000 UTC m=+0.086062983 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:40:11 localhost podman[264886]: 2025-11-23 09:40:11.333137941 +0000 UTC m=+0.093838017 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:40:11 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. 
Nov 23 04:40:11 localhost nova_compute[230600]: 2025-11-23 09:40:11.371 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:40:11 localhost nova_compute[230600]: 2025-11-23 09:40:11.373 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:40:11 localhost nova_compute[230600]: 2025-11-23 09:40:11.373 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.840s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:40:11 localhost python3.9[264887]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:40:11 localhost podman[240668]: time="2025-11-23T09:40:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:40:11 localhost podman[240668]: @ - - [23/Nov/2025:09:40:11 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 23 04:40:11 localhost podman[240668]: @ - - [23/Nov/2025:09:40:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17227 "" "Go-http-client/1.1" Nov 23 04:40:11 localhost nova_compute[230600]: 2025-11-23 09:40:11.944 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:40:11 localhost nova_compute[230600]: 2025-11-23 09:40:11.948 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:40:11 localhost nova_compute[230600]: 2025-11-23 09:40:11.948 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:40:11 localhost nova_compute[230600]: 2025-11-23 09:40:11.949 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:40:11 localhost nova_compute[230600]: 2025-11-23 09:40:11.972 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:40:11 localhost nova_compute[230600]: 2025-11-23 09:40:11.973 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:40:12 localhost nova_compute[230600]: 2025-11-23 09:40:12.355 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m 
Nov 23 04:40:12 localhost nova_compute[230600]: 2025-11-23 09:40:12.356 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:40:12 localhost nova_compute[230600]: 2025-11-23 09:40:12.356 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:40:12 localhost nova_compute[230600]: 2025-11-23 09:40:12.356 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:40:12 localhost python3.9[265021]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:40:12 localhost nova_compute[230600]: 2025-11-23 09:40:12.747 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:40:12 localhost nova_compute[230600]: 2025-11-23 09:40:12.747 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:40:12 localhost nova_compute[230600]: 2025-11-23 09:40:12.747 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 
355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:40:12 localhost nova_compute[230600]: 2025-11-23 09:40:12.748 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:40:13 localhost nova_compute[230600]: 2025-11-23 09:40:13.840 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:40:13 
localhost nova_compute[230600]: 2025-11-23 09:40:13.862 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:40:13 localhost nova_compute[230600]: 2025-11-23 09:40:13.862 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:40:13 localhost nova_compute[230600]: 2025-11-23 09:40:13.863 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:40:13 localhost nova_compute[230600]: 2025-11-23 09:40:13.864 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:40:13 localhost python3.9[265133]: ansible-ansible.builtin.service_facts Invoked Nov 23 04:40:14 localhost network[265150]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 23 04:40:14 localhost network[265151]: 'network-scripts' will be removed from distribution in near future. Nov 23 04:40:14 localhost network[265152]: It is advised to switch to 'NetworkManager' instead for network management. 
Nov 23 04:40:14 localhost nova_compute[230600]: 2025-11-23 09:40:14.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:40:14 localhost nova_compute[230600]: 2025-11-23 09:40:14.737 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:40:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:40:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:40:15 localhost podman[265242]: 2025-11-23 09:40:15.212851681 +0000 UTC m=+0.062557978 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 23 04:40:15 localhost podman[265232]: 2025-11-23 09:40:15.279387784 +0000 UTC m=+0.142327585 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, 
health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:40:15 localhost podman[265242]: 2025-11-23 09:40:15.306809492 +0000 UTC m=+0.156515859 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, architecture=x86_64, config_id=edpm, name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 23 04:40:15 localhost systemd[1]: 
ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:40:15 localhost podman[265232]: 2025-11-23 09:40:15.332395122 +0000 UTC m=+0.195334933 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 04:40:15 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:40:15 localhost nova_compute[230600]: 2025-11-23 09:40:15.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:40:15 localhost nova_compute[230600]: 2025-11-23 09:40:15.717 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:40:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40350 DF PROTO=TCP SPT=35852 DPT=9102 SEQ=3069657994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9489D0000000001030307) Nov 23 04:40:16 localhost nova_compute[230600]: 2025-11-23 09:40:16.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:40:16 localhost nova_compute[230600]: 2025-11-23 09:40:16.974 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:40:16 localhost nova_compute[230600]: 2025-11-23 09:40:16.976 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:40:16 localhost nova_compute[230600]: 2025-11-23 09:40:16.976 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:40:16 localhost 
nova_compute[230600]: 2025-11-23 09:40:16.976 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:40:17 localhost nova_compute[230600]: 2025-11-23 09:40:17.014 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:40:17 localhost nova_compute[230600]: 2025-11-23 09:40:17.014 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:40:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40351 DF PROTO=TCP SPT=35852 DPT=9102 SEQ=3069657994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D94CA00000000001030307) Nov 23 04:40:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:40:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6699 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3734280524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D950200000000001030307) Nov 23 04:40:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:40:19 localhost podman[265397]: 2025-11-23 09:40:19.140295565 +0000 UTC m=+0.075603207 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 23 04:40:19 localhost podman[265397]: 2025-11-23 09:40:19.149211113 +0000 UTC 
m=+0.084518755 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 23 04:40:19 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:40:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40352 DF PROTO=TCP SPT=35852 DPT=9102 SEQ=3069657994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D954A00000000001030307) Nov 23 04:40:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23322 DF PROTO=TCP SPT=49426 DPT=9102 SEQ=2961114951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D958200000000001030307) Nov 23 04:40:20 localhost python3.9[265533]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Nov 23 04:40:21 localhost python3.9[265643]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Nov 23 04:40:22 localhost nova_compute[230600]: 2025-11-23 09:40:22.015 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:40:22 localhost nova_compute[230600]: 2025-11-23 09:40:22.017 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:40:22 localhost nova_compute[230600]: 2025-11-23 09:40:22.017 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:40:22 localhost nova_compute[230600]: 2025-11-23 
09:40:22.017 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:40:22 localhost nova_compute[230600]: 2025-11-23 09:40:22.042 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:40:22 localhost nova_compute[230600]: 2025-11-23 09:40:22.042 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:40:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:40:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:40:23 localhost systemd[1]: tmp-crun.oaA4JS.mount: Deactivated successfully. Nov 23 04:40:23 localhost podman[265753]: 2025-11-23 09:40:23.032957249 +0000 UTC m=+0.087137658 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 04:40:23 localhost podman[265754]: 2025-11-23 09:40:23.018379522 +0000 UTC m=+0.071665063 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', 
'--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 04:40:23 localhost podman[265754]: 2025-11-23 09:40:23.101424621 +0000 UTC m=+0.154710212 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 04:40:23 localhost systemd[1]: 
a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 04:40:23 localhost podman[265753]: 2025-11-23 09:40:23.121623853 +0000 UTC m=+0.175804322 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd) Nov 23 04:40:23 localhost systemd[1]: 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:40:23 localhost python3.9[265755]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:40:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40353 DF PROTO=TCP SPT=35852 DPT=9102 SEQ=3069657994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D964600000000001030307) Nov 23 04:40:23 localhost python3.9[265848]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:40:24 localhost python3.9[265958]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:40:25 localhost python3.9[266068]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:40:25 localhost python3.9[266178]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:40:26 localhost python3.9[266290]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:40:27 localhost nova_compute[230600]: 2025-11-23 09:40:27.043 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:40:27 localhost nova_compute[230600]: 2025-11-23 09:40:27.045 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:40:27 localhost nova_compute[230600]: 2025-11-23 09:40:27.045 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:40:27 localhost nova_compute[230600]: 2025-11-23 09:40:27.046 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:40:27 localhost nova_compute[230600]: 2025-11-23 09:40:27.086 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:40:27 localhost nova_compute[230600]: 2025-11-23 09:40:27.087 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:40:27 localhost python3.9[266402]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q 
'^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:40:28 localhost python3.9[266513]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:40:29 localhost python3.9[266623]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:40:29 localhost openstack_network_exporter[242668]: ERROR 09:40:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:40:29 localhost openstack_network_exporter[242668]: ERROR 09:40:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:40:29 localhost openstack_network_exporter[242668]: ERROR 09:40:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:40:29 localhost openstack_network_exporter[242668]: ERROR 09:40:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:40:29 localhost openstack_network_exporter[242668]: Nov 23 04:40:29 localhost openstack_network_exporter[242668]: ERROR 09:40:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:40:29 localhost openstack_network_exporter[242668]: Nov 23 
04:40:29 localhost python3.9[266733]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:40:30 localhost python3.9[266843]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:40:31 localhost python3.9[266953]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:40:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40354 DF PROTO=TCP SPT=35852 DPT=9102 SEQ=3069657994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D984200000000001030307) Nov 23 04:40:32 localhost nova_compute[230600]: 2025-11-23 09:40:32.087 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:40:32 localhost nova_compute[230600]: 2025-11-23 09:40:32.089 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:40:32 localhost python3.9[267063]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:40:34 localhost python3.9[267175]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:40:35 localhost python3.9[267285]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:40:35 localhost python3.9[267342]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:40:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 04:40:36 localhost podman[267453]: 2025-11-23 09:40:36.388960791 +0000 UTC m=+0.115369501 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118) Nov 23 04:40:36 localhost podman[267453]: 2025-11-23 09:40:36.402433472 +0000 UTC m=+0.128842172 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:40:36 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 04:40:36 localhost python3.9[267452]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:40:36 localhost python3.9[267528]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:40:37 localhost nova_compute[230600]: 2025-11-23 09:40:37.090 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:40:37 localhost nova_compute[230600]: 2025-11-23 09:40:37.092 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:40:37 localhost python3.9[267638]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:40:39 localhost python3.9[267748]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:40:39 localhost 
python3.9[267805]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:40:40 localhost python3.9[267915]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:40:40 localhost python3.9[267972]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:40:41 localhost podman[240668]: time="2025-11-23T09:40:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:40:41 localhost podman[240668]: @ - - [23/Nov/2025:09:40:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 23 04:40:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 04:40:41 localhost podman[240668]: @ - - [23/Nov/2025:09:40:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17229 "" "Go-http-client/1.1" Nov 23 04:40:42 localhost systemd[1]: tmp-crun.OrlnuT.mount: Deactivated successfully. Nov 23 04:40:42 localhost podman[268082]: 2025-11-23 09:40:42.026499776 +0000 UTC m=+0.092512256 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:40:42 localhost podman[268082]: 2025-11-23 09:40:42.036538919 +0000 UTC m=+0.102551399 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 
'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:40:42 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:40:42 localhost nova_compute[230600]: 2025-11-23 09:40:42.093 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:40:42 localhost nova_compute[230600]: 2025-11-23 09:40:42.094 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:40:42 localhost python3.9[268083]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:40:42 localhost systemd[1]: Reloading. Nov 23 04:40:42 localhost systemd-rc-local-generator[268130]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:40:42 localhost systemd-sysv-generator[268133]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:40:43 localhost python3.9[268252]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:40:43 localhost python3.9[268309]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:40:44 localhost python3.9[268419]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:40:44 localhost python3.9[268476]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:40:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:40:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:40:45 localhost podman[268588]: 2025-11-23 09:40:45.533968407 +0000 UTC m=+0.066303616 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=edpm, build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Nov 23 04:40:45 localhost podman[268588]: 2025-11-23 09:40:45.547318344 +0000 UTC m=+0.079653543 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, architecture=x86_64)
Nov 23 04:40:45 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 04:40:45 localhost systemd[1]: tmp-crun.Jh0ZBS.mount: Deactivated successfully.
Nov 23 04:40:45 localhost podman[268587]: 2025-11-23 09:40:45.607051244 +0000 UTC m=+0.138069672 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 04:40:45 localhost podman[268587]: 2025-11-23 09:40:45.645436294 +0000 UTC m=+0.176454732 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible,
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 04:40:45 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 04:40:45 localhost python3.9[268586]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 04:40:45 localhost systemd[1]: Reloading.
Nov 23 04:40:45 localhost systemd-sysv-generator[268661]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 04:40:45 localhost systemd-rc-local-generator[268655]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 04:40:45 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:40:45 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 04:40:45 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:40:45 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:40:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 04:40:45 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 04:40:45 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:40:45 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:40:45 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:40:46 localhost systemd[1]: Starting Create netns directory...
Nov 23 04:40:46 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 04:40:46 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 04:40:46 localhost systemd[1]: Finished Create netns directory.
Nov 23 04:40:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27679 DF PROTO=TCP SPT=35376 DPT=9102 SEQ=1887070811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9BDCB0000000001030307)
Nov 23 04:40:47 localhost nova_compute[230600]: 2025-11-23 09:40:47.094 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:40:47 localhost nova_compute[230600]: 2025-11-23 09:40:47.097 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:40:47 localhost python3.9[268783]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:40:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27680 DF PROTO=TCP SPT=35376 DPT=9102 SEQ=1887070811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9C1E00000000001030307)
Nov 23 04:40:47 localhost python3.9[268893]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:40:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60
TOS=0x00 PREC=0x00 TTL=62 ID=40355 DF PROTO=TCP SPT=35852 DPT=9102 SEQ=3069657994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9C4210000000001030307)
Nov 23 04:40:48 localhost python3.9[268950]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/multipathd/ _original_basename=healthcheck recurse=False state=file path=/var/lib/openstack/healthchecks/multipathd/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:40:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 04:40:49 localhost podman[269061]: 2025-11-23 09:40:49.262814417 +0000 UTC m=+0.075077080 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z',
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 23 04:40:49 localhost podman[269061]: 2025-11-23 09:40:49.294763046 +0000 UTC m=+0.107025749 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z',
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 04:40:49 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 04:40:49 localhost python3.9[269060]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 04:40:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27681 DF PROTO=TCP SPT=35376 DPT=9102 SEQ=1887070811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9C9E00000000001030307)
Nov 23 04:40:50 localhost python3.9[269189]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 04:40:50 localhost nova_compute[230600]: 2025-11-23 09:40:50.333 230604 DEBUG
oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:40:50 localhost nova_compute[230600]: 2025-11-23 09:40:50.353 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Triggering sync for uuid 355032bc-9946-4f6d-817c-2bfc8694d41d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 23 04:40:50 localhost nova_compute[230600]: 2025-11-23 09:40:50.354 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 04:40:50 localhost nova_compute[230600]: 2025-11-23 09:40:50.354 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 04:40:50 localhost nova_compute[230600]: 2025-11-23 09:40:50.407 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 04:40:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00
PREC=0x00 TTL=62 ID=6700 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3734280524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9CE200000000001030307)
Nov 23 04:40:50 localhost python3.9[269246]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/multipathd.json _original_basename=.rf31qfvi recurse=False state=file path=/var/lib/kolla/config_files/multipathd.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:40:52 localhost nova_compute[230600]: 2025-11-23 09:40:52.097 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:40:52 localhost nova_compute[230600]: 2025-11-23 09:40:52.098 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:40:52 localhost nova_compute[230600]: 2025-11-23 09:40:52.099 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 23 04:40:52 localhost nova_compute[230600]: 2025-11-23 09:40:52.099 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:40:52 localhost nova_compute[230600]: 2025-11-23 09:40:52.100 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:40:52 localhost nova_compute[230600]: 2025-11-23 09:40:52.104 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:40:52 localhost python3.9[269356]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:40:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 04:40:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 04:40:53 localhost podman[269525]: 2025-11-23 09:40:53.405765814 +0000 UTC m=+0.076471365 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE':
'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors )
Nov 23 04:40:53 localhost podman[269525]: 2025-11-23 09:40:53.413833147 +0000 UTC m=+0.084538648 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 04:40:53 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated
successfully.
Nov 23 04:40:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27682 DF PROTO=TCP SPT=35376 DPT=9102 SEQ=1887070811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9D9A00000000001030307)
Nov 23 04:40:53 localhost systemd[1]: tmp-crun.2VjDmq.mount: Deactivated successfully.
Nov 23 04:40:53 localhost podman[269524]: 2025-11-23 09:40:53.473980578 +0000 UTC m=+0.146814834 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3,
org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 04:40:53 localhost podman[269524]: 2025-11-23 09:40:53.487188082 +0000 UTC m=+0.160022368 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z',
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 23 04:40:53 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 04:40:55 localhost python3.9[269676]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 23 04:40:56 localhost python3.9[269786]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 04:40:57 localhost nova_compute[230600]: 2025-11-23 09:40:57.102 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:40:57 localhost python3.9[269896]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 23 04:40:59 localhost openstack_network_exporter[242668]: ERROR 09:40:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:40:59 localhost openstack_network_exporter[242668]: ERROR 09:40:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:40:59 localhost openstack_network_exporter[242668]: ERROR 09:40:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 04:40:59 localhost openstack_network_exporter[242668]: ERROR 09:40:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 04:40:59 localhost openstack_network_exporter[242668]:
Nov 23 04:40:59 localhost openstack_network_exporter[242668]: ERROR 09:40:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 04:40:59 localhost openstack_network_exporter[242668]:
Nov 23 04:41:01 localhost kernel: DROPPING: IN=br-ex
OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27683 DF PROTO=TCP SPT=35376 DPT=9102 SEQ=1887070811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2D9FA210000000001030307) Nov 23 04:41:02 localhost nova_compute[230600]: 2025-11-23 09:41:02.104 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:41:02 localhost python3[270033]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Nov 23 04:41:02 localhost python3[270033]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072",#012 "Digest": "sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-21T06:11:34.680484424Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": 
"1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 249489385,#012 "VirtualSize": 249489385,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/4f2d68ffc7b0d2ad7154f194ce01f6add8f68d1c87ebccb7dfe58b78cf788c91/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/4f2d68ffc7b0d2ad7154f194ce01f6add8f68d1c87ebccb7dfe58b78cf788c91/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",#012 "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",#012 "sha256:d9e3e9c6b6b086eeb756b403557bba77ecef73e97936fb3285a5484cd95a1b1a"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-11-18T01:56:49.795434035Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:49.795512415Z",#012 
"created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251118\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:52.547242013Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-21T06:10:01.947310748Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947327778Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947358359Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947372589Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94738527Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94739397Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:02.324930938Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:36.349393468Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps 
False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:39.924297673Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:40.346524368Z",#012 Nov 23 04:41:03 localhost python3.9[270206]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:41:04 localhost python3.9[270318]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:04 localhost python3.9[270373]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:41:05 localhost python3.9[270482]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890864.8811224-1365-159441197507467/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False 
follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:06 localhost python3.9[270537]: ansible-systemd Invoked with state=started name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:41:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:41:07 localhost podman[270557]: 2025-11-23 09:41:07.032130873 +0000 UTC m=+0.085324731 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 23 04:41:07 localhost podman[270557]: 2025-11-23 09:41:07.07039541 +0000 UTC m=+0.123589228 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.build-date=20251118, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 04:41:07 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:41:07 localhost nova_compute[230600]: 2025-11-23 09:41:07.108 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:41:07 localhost nova_compute[230600]: 2025-11-23 09:41:07.109 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:41:07 localhost nova_compute[230600]: 2025-11-23 09:41:07.109 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:41:07 localhost nova_compute[230600]: 2025-11-23 09:41:07.110 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:41:07 localhost nova_compute[230600]: 2025-11-23 09:41:07.110 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:41:07 localhost nova_compute[230600]: 2025-11-23 09:41:07.113 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:41:08 localhost python3.9[270666]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True 
get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:41:08 localhost nova_compute[230600]: 2025-11-23 09:41:08.739 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:41:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:41:09.275 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:41:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:41:09.276 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:41:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:41:09.277 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:41:09 localhost python3.9[270776]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.803 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.803 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.837 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.838 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '69a42e57-cbe3-4dbb-a989-6f01bfe21027', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.804166', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b4b1b3a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': '27729339740b1614bf019b97411d824c12d181ab1f2a148f947bb085700d926d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.804166', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b4b2e90-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': '7080f87f7e3b005d2c3dcc72cc18e7ae63af121842224e8527e2a5a4adade0f5'}]}, 'timestamp': '2025-11-23 09:41:10.838656', '_unique_id': 'c8b7fae9de874bc68875ad1fc169c898'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.840 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.841 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.863 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4b920102-564b-49c0-9dbf-36f79c02212b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:41:10.841796', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8b4f13ac-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.041101861, 'message_signature': '754ae71fc1d2a2523fee484737491ff3b1c0288be71bbc8c65055a2421812b62'}]}, 'timestamp': '2025-11-23 09:41:10.864271', '_unique_id': 'd326c4fa784648a2a6f22a6c521c1b45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12
ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]:
2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23
04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:41:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.865 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.866 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 04:41:10
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.869 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4af77c4-34ef-47e5-8722-6b7418e9fa39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.866777', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b4ffd6c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376,
'message_signature': '35677048132aa0473c69c486c860677a5145eb5a6754d2c2708506e1234cb20b'}]}, 'timestamp': '2025-11-23 09:41:10.870222', '_unique_id': '64edd949590e4062aec5f827e622c52a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py",
line 653, in _send
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:41:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.871 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.872 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.872 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.872 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '34847494-690c-4a46-b699-3bb28627ca8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.873045', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b5080d4-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': '35722738a8de5b139adb88d9da1721f7c107c93289f813eebfb348f2e2d5abc6'}]}, 'timestamp': '2025-11-23 09:41:10.873551', '_unique_id': 'ee35aaa543614047b263a468f924c9b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:41:10
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.874 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.875 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.888 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.888 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41694fbe-b1ec-4732-98cc-6a3f3fde6bde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.875662', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b52da32-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.053289403, 'message_signature': 'fd96601f110761630c227e77a6e96883e21e20096b34da70119aacc7774ba706'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.875662', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b52ec66-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.053289403, 'message_signature': '105264140730d331b7ecfb51296b64af06736a77cda364c1c11206e13ccfb91e'}]}, 'timestamp': '2025-11-23 09:41:10.889380', '_unique_id': '9e23569aa48c4d8f92eded3277abc4cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.890 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.891 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.891 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.891 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.892 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 9228 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5f16e5c8-a0c3-45de-ad7f-8b51347b701f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9228, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.892010', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b53659c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': 'cd2dd67fadf94244822d369c52cf0080317f4868504f05138dcd99de1fb12e11'}]}, 'timestamp': '2025-11-23 09:41:10.892513', '_unique_id': '3f5c5aa5ba224260bce5f32998932e95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:41:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.893 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.894 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.894 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 165450591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 35057587 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30bdcf20-89f2-45ad-8e2d-f2a4c6cf8187', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165450591, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.894682', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b53cd5c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': '3adccb94deaea36080d0626f91cedfa046265fa86724e5fb943340d5942e6cec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35057587, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.894682', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b53dd9c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': 'f9e42b64f192843f73968b156340be646b055452dce93ccd1cdb293bacfb24df'}]}, 'timestamp': '2025-11-23 09:41:10.895552', '_unique_id': '89e1ba0fc4364254b2d13ac073c81cd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd1dcd36-4abc-45fc-ae1f-08593586da00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.897719', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b54441c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': '678ab4852825478dc8c6750141725c7e2217129e7dcd2c13a2a0764c481e823e'}]}, 'timestamp': '2025-11-23 09:41:10.898206', '_unique_id': '09a52185a282430ca803480108af7451'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.899 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.900 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de51a33e-8fb3-45bc-be92-b28ac3c048a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.900300', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b54a77c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': 'd29a153a363918c67208feeb3738bbd86711620d6804eaca83505559c44f135c'}]}, 'timestamp': '2025-11-23 09:41:10.900748', '_unique_id': 'ae07d79e33fb4c369d212723a45e4236'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.901 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c25df58-7efb-4208-ac87-a62a170bddc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.902808', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b550aa0-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': '6a180e936486056eff0087d7e56bbacf30310f5c07eccde36babb6f31f172b77'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.902808', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b551a9a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': 'e29b0066423dbd5090236f0d93308369302f8cdf2ac9acf722601c85ddaca93a'}]}, 'timestamp': '2025-11-23 09:41:10.903667', '_unique_id': 'c4dd42ccc2e84b2f90ac7233b4121b29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.905 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.905 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1fbd2269-a2f9-48a7-b5fc-0b1f808e787c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.905809', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b55802a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': 'fa1543d688da777754f71b3bc208ad4694d72feba6967227c15f25f93029ea81'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.905809', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b558ff2-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': 'fab7b7abc384e028dfef55e8dbea63b401e81be11a2b0e324c6db87ef9506dc3'}]}, 'timestamp': '2025-11-23 09:41:10.906668', '_unique_id': '4722f4ee58cf4295804d79951d748ac9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.907 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.908 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 55830000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e946d2dd-7d10-4a3f-a0b8-c5a4b3d59e70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 55830000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:41:10.908966', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8b55fa0a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.041101861, 'message_signature': '1d7d176bdf1bcccab2105beecee5573bd9464c85a5723cad6bba37e4c5f47924'}]}, 'timestamp': '2025-11-23 09:41:10.909399', '_unique_id': '51249c4142ac4a76832eacfb2bbc7f5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.910 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.911 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd92e9ab2-9784-4f9a-83ad-654a66498228', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.911480', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b565c84-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': '6f38f75e3b3bded32d12e5b1d2f7eb7c57a42d141a01f727fed04112804c7c06'}]}, 'timestamp': '2025-11-23 09:41:10.911973', '_unique_id': '8052cc825554485c944e2456480063c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.913 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.914 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1347736452 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 205057051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48cca1e8-caa1-47b5-82d2-e163e8c1b65e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1347736452, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.914630', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b56d7c2-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': '22bc33d64d239c809b6965b71d21f3c8bad7d123cf200b05ed907ac4cac56733'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205057051, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.914630', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b56ef00-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': 'a58358bbb4b21f009f9f289f286b2edb2018f6c0bd68d5289cb70566f4d11502'}]}, 'timestamp': '2025-11-23 09:41:10.915672', '_unique_id': '1febe2876caf448aa549a52a3eaf49e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.916 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:41:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf45ed6e-8ebd-4c17-93f4-9036587d60cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.918087', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b575ecc-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': '69762c25056fdb9eda583e52e5b2c34feab4d9ab466a9b20f8031cb7c3f2b9c7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 
'timestamp': '2025-11-23T09:41:10.918087', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b577402-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10813.981795226, 'message_signature': 'db60160a3c2ade5ffd1a17be18c5a0b654e748827349fa1f01cc656efe263b31'}]}, 'timestamp': '2025-11-23 09:41:10.919194', '_unique_id': '37f512c592ce4f5fa4c1d8fdd66412a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.920 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a72c3632-6fa1-42c1-9d31-5c392667f291', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.921348', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b57dab4-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': 'a723af9022739f1c749722f01071a0003b66d62f44881d3b62108d78691b069d'}]}, 'timestamp': '2025-11-23 09:41:10.921635', '_unique_id': '737078cbb55140a7bacff86a17e1a9fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:41:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.922 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab6b8275-20e1-4ea2-8174-160f29c2a3e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.922941', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b5818da-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': 'ed7c4ff51ffa1df761635fa3b675bca36151ddd458b08e5225a5b1afe47302bf'}]}, 'timestamp': '2025-11-23 09:41:10.923225', '_unique_id': 'd7c0cc9d13be45b48037731325fa5e2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:41:10.923 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:41:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.923 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '486baf48-8973-419f-a172-c46937319fc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.924515', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b58562e-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.053289403, 'message_signature': 'c3b4d0188b8719e2ab060a961ac31e49ad43a0e50df07d23f7b686dbf94fadad'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.924515', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b5860a6-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.053289403, 'message_signature': 'c920b7ef858bd9fdb2c0b05a03f3cb4803582be841f75f924a232a5679d09eaa'}]}, 'timestamp': '2025-11-23 09:41:10.925046', '_unique_id': 'ba375ad520a44965ae41869a97bb2924'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:41:10.925 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:41:10.925 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.925 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.926 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7fb913b5-58aa-4fe4-a8ca-710a43e58e4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.926515', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b58a462-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': '1cb20d14cf93b2111eb96e1ec338b4026118389d2128ddb31d7af63ff8ab7724'}]}, 'timestamp': '2025-11-23 09:41:10.926796', '_unique_id': '3a3d2f396d5e442193555cd27a6688d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:41:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.927 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.928 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15c4a913-1b0b-4154-a29b-9ab5df69ab09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:41:10.928234', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8b58e7c4-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.044408376, 'message_signature': '47b8cb313d901cb71d4f3447791078aaa0990ed13a6aab37bc19561bc60293de'}]}, 'timestamp': '2025-11-23 09:41:10.928525', '_unique_id': '43de97dda29e47e1a9292a1102d4fc65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 
04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9fe4b43c-d01f-414b-a2c5-95ce83daac61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:41:10.929856', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8b592824-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.053289403, 'message_signature': '064bd7bdc9a699defa3c1fd2076ba025f9dc62588aa2034ea463c98beba329d8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:41:10.929856', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8b5931f2-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10814.053289403, 'message_signature': '9196f3ce4b0ced2e7a2c46afb64019eadff96d4698479ca8c59edd572dee726a'}]}, 'timestamp': '2025-11-23 09:41:10.930404', '_unique_id': '1cf92fa132164437820a0eca47ec5e0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:41:10.930 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:41:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:41:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:41:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:41:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 04:41:11 localhost python3.9[270886]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Nov 23 04:41:11 localhost python3.9[270996]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Nov 23 04:41:11 localhost nova_compute[230600]: 2025-11-23 09:41:11.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:41:11 localhost nova_compute[230600]: 2025-11-23 09:41:11.718 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:41:11 localhost nova_compute[230600]: 2025-11-23 09:41:11.718 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:41:11 localhost podman[240668]: time="2025-11-23T09:41:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:41:11 localhost podman[240668]: @ - - [23/Nov/2025:09:41:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 23 04:41:11 localhost podman[240668]: @ - - [23/Nov/2025:09:41:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17228 "" "Go-http-client/1.1" Nov 23 04:41:12 localhost nova_compute[230600]: 2025-11-23 09:41:12.111 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:41:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:41:12 localhost systemd[1]: tmp-crun.LzQELM.mount: Deactivated successfully. 
Nov 23 04:41:12 localhost podman[271107]: 2025-11-23 09:41:12.353799303 +0000 UTC m=+0.083180953 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:41:12 localhost podman[271107]: 2025-11-23 09:41:12.38851593 +0000 UTC m=+0.117897540 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:41:12 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:41:12 localhost python3.9[271106]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:41:12 localhost nova_compute[230600]: 2025-11-23 09:41:12.791 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:41:12 localhost nova_compute[230600]: 2025-11-23 09:41:12.792 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:41:12 localhost nova_compute[230600]: 2025-11-23 09:41:12.792 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:41:12 localhost nova_compute[230600]: 2025-11-23 09:41:12.793 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:41:12 localhost python3.9[271186]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf 
_original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:13 localhost python3.9[271296]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:13 localhost nova_compute[230600]: 2025-11-23 09:41:13.813 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:41:13 localhost nova_compute[230600]: 2025-11-23 09:41:13.837 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:41:13 localhost nova_compute[230600]: 2025-11-23 09:41:13.838 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:41:13 localhost nova_compute[230600]: 2025-11-23 09:41:13.838 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:41:13 localhost nova_compute[230600]: 2025-11-23 09:41:13.839 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:41:13 localhost nova_compute[230600]: 2025-11-23 09:41:13.839 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:41:13 localhost nova_compute[230600]: 2025-11-23 09:41:13.839 230604 DEBUG 
oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:41:13 localhost nova_compute[230600]: 2025-11-23 09:41:13.858 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:41:13 localhost nova_compute[230600]: 2025-11-23 09:41:13.858 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:41:13 localhost nova_compute[230600]: 2025-11-23 09:41:13.859 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:41:13 localhost nova_compute[230600]: 2025-11-23 09:41:13.859 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:41:13 localhost nova_compute[230600]: 2025-11-23 09:41:13.860 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf 
/etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.265 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.319 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.320 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.553 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.554 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12124MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.555 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.555 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:41:14 localhost python3.9[271428]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.686 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this 
compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.687 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.687 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.703 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.759 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} 
_refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.760 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.778 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.817 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: 
HW_CPU_X86_F16C,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_RESCUE_BFV,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_BMI,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 23 04:41:14 localhost nova_compute[230600]: 2025-11-23 09:41:14.864 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:41:15 localhost nova_compute[230600]: 2025-11-23 09:41:15.279 230604 DEBUG oslo_concurrency.processutils [None 
req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:41:15 localhost nova_compute[230600]: 2025-11-23 09:41:15.286 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:41:15 localhost nova_compute[230600]: 2025-11-23 09:41:15.304 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:41:15 localhost nova_compute[230600]: 2025-11-23 09:41:15.307 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:41:15 localhost nova_compute[230600]: 2025-11-23 09:41:15.307 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.752s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:41:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:41:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:41:16 localhost podman[271453]: 2025-11-23 09:41:16.016363849 +0000 UTC m=+0.076213105 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Nov 23 04:41:16 localhost systemd[1]: tmp-crun.GPml6p.mount: Deactivated successfully. 
Nov 23 04:41:16 localhost podman[271453]: 2025-11-23 09:41:16.082510099 +0000 UTC m=+0.142359365 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 04:41:16 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:41:16 localhost podman[271454]: 2025-11-23 09:41:16.090058685 +0000 UTC m=+0.147296459 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9) Nov 23 04:41:16 localhost podman[271454]: 2025-11-23 09:41:16.174326662 +0000 UTC m=+0.231564426 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.buildah.version=1.33.7) 
Nov 23 04:41:16 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:41:16 localhost nova_compute[230600]: 2025-11-23 09:41:16.303 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:41:16 localhost nova_compute[230600]: 2025-11-23 09:41:16.304 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:41:16 localhost nova_compute[230600]: 2025-11-23 09:41:16.304 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:41:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6841 DF PROTO=TCP SPT=38486 DPT=9102 SEQ=2643566931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DA32FB0000000001030307) Nov 23 04:41:16 localhost nova_compute[230600]: 2025-11-23 09:41:16.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:41:17 localhost nova_compute[230600]: 2025-11-23 09:41:17.113 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:41:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6842 DF PROTO=TCP SPT=38486 DPT=9102 SEQ=2643566931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DA37200000000001030307) Nov 23 04:41:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27684 DF PROTO=TCP SPT=35376 DPT=9102 SEQ=1887070811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DA3A210000000001030307) Nov 23 04:41:18 localhost python3.9[271725]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 23 04:41:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 
TOS=0x00 PREC=0x00 TTL=62 ID=6843 DF PROTO=TCP SPT=38486 DPT=9102 SEQ=2643566931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DA3F210000000001030307) Nov 23 04:41:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:41:19 localhost podman[271840]: 2025-11-23 09:41:19.60274657 +0000 UTC m=+0.071196959 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Nov 23 04:41:19 localhost podman[271840]: 2025-11-23 09:41:19.635039021 +0000 UTC m=+0.103489429 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Nov 23 04:41:19 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:41:19 localhost python3.9[271839]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40356 DF PROTO=TCP SPT=35852 DPT=9102 SEQ=3069657994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DA42200000000001030307) Nov 23 04:41:20 localhost python3.9[271967]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 04:41:20 localhost systemd[1]: Reloading. Nov 23 04:41:21 localhost systemd-rc-local-generator[272010]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:41:21 localhost systemd-sysv-generator[272015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:41:21 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:21 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:21 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:21 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:41:21 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:21 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:21 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:21 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:22 localhost nova_compute[230600]: 2025-11-23 09:41:22.115 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:41:22 localhost nova_compute[230600]: 2025-11-23 09:41:22.120 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:41:22 localhost python3.9[272130]: ansible-ansible.builtin.service_facts Invoked Nov 23 04:41:22 localhost network[272147]: You are using 'network' service provided by 'network-scripts', which are now deprecated. 
Nov 23 04:41:22 localhost network[272148]: 'network-scripts' will be removed from distribution in near future. Nov 23 04:41:22 localhost network[272149]: It is advised to switch to 'NetworkManager' instead for network management. Nov 23 04:41:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6844 DF PROTO=TCP SPT=38486 DPT=9102 SEQ=2643566931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DA4EE00000000001030307) Nov 23 04:41:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:41:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:41:23 localhost systemd[1]: tmp-crun.G4rFQF.mount: Deactivated successfully. Nov 23 04:41:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:41:23 localhost podman[272198]: 2025-11-23 09:41:23.633490405 +0000 UTC m=+0.104385997 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 04:41:23 localhost podman[272185]: 2025-11-23 09:41:23.590996995 +0000 UTC m=+0.118678335 container health_status 
a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 04:41:23 localhost podman[272198]: 2025-11-23 09:41:23.648276157 +0000 UTC m=+0.119171789 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS 
Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible) Nov 23 04:41:23 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:41:23 localhost podman[272185]: 2025-11-23 09:41:23.676390757 +0000 UTC m=+0.204072107 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:41:23 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:41:27 localhost nova_compute[230600]: 2025-11-23 09:41:27.117 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:41:28 localhost python3.9[272425]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:41:28 localhost python3.9[272536]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:41:29 localhost python3.9[272647]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:41:29 localhost openstack_network_exporter[242668]: ERROR 09:41:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:41:29 localhost openstack_network_exporter[242668]: ERROR 09:41:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:41:29 localhost openstack_network_exporter[242668]: ERROR 09:41:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:41:29 localhost openstack_network_exporter[242668]: ERROR 09:41:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:41:29 localhost openstack_network_exporter[242668]: Nov 23 04:41:29 localhost openstack_network_exporter[242668]: ERROR 09:41:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:41:29 localhost openstack_network_exporter[242668]: Nov 23 04:41:30 localhost python3.9[272758]: 
ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:41:31 localhost python3.9[272869]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:41:31 localhost python3.9[272980]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:41:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6845 DF PROTO=TCP SPT=38486 DPT=9102 SEQ=2643566931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DA70210000000001030307) Nov 23 04:41:32 localhost nova_compute[230600]: 2025-11-23 09:41:32.120 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:41:32 localhost python3.9[273091]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:41:33 localhost python3.9[273202]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:41:37 localhost nova_compute[230600]: 2025-11-23 09:41:37.122 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:41:37 localhost 
python3.9[273313]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:41:37 localhost podman[273423]: 2025-11-23 09:41:37.6431132 +0000 UTC m=+0.074904065 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:41:37 localhost podman[273423]: 2025-11-23 09:41:37.654239939 +0000 UTC m=+0.086030804 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 23 04:41:37 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:41:37 localhost python3.9[273424]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:39 localhost python3.9[273555]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:40 localhost python3.9[273665]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:41 localhost python3.9[273775]: ansible-ansible.builtin.file Invoked with 
path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:41 localhost podman[240668]: time="2025-11-23T09:41:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:41:41 localhost podman[240668]: @ - - [23/Nov/2025:09:41:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 23 04:41:41 localhost podman[240668]: @ - - [23/Nov/2025:09:41:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17229 "" "Go-http-client/1.1" Nov 23 04:41:42 localhost nova_compute[230600]: 2025-11-23 09:41:42.126 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:41:42 localhost nova_compute[230600]: 2025-11-23 09:41:42.127 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:41:42 localhost nova_compute[230600]: 2025-11-23 09:41:42.128 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:41:42 localhost nova_compute[230600]: 2025-11-23 09:41:42.128 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:41:42 localhost nova_compute[230600]: 2025-11-23 09:41:42.144 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:41:42 localhost nova_compute[230600]: 2025-11-23 09:41:42.145 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:41:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:41:42 localhost podman[273886]: 2025-11-23 09:41:42.585963817 +0000 UTC m=+0.085894098 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:41:42 localhost podman[273886]: 2025-11-23 09:41:42.593532594 +0000 UTC m=+0.093462925 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:41:42 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:41:42 localhost python3.9[273885]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:43 localhost python3.9[274016]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:43 localhost python3.9[274126]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:44 localhost python3.9[274236]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:45 localhost python3.9[274346]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:45 localhost python3.9[274456]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:41:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 04:41:46 localhost podman[274566]: 2025-11-23 09:41:46.290276119 +0000 UTC m=+0.090284187 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 04:41:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3795 DF PROTO=TCP SPT=49840 DPT=9102 SEQ=1940737006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DAA82B0000000001030307) Nov 23 04:41:46 localhost podman[274568]: 2025-11-23 09:41:46.262865531 +0000 UTC m=+0.061689801 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.buildah.version=1.33.7) Nov 23 04:41:46 localhost podman[274568]: 2025-11-23 09:41:46.384037713 +0000 UTC m=+0.182862023 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350) Nov 23 04:41:46 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:41:46 localhost podman[274566]: 2025-11-23 09:41:46.399231618 +0000 UTC m=+0.199239676 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible) Nov 23 04:41:46 localhost python3.9[274567]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:46 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:41:47 localhost python3.9[274719]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:47 localhost nova_compute[230600]: 2025-11-23 09:41:47.146 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:41:47 localhost nova_compute[230600]: 2025-11-23 09:41:47.152 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:41:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3796 DF PROTO=TCP SPT=49840 DPT=9102 SEQ=1940737006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2DAAC210000000001030307) Nov 23 04:41:47 localhost python3.9[274829]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:48 localhost python3.9[274939]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6846 DF PROTO=TCP SPT=38486 DPT=9102 SEQ=2643566931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DAB0200000000001030307) Nov 23 04:41:48 localhost python3.9[275049]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:41:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3797 DF 
PROTO=TCP SPT=49840 DPT=9102 SEQ=1940737006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DAB4210000000001030307) Nov 23 04:41:49 localhost python3.9[275159]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:41:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:41:49 localhost podman[275162]: 2025-11-23 09:41:49.851595827 +0000 UTC m=+0.076953639 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:41:49 localhost podman[275162]: 2025-11-23 09:41:49.862339584 +0000 UTC m=+0.087697456 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent) Nov 23 04:41:49 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:41:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27685 DF PROTO=TCP SPT=35376 DPT=9102 SEQ=1887070811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DAB8200000000001030307) Nov 23 04:41:50 localhost python3.9[275287]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 23 04:41:51 localhost python3.9[275397]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 23 04:41:51 localhost systemd[1]: Reloading. Nov 23 04:41:51 localhost systemd-rc-local-generator[275425]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:41:51 localhost systemd-sysv-generator[275428]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:41:51 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:51 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:51 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:51 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:41:51 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:51 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:51 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:51 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:41:52 localhost nova_compute[230600]: 2025-11-23 09:41:52.151 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4993-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:41:52 localhost nova_compute[230600]: 2025-11-23 09:41:52.153 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:41:52 localhost nova_compute[230600]: 2025-11-23 09:41:52.154 230604 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:41:52 localhost nova_compute[230600]: 2025-11-23 09:41:52.154 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:41:52 localhost nova_compute[230600]: 2025-11-23 09:41:52.179 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:41:52 localhost nova_compute[230600]: 2025-11-23 09:41:52.180 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:41:52 localhost python3.9[275543]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:41:53 localhost python3.9[275654]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:41:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3798 DF PROTO=TCP SPT=49840 DPT=9102 SEQ=1940737006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DAC3E10000000001030307) Nov 23 04:41:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. 
Nov 23 04:41:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:41:53 localhost systemd[1]: tmp-crun.ULYjwZ.mount: Deactivated successfully. Nov 23 04:41:53 localhost podman[275766]: 2025-11-23 09:41:53.944918121 +0000 UTC m=+0.139918929 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 
9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2) Nov 23 04:41:53 localhost podman[275766]: 2025-11-23 09:41:53.955202623 +0000 UTC m=+0.150203421 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 04:41:53 localhost python3.9[275765]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl 
reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:41:53 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:41:54 localhost podman[275767]: 2025-11-23 09:41:53.908068288 +0000 UTC m=+0.103222651 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:41:54 localhost podman[275767]: 2025-11-23 09:41:54.043181546 +0000 UTC m=+0.238335919 container 
exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:41:54 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:41:54 localhost python3.9[275917]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:41:55 localhost python3.9[276028]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:41:56 localhost python3.9[276139]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:41:57 localhost python3.9[276250]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:41:57 localhost nova_compute[230600]: 2025-11-23 09:41:57.181 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:41:57 localhost nova_compute[230600]: 2025-11-23 09:41:57.182 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:41:57 localhost nova_compute[230600]: 2025-11-23 09:41:57.183 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m 
Nov 23 04:41:57 localhost nova_compute[230600]: 2025-11-23 09:41:57.183 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:41:57 localhost nova_compute[230600]: 2025-11-23 09:41:57.228 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:41:57 localhost nova_compute[230600]: 2025-11-23 09:41:57.228 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:41:58 localhost python3.9[276361]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:41:59 localhost openstack_network_exporter[242668]: ERROR 09:41:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:41:59 localhost openstack_network_exporter[242668]: ERROR 09:41:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:41:59 localhost openstack_network_exporter[242668]: ERROR 09:41:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:41:59 localhost openstack_network_exporter[242668]: ERROR 09:41:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:41:59 localhost openstack_network_exporter[242668]: Nov 23 04:41:59 localhost openstack_network_exporter[242668]: ERROR 09:41:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:41:59 localhost openstack_network_exporter[242668]: Nov 23 04:42:00 
localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 04:42:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.1 total, 600.0 interval#012Cumulative writes: 4844 writes, 21K keys, 4844 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4844 writes, 618 syncs, 7.84 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 04:42:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3799 DF PROTO=TCP SPT=49840 DPT=9102 SEQ=1940737006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DAE4200000000001030307) Nov 23 04:42:02 localhost nova_compute[230600]: 2025-11-23 09:42:02.230 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:02 localhost nova_compute[230600]: 2025-11-23 09:42:02.231 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:02 localhost nova_compute[230600]: 2025-11-23 09:42:02.232 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:42:02 localhost nova_compute[230600]: 2025-11-23 09:42:02.232 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:02 
localhost nova_compute[230600]: 2025-11-23 09:42:02.296 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:42:02 localhost nova_compute[230600]: 2025-11-23 09:42:02.296 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:03 localhost python3.9[276472]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:04 localhost python3.9[276582]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 04:42:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.1 total, 600.0 interval#012Cumulative writes: 5736 writes, 25K keys, 5736 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5736 writes, 788 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 
MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 04:42:05 localhost python3.9[276692]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:05 localhost python3.9[276802]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:06 localhost python3.9[276912]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:06 localhost python3.9[277022]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 
04:42:07 localhost nova_compute[230600]: 2025-11-23 09:42:07.298 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:07 localhost nova_compute[230600]: 2025-11-23 09:42:07.300 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:07 localhost nova_compute[230600]: 2025-11-23 09:42:07.300 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:42:07 localhost nova_compute[230600]: 2025-11-23 09:42:07.300 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:07 localhost nova_compute[230600]: 2025-11-23 09:42:07.328 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:42:07 localhost nova_compute[230600]: 2025-11-23 09:42:07.329 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:07 localhost python3.9[277132]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 04:42:08 localhost systemd[1]: tmp-crun.JGx2rD.mount: Deactivated successfully. Nov 23 04:42:08 localhost podman[277223]: 2025-11-23 09:42:08.056218871 +0000 UTC m=+0.101924474 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 23 04:42:08 localhost podman[277223]: 2025-11-23 
09:42:08.070271307 +0000 UTC m=+0.115976920 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:42:08 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 04:42:08 localhost python3.9[277253]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:09 localhost python3.9[277372]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:42:09.276 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:42:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:42:09.277 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:42:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:42:09.278 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:42:09 localhost nova_compute[230600]: 2025-11-23 09:42:09.716 230604 DEBUG 
oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:42:10 localhost python3.9[277482]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:11 localhost nova_compute[230600]: 2025-11-23 09:42:11.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:42:11 localhost podman[240668]: time="2025-11-23T09:42:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:42:11 localhost podman[240668]: @ - - [23/Nov/2025:09:42:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 23 04:42:11 localhost podman[240668]: @ - - [23/Nov/2025:09:42:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17225 "" "Go-http-client/1.1" Nov 23 04:42:12 localhost nova_compute[230600]: 2025-11-23 09:42:12.329 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:12 localhost nova_compute[230600]: 2025-11-23 09:42:12.331 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:12 localhost nova_compute[230600]: 2025-11-23 09:42:12.331 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:42:12 localhost nova_compute[230600]: 2025-11-23 09:42:12.331 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:12 localhost nova_compute[230600]: 2025-11-23 09:42:12.369 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:42:12 localhost nova_compute[230600]: 2025-11-23 09:42:12.370 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:12 localhost nova_compute[230600]: 2025-11-23 09:42:12.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:42:12 localhost nova_compute[230600]: 2025-11-23 09:42:12.718 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:42:12 localhost nova_compute[230600]: 2025-11-23 09:42:12.719 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:42:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:42:13 localhost podman[277500]: 2025-11-23 09:42:13.036647947 +0000 UTC m=+0.087441155 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:42:13 localhost nova_compute[230600]: 2025-11-23 09:42:13.039 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:42:13 localhost nova_compute[230600]: 2025-11-23 09:42:13.040 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:42:13 localhost nova_compute[230600]: 2025-11-23 09:42:13.040 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] 
Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:42:13 localhost nova_compute[230600]: 2025-11-23 09:42:13.040 230604 DEBUG nova.objects.instance [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:42:13 localhost podman[277500]: 2025-11-23 09:42:13.049162074 +0000 UTC m=+0.099955272 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:42:13 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. 
Nov 23 04:42:13 localhost nova_compute[230600]: 2025-11-23 09:42:13.827 230604 DEBUG nova.network.neutron [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:42:13 localhost nova_compute[230600]: 2025-11-23 09:42:13.843 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:42:13 localhost nova_compute[230600]: 2025-11-23 09:42:13.844 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] 
Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:42:13 localhost nova_compute[230600]: 2025-11-23 09:42:13.845 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:42:13 localhost nova_compute[230600]: 2025-11-23 09:42:13.845 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:42:13 localhost nova_compute[230600]: 2025-11-23 09:42:13.865 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:42:13 localhost nova_compute[230600]: 2025-11-23 09:42:13.865 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:42:13 localhost nova_compute[230600]: 2025-11-23 09:42:13.866 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:42:13 localhost nova_compute[230600]: 2025-11-23 09:42:13.866 230604 
DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:42:13 localhost nova_compute[230600]: 2025-11-23 09:42:13.866 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:42:14 localhost nova_compute[230600]: 2025-11-23 09:42:14.333 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:42:14 localhost nova_compute[230600]: 2025-11-23 09:42:14.416 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:42:14 localhost nova_compute[230600]: 2025-11-23 09:42:14.416 230604 DEBUG nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:42:14 localhost nova_compute[230600]: 2025-11-23 09:42:14.616 230604 WARNING nova.virt.libvirt.driver [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:42:14 localhost nova_compute[230600]: 2025-11-23 09:42:14.619 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12125MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:42:14 localhost nova_compute[230600]: 2025-11-23 09:42:14.619 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:42:14 localhost nova_compute[230600]: 2025-11-23 09:42:14.620 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:42:14 localhost nova_compute[230600]: 2025-11-23 09:42:14.683 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:42:14 localhost nova_compute[230600]: 2025-11-23 09:42:14.684 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:42:14 localhost nova_compute[230600]: 2025-11-23 09:42:14.684 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:42:14 localhost nova_compute[230600]: 2025-11-23 09:42:14.728 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:42:15 localhost nova_compute[230600]: 2025-11-23 09:42:15.187 230604 DEBUG oslo_concurrency.processutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:42:15 localhost nova_compute[230600]: 2025-11-23 09:42:15.194 230604 DEBUG nova.compute.provider_tree [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:42:15 localhost nova_compute[230600]: 
2025-11-23 09:42:15.208 230604 DEBUG nova.scheduler.client.report [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:42:15 localhost nova_compute[230600]: 2025-11-23 09:42:15.210 230604 DEBUG nova.compute.resource_tracker [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:42:15 localhost nova_compute[230600]: 2025-11-23 09:42:15.211 230604 DEBUG oslo_concurrency.lockutils [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:42:16 localhost nova_compute[230600]: 2025-11-23 09:42:16.207 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:42:16 localhost nova_compute[230600]: 2025-11-23 09:42:16.207 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:42:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16619 DF PROTO=TCP SPT=44832 DPT=9102 SEQ=715820687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB1D5B0000000001030307) Nov 23 04:42:16 localhost nova_compute[230600]: 2025-11-23 09:42:16.716 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:42:16 localhost nova_compute[230600]: 2025-11-23 09:42:16.717 230604 DEBUG nova.compute.manager [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:42:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:42:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:42:17 localhost podman[277568]: 2025-11-23 09:42:17.021676142 +0000 UTC m=+0.073391541 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., 
distribution-scope=public, architecture=x86_64, config_id=edpm) Nov 23 04:42:17 localhost podman[277568]: 2025-11-23 09:42:17.037516934 +0000 UTC m=+0.089232333 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container) Nov 23 04:42:17 localhost podman[277567]: 2025-11-23 09:42:17.073163905 +0000 UTC m=+0.126808694 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:42:17 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:42:17 localhost podman[277567]: 2025-11-23 09:42:17.168464009 +0000 UTC m=+0.222108778 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, container_name=ovn_controller) Nov 23 04:42:17 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: 
Deactivated successfully. Nov 23 04:42:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16620 DF PROTO=TCP SPT=44832 DPT=9102 SEQ=715820687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB21610000000001030307) Nov 23 04:42:17 localhost nova_compute[230600]: 2025-11-23 09:42:17.372 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:42:17 localhost nova_compute[230600]: 2025-11-23 09:42:17.717 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:42:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3800 DF PROTO=TCP SPT=49840 DPT=9102 SEQ=1940737006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB24200000000001030307) Nov 23 04:42:18 localhost python3.9[277705]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Nov 23 04:42:18 localhost nova_compute[230600]: 2025-11-23 09:42:18.712 230604 DEBUG oslo_service.periodic_task [None req-609f7711-c234-4528-9a48-9d4cb87734b9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:42:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16621 DF PROTO=TCP SPT=44832 DPT=9102 SEQ=715820687 ACK=0 WINDOW=32640 RES=0x00 
SYN URGP=0 OPT (020405500402080AD2DB29610000000001030307) Nov 23 04:42:19 localhost sshd[277724]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:42:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:42:19 localhost systemd-logind[761]: New session 60 of user zuul. Nov 23 04:42:19 localhost systemd[1]: Started Session 60 of User zuul. Nov 23 04:42:20 localhost podman[277727]: 2025-11-23 09:42:20.023542462 +0000 UTC m=+0.087220258 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:42:20 localhost podman[277727]: 2025-11-23 09:42:20.031179335 +0000 UTC m=+0.094857131 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:42:20 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:42:20 localhost systemd[1]: session-60.scope: Deactivated successfully. Nov 23 04:42:20 localhost systemd-logind[761]: Session 60 logged out. Waiting for processes to exit. Nov 23 04:42:20 localhost systemd-logind[761]: Removed session 60. Nov 23 04:42:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6847 DF PROTO=TCP SPT=38486 DPT=9102 SEQ=2643566931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB2E220000000001030307) Nov 23 04:42:20 localhost python3.9[277851]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:42:21 localhost python3.9[277937]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890940.2437236-3039-273523693651941/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:21 localhost python3.9[278096]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True 
get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:42:22 localhost python3.9[278167]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:22 localhost nova_compute[230600]: 2025-11-23 09:42:22.375 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:22 localhost nova_compute[230600]: 2025-11-23 09:42:22.377 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:22 localhost nova_compute[230600]: 2025-11-23 09:42:22.377 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:42:22 localhost nova_compute[230600]: 2025-11-23 09:42:22.377 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:22 localhost nova_compute[230600]: 2025-11-23 09:42:22.408 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:42:22 localhost nova_compute[230600]: 2025-11-23 09:42:22.409 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:22 localhost 
python3.9[278275]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:42:23 localhost python3.9[278361]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890942.4376733-3039-234281955856509/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16622 DF PROTO=TCP SPT=44832 DPT=9102 SEQ=715820687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB39200000000001030307) Nov 23 04:42:24 localhost sshd[278433]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:42:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:42:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:42:25 localhost podman[278473]: 2025-11-23 09:42:25.02861655 +0000 UTC m=+0.078505872 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:42:25 localhost podman[278473]: 2025-11-23 09:42:25.038495663 +0000 UTC m=+0.088385005 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 04:42:25 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:42:25 localhost python3.9[278471]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:42:25 localhost podman[278472]: 2025-11-23 09:42:25.092112834 +0000 UTC m=+0.142170151 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:42:25 localhost podman[278472]: 2025-11-23 09:42:25.113484793 +0000 UTC m=+0.163542160 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 04:42:25 localhost 
systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:42:25 localhost python3.9[278613]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890944.2910163-3039-96479055692771/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=c48862f04c3bb6bb101bc9efe68e434d3f83ed7a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:26 localhost python3.9[278721]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:42:27 localhost python3.9[278807]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890945.7078178-3039-133532006022135/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:27 localhost nova_compute[230600]: 2025-11-23 09:42:27.410 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:27 localhost nova_compute[230600]: 2025-11-23 09:42:27.411 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:27 localhost nova_compute[230600]: 2025-11-23 09:42:27.411 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:42:27 localhost nova_compute[230600]: 2025-11-23 09:42:27.412 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:27 localhost nova_compute[230600]: 2025-11-23 09:42:27.446 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:42:27 localhost nova_compute[230600]: 2025-11-23 09:42:27.447 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:27 localhost python3.9[278915]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:42:28 localhost python3.9[279001]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763890947.465684-3039-234719898134482/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:29 localhost python3.9[279111]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False 
follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:42:29 localhost python3.9[279221]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:42:29 localhost openstack_network_exporter[242668]: ERROR 09:42:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:42:29 localhost openstack_network_exporter[242668]: ERROR 09:42:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:42:29 localhost openstack_network_exporter[242668]: ERROR 09:42:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:42:29 localhost openstack_network_exporter[242668]: ERROR 09:42:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:42:29 localhost openstack_network_exporter[242668]: Nov 23 04:42:29 localhost openstack_network_exporter[242668]: ERROR 09:42:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:42:29 localhost openstack_network_exporter[242668]: Nov 23 04:42:30 localhost python3.9[279331]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:42:31 localhost python3.9[279443]: ansible-ansible.builtin.file 
Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:42:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16623 DF PROTO=TCP SPT=44832 DPT=9102 SEQ=715820687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB5A200000000001030307) Nov 23 04:42:32 localhost python3.9[279551]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:42:32 localhost nova_compute[230600]: 2025-11-23 09:42:32.448 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:32 localhost nova_compute[230600]: 2025-11-23 09:42:32.453 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:32 localhost nova_compute[230600]: 2025-11-23 09:42:32.453 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:42:32 localhost nova_compute[230600]: 2025-11-23 09:42:32.453 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:32 localhost nova_compute[230600]: 2025-11-23 09:42:32.480 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:42:32 localhost nova_compute[230600]: 2025-11-23 09:42:32.482 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:33 localhost python3.9[279661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:42:33 localhost python3.9[279716]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 23 04:42:34 localhost python3.9[279824]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 23 04:42:34 localhost python3.9[279879]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None 
Nov 23 04:42:35 localhost python3.9[279989]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False Nov 23 04:42:36 localhost python3.9[280099]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 23 04:42:37 localhost python3[280209]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False Nov 23 04:42:37 localhost python3[280209]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66",#012 "Digest": "sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-21T06:33:31.011385583Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 
"StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1211770748,#012 "VirtualSize": 1211770748,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",#012 "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",#012 "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",#012 "sha256:b9b598b1eb0c08906fe1bc9a64fc0e72719a6197d83669d2eb4309e69a00aa62",#012 "sha256:33e3811ab7487b27336fdf94252d5a875b17efb438cbc4ffc943f851ad3eceb6"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": 
"2025-11-18T01:56:49.795434035Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:49.795512415Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251118\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:52.547242013Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-21T06:10:01.947310748Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947327778Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947358359Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947372589Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94738527Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94739397Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:02.324930938Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:36.349393468Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main 
override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Nov 23 04:42:37 localhost nova_compute[230600]: 2025-11-23 09:42:37.483 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:37 localhost nova_compute[230600]: 2025-11-23 09:42:37.484 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:37 localhost nova_compute[230600]: 2025-11-23 09:42:37.485 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:42:37 localhost nova_compute[230600]: 2025-11-23 09:42:37.485 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:37 localhost nova_compute[230600]: 2025-11-23 09:42:37.517 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:42:37 localhost nova_compute[230600]: 2025-11-23 09:42:37.517 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE 
_transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:42:38 localhost podman[280386]: 2025-11-23 09:42:38.252557712 +0000 UTC m=+0.096423149 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 04:42:38 localhost podman[280386]: 2025-11-23 09:42:38.285834298 +0000 UTC m=+0.129699765 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute) Nov 23 04:42:38 localhost systemd[1]: 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:42:38 localhost python3.9[280385]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:42:40 localhost python3.9[280516]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False Nov 23 04:42:40 localhost python3.9[280626]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 23 04:42:41 localhost podman[240668]: time="2025-11-23T09:42:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:42:41 localhost podman[240668]: @ - - [23/Nov/2025:09:42:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 23 04:42:41 localhost podman[240668]: @ - - [23/Nov/2025:09:42:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17229 "" "Go-http-client/1.1" Nov 23 04:42:42 localhost python3[280736]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False Nov 23 04:42:42 localhost nova_compute[230600]: 2025-11-23 09:42:42.518 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:42 localhost nova_compute[230600]: 2025-11-23 09:42:42.520 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:42 localhost nova_compute[230600]: 2025-11-23 
09:42:42.520 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:42:42 localhost nova_compute[230600]: 2025-11-23 09:42:42.521 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:42 localhost nova_compute[230600]: 2025-11-23 09:42:42.581 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:42:42 localhost nova_compute[230600]: 2025-11-23 09:42:42.582 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:42 localhost python3[280736]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66",#012 "Digest": "sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-21T06:33:31.011385583Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 
"org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1211770748,#012 "VirtualSize": 1211770748,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/bba04dbd2e113925874f779bcae6bca7316b7a66f7b274a8553d6a317f393238/diff:/var/lib/containers/storage/overlay/0f9d0c5706eafcb7728267ea9fd9ea64ca261add44f9daa2b074c99c0b87c98c/diff:/var/lib/containers/storage/overlay/6e9f200c79821db3abfada9ff652f9bd648429ed9bddf6ca26f58a14a261f068/diff:/var/lib/containers/storage/overlay/ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/5d77d5f2ee8b05529dee3e4c9abd16672fdc0c09b8af133b2b88b75f7f780254/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:ccfb371f2e163f0c4b39cf6c44930e270547d620598331da99955639b81e1ba6",#012 "sha256:573e98f577c8b1610c1485067040ff856a142394fcd22ad4cb9c66b7d1de6bef",#012 "sha256:5a71e5d7d31f15255619cb8b9384b708744757c93993652418b0f45b0c0931d5",#012 "sha256:b9b598b1eb0c08906fe1bc9a64fc0e72719a6197d83669d2eb4309e69a00aa62",#012 "sha256:33e3811ab7487b27336fdf94252d5a875b17efb438cbc4ffc943f851ad3eceb6"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251118",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": 
"7b76510d5d5adf2ccf627d29bb9dae76",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-11-18T01:56:49.795434035Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:6d427dd138d2b0977a7ef7feaa8bd82d04e99cc5f4a16d555d6cff0cb52d43c6 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:49.795512415Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251118\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-18T01:56:52.547242013Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-21T06:10:01.947310748Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947327778Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947358359Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.947372589Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94738527Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:01.94739397Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:02.324930938Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all 
&& rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-21T06:10:36.349393468Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Nov 23 04:42:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 04:42:43 localhost podman[280910]: 2025-11-23 09:42:43.473424827 +0000 UTC m=+0.081006751 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:42:43 localhost podman[280910]: 2025-11-23 09:42:43.483205377 +0000 UTC m=+0.090787301 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:42:43 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:42:43 localhost python3.9[280909]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:42:44 localhost python3.9[281045]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:42:45 localhost python3.9[281154]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763890964.4477117-3716-179323596515358/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:42:45 localhost python3.9[281209]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 23 04:42:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35664 DF PROTO=TCP SPT=60284 DPT=9102 SEQ=4122196956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2DB928B0000000001030307) Nov 23 04:42:47 localhost python3.9[281319]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:42:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35665 DF PROTO=TCP SPT=60284 DPT=9102 SEQ=4122196956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB96A10000000001030307) Nov 23 04:42:47 localhost nova_compute[230600]: 2025-11-23 09:42:47.583 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:47 localhost nova_compute[230600]: 2025-11-23 09:42:47.585 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:47 localhost nova_compute[230600]: 2025-11-23 09:42:47.585 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:42:47 localhost nova_compute[230600]: 2025-11-23 09:42:47.585 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:47 localhost nova_compute[230600]: 2025-11-23 09:42:47.622 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:42:47 localhost nova_compute[230600]: 2025-11-23 09:42:47.623 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:47 
localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:42:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:42:48 localhost python3.9[281427]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:42:48 localhost systemd[1]: tmp-crun.IFA7lD.mount: Deactivated successfully. Nov 23 04:42:48 localhost podman[281428]: 2025-11-23 09:42:48.012826211 +0000 UTC m=+0.074982870 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, 
org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, tcib_managed=true) Nov 23 04:42:48 localhost podman[281428]: 2025-11-23 09:42:48.072395181 +0000 UTC m=+0.134551820 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 23 04:42:48 localhost podman[281429]: 2025-11-23 09:42:48.080494787 +0000 UTC m=+0.139312590 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, vendor=Red Hat, Inc.) Nov 23 04:42:48 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:42:48 localhost podman[281429]: 2025-11-23 09:42:48.102359531 +0000 UTC m=+0.161177404 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9) Nov 23 04:42:48 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
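Each health check in this stream emits a paired `health_status` / `exec_died` event for the same container. A small sketch (an annotation on this log, not part of it) using the podman_exporter pair near the top of this excerpt; the assumption here is that podman's `m=+N.N` timestamp suffix is Go's monotonic-clock reading in seconds since the podman process started, so the difference between the two readings approximates the health-check exec overhead:

```python
import re

# Two events for the same podman_exporter container, as logged near the top
# of this excerpt: the health_status event and the exec_died event that
# follows it. The "m=+N.N" suffix on the timestamp is Go's monotonic clock
# reading, in seconds since the logging process started.
health = "2025-11-23 09:42:43.473424827 +0000 UTC m=+0.081006751 container health_status"
died   = "2025-11-23 09:42:43.483205377 +0000 UTC m=+0.090787301 container exec_died"

def mono(line: str) -> float:
    """Return the monotonic offset (seconds) embedded in a podman event line."""
    return float(re.search(r"m=\+(\d+\.\d+)", line).group(1))

# Difference of the two monotonic readings: roughly the exec overhead of
# the health check itself (~9.8 ms for this pair).
delta = mono(died) - mono(health)
print(f"{delta * 1000:.3f} ms")
```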
Nov 23 04:42:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16624 DF PROTO=TCP SPT=44832 DPT=9102 SEQ=715820687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB9A200000000001030307) Nov 23 04:42:49 localhost python3.9[281580]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 23 04:42:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35666 DF PROTO=TCP SPT=60284 DPT=9102 SEQ=4122196956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DB9EA00000000001030307) Nov 23 04:42:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
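The kernel `DROPPING:` entries above look like netfilter LOG-target output: a configured log prefix followed by space-separated `KEY=VALUE` fields, here recording dropped TCP SYNs from 192.168.122.10 to port 9102. A minimal parsing sketch over one such entry (abridged from the line above; the generic key=value split is an assumption about the format, not tied to any particular tooling):

```python
import re

# One of the kernel "DROPPING" entries above (abridged). Netfilter LOG-target
# lines are a fixed prefix followed by KEY=VALUE pairs; bare flags such as
# "DF" and "SYN" carry no value and are skipped by the regex below.
entry = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 "
         "MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 "
         "DST=192.168.122.107 LEN=60 TTL=62 ID=16624 DF PROTO=TCP "
         "SPT=44832 DPT=9102 SYN URGP=0")

def parse_nf_log(line: str) -> dict:
    """Extract the KEY=VALUE fields from a netfilter LOG message."""
    return dict(re.findall(r"(\w+)=(\S*)", line))

fields = parse_nf_log(entry)
print(fields["SRC"], "->", fields["DST"], "dport", fields["DPT"])
```

Note that `OUT=` parses to an empty string, which is how LOG marks locally-terminated (no egress interface) traffic in these entries.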
Nov 23 04:42:50 localhost podman[281690]: 2025-11-23 09:42:50.291675623 +0000 UTC m=+0.082401536 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118) Nov 23 04:42:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 
MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3801 DF PROTO=TCP SPT=49840 DPT=9102 SEQ=1940737006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DBA2210000000001030307) Nov 23 04:42:50 localhost podman[281690]: 2025-11-23 09:42:50.32436562 +0000 UTC m=+0.115091533 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Nov 23 04:42:50 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:42:50 localhost python3.9[281691]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None 
mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Nov 23 04:42:50 localhost systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 119.8 (399 of 333 items), suggesting rotation. Nov 23 04:42:50 localhost systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 23 04:42:50 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 04:42:50 localhost rsyslogd[760]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 04:42:51 localhost python3.9[281846]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 23 04:42:51 localhost systemd[1]: Stopping nova_compute container... Nov 23 04:42:51 localhost nova_compute[230600]: 2025-11-23 09:42:51.642 230604 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170#033[00m Nov 23 04:42:52 localhost nova_compute[230600]: 2025-11-23 09:42:52.624 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:52 localhost nova_compute[230600]: 2025-11-23 09:42:52.662 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:42:52 localhost nova_compute[230600]: 2025-11-23 09:42:52.662 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5039 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:42:52 localhost nova_compute[230600]: 2025-11-23 09:42:52.663 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:52 localhost nova_compute[230600]: 2025-11-23 09:42:52.664 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:42:52 localhost nova_compute[230600]: 2025-11-23 09:42:52.665 230604 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:42:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 
MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35667 DF PROTO=TCP SPT=60284 DPT=9102 SEQ=4122196956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DBAE600000000001030307) Nov 23 04:42:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:42:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:42:55 localhost podman[281862]: 2025-11-23 09:42:55.278119541 +0000 UTC m=+0.086285689 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 23 04:42:55 localhost podman[281862]: 2025-11-23 09:42:55.291334139 +0000 UTC m=+0.099500317 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 23 04:42:55 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:42:55 localhost systemd[1]: tmp-crun.4WIm1L.mount: Deactivated successfully. Nov 23 04:42:55 localhost podman[281863]: 2025-11-23 09:42:55.343210886 +0000 UTC m=+0.147604615 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 04:42:55 localhost podman[281863]: 2025-11-23 09:42:55.357351244 +0000 UTC m=+0.161744963 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:42:55 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 04:42:56 localhost nova_compute[230600]: 2025-11-23 09:42:56.347 230604 WARNING amqp [-] Received method (60, 30) during closing channel 1. 
This method will be ignored#033[00m Nov 23 04:42:56 localhost nova_compute[230600]: 2025-11-23 09:42:56.349 230604 DEBUG oslo_concurrency.lockutils [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:42:56 localhost nova_compute[230600]: 2025-11-23 09:42:56.350 230604 DEBUG oslo_concurrency.lockutils [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:42:56 localhost nova_compute[230600]: 2025-11-23 09:42:56.350 230604 DEBUG oslo_concurrency.lockutils [None req-90a04629-260b-4727-9520-1ada8d98ff24 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:42:56 localhost journal[203731]: End of file while reading data: Input/output error Nov 23 04:42:56 localhost systemd[1]: libpod-2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295.scope: Deactivated successfully. Nov 23 04:42:56 localhost systemd[1]: libpod-2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295.scope: Consumed 20.521s CPU time. 
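A few entries back, systemd-journald rotates the journal after reporting the field hash table at "a fill level at 119.8 (399 of 333 items)". A quick check (assumption: the reported figure is a plain items-over-capacity percentage, rounded to one decimal place):

```python
# journald's rotation message above reports "399 of 333 items" as a fill
# level of 119.8 - i.e. the field hash table holds more items than its
# nominal capacity, which is what triggers the rotation suggestion.
items, capacity = 399, 333
fill_level = round(100 * items / capacity, 1)
print(fill_level)  # matches the 119.8 reported in the log
```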
Nov 23 04:42:56 localhost podman[281850]: 2025-11-23 09:42:56.838290981 +0000 UTC m=+5.263057655 container died 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}) Nov 23 04:42:56 localhost systemd[1]: tmp-crun.yxR82I.mount: Deactivated successfully. Nov 23 04:42:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295-userdata-shm.mount: Deactivated successfully. 
Nov 23 04:42:56 localhost podman[281850]: 2025-11-23 09:42:56.995648824 +0000 UTC m=+5.420415448 container cleanup 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 23 04:42:56 localhost podman[281850]: nova_compute Nov 23 04:42:57 localhost podman[281935]: error opening file `/run/crun/2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295/status`: No such file or directory Nov 23 04:42:57 localhost podman[281922]: 2025-11-23 09:42:57.102317028 +0000 UTC m=+0.067078759 container cleanup 
2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute) Nov 23 04:42:57 localhost podman[281922]: nova_compute Nov 23 04:42:57 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Nov 23 04:42:57 localhost systemd[1]: Stopped nova_compute container. Nov 23 04:42:57 localhost systemd[1]: Starting nova_compute container... Nov 23 04:42:57 localhost systemd[1]: Started libcrun container. 
Nov 23 04:42:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Nov 23 04:42:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Nov 23 04:42:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 23 04:42:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 23 04:42:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58e025b5dcab7116f52b38d1c7989a99ff2d7d38180457b2af2c58b52580126b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 23 04:42:57 localhost podman[281937]: 2025-11-23 09:42:57.256918963 +0000 UTC m=+0.123121637 container init 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 
'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:42:57 localhost podman[281937]: 2025-11-23 09:42:57.264016799 +0000 UTC m=+0.130219463 container start 2368fa9994341f809481e89fbf864c92d903bf1e2d73a4834f85f72619bf2295 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:42:57 localhost podman[281937]: nova_compute Nov 23 04:42:57 localhost nova_compute[281952]: + sudo -E kolla_set_configs Nov 23 04:42:57 localhost systemd[1]: Started nova_compute container. Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Validating config file Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Copying service configuration files Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Deleting /etc/nova/nova.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /etc/nova/nova.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 
23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Deleting /etc/ceph Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Creating directory /etc/ceph Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /etc/ceph Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 23 
04:42:57 localhost nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Deleting /usr/sbin/iscsiadm Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Writing out command to execute Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 23 04:42:57 localhost nova_compute[281952]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 23 04:42:57 localhost nova_compute[281952]: ++ cat 
/run_command Nov 23 04:42:57 localhost nova_compute[281952]: + CMD=nova-compute Nov 23 04:42:57 localhost nova_compute[281952]: + ARGS= Nov 23 04:42:57 localhost nova_compute[281952]: + sudo kolla_copy_cacerts Nov 23 04:42:57 localhost nova_compute[281952]: + [[ ! -n '' ]] Nov 23 04:42:57 localhost nova_compute[281952]: + . kolla_extend_start Nov 23 04:42:57 localhost nova_compute[281952]: Running command: 'nova-compute' Nov 23 04:42:57 localhost nova_compute[281952]: + echo 'Running command: '\''nova-compute'\''' Nov 23 04:42:57 localhost nova_compute[281952]: + umask 0022 Nov 23 04:42:57 localhost nova_compute[281952]: + exec nova-compute Nov 23 04:42:58 localhost python3.9[282073]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None 
healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Nov 23 04:42:58 localhost systemd[1]: Started libpod-conmon-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428.scope. Nov 23 04:42:58 localhost systemd[1]: Started libcrun container. 
Nov 23 04:42:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Nov 23 04:42:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Nov 23 04:42:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 23 04:42:58 localhost podman[282101]: 2025-11-23 09:42:58.339834181 +0000 UTC m=+0.128157017 container init 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=nova_compute_init, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', 
'/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}) Nov 23 04:42:58 localhost podman[282101]: 2025-11-23 09:42:58.353835096 +0000 UTC m=+0.142157922 container start 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:42:58 localhost python3.9[282073]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Applying nova statedir ownership Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Nov 23 04:42:58 localhost nova_compute_init[282121]: 
INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d/ Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d already 42436:42436 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d to system_u:object_r:container_file_t:s0 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/355032bc-9946-4f6d-817c-2bfc8694d41d/console.log Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: 
/var/lib/nova/instances/_base/f8def1b80727f8e5cc38a877010a5f81bbb3086d Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-f8def1b80727f8e5cc38a877010a5f81bbb3086d Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Nov 23 04:42:58 
localhost nova_compute_init[282121]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/4143dbbec5b08621aa3c8eb364f8a7d3e97604e18b7ed41c4bab0da11ed561fd Nov 23 04:42:58 localhost nova_compute_init[282121]: INFO:nova_statedir:Nova statedir ownership complete Nov 23 04:42:58 localhost systemd[1]: libpod-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428.scope: Deactivated successfully. 
Nov 23 04:42:58 localhost podman[282122]: 2025-11-23 09:42:58.431749148 +0000 UTC m=+0.052242859 container died 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 23 04:42:58 localhost podman[282135]: 2025-11-23 09:42:58.498551106 +0000 UTC m=+0.063978600 container cleanup 72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 
/sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 04:42:58 localhost systemd[1]: libpod-conmon-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428.scope: Deactivated successfully. Nov 23 04:42:58 localhost systemd[1]: var-lib-containers-storage-overlay-ebae8217f4c299e37918070df4d4a8097501b071398f708c6cd605e08ee59209-merged.mount: Deactivated successfully. Nov 23 04:42:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-72aeb29e6bc441495455bb3bc01b405470b6f5787f50d11e280b06195eace428-userdata-shm.mount: Deactivated successfully. 
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.064 281956 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.065 281956 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.065 281956 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.065 281956 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.182 281956 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.204 281956 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.205 281956 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Nov 23 04:42:59 localhost systemd[1]: session-59.scope: Deactivated successfully. Nov 23 04:42:59 localhost systemd[1]: session-59.scope: Consumed 1min 28.823s CPU time. Nov 23 04:42:59 localhost systemd-logind[761]: Session 59 logged out. Waiting for processes to exit. Nov 23 04:42:59 localhost systemd-logind[761]: Removed session 59. 
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.632 281956 INFO nova.virt.driver [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.731 281956 INFO nova.compute.provider_config [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.737 281956 DEBUG oslo_concurrency.lockutils [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.737 281956 DEBUG oslo_concurrency.lockutils [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.738 281956 DEBUG oslo_concurrency.lockutils [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.738 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.738 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Nov 23 04:42:59 localhost 
nova_compute[281952]: 2025-11-23 09:42:59.738 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.738 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.739 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.739 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.739 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.739 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.739 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 
04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.740 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.740 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.740 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.740 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.740 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.741 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.741 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 
09:42:59.741 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.741 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.741 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.742 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] console_host = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.742 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.742 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.742 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.742 281956 DEBUG oslo_service.service [None 
req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.742 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.743 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.743 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.743 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.743 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] default_schedule_zone = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.744 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.744 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.744 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.744 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.744 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.744 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.745 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m 
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.745 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.745 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.745 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] host = np0005532585.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.745 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.746 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.746 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.746 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.746 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.746 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.747 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.747 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.747 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.747 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.747 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.748 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.748 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.748 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.748 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.748 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.749 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.749 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 
04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.749 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.749 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.749 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.750 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.750 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.750 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.750 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.750 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.751 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.751 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.751 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.751 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.751 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost 
nova_compute[281952]: 2025-11-23 09:42:59.751 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.752 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.752 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.752 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.752 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.752 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.753 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.753 281956 DEBUG oslo_service.service 
[None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.753 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.753 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.753 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] my_block_storage_ip = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.754 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] my_ip = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.754 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.754 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.754 281956 DEBUG oslo_service.service [None 
req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.754 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.755 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.755 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.755 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.755 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.755 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.755 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] pointer_model = 
usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.756 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.756 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.756 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.756 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.756 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.757 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.757 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rate_limit_interval = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.757 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.757 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.757 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.758 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.758 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.758 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.758 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 
09:42:59.758 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.758 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.759 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.759 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.759 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.759 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.760 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.760 281956 DEBUG oslo_service.service 
[None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.760 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.760 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.760 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.760 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.761 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.761 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.761 281956 DEBUG oslo_service.service [None 
req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.761 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.761 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.762 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.762 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.762 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.762 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.762 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] sync_power_state_interval = 600 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.763 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.763 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.763 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.763 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.763 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.764 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.764 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost 
nova_compute[281952]: 2025-11-23 09:42:59.764 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.764 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.764 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.764 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.765 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.765 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.765 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.765 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - 
- - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.765 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.766 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.766 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.766 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.766 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.766 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.767 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.767 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.767 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.767 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.767 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.768 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.768 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.768 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 
- - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.768 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.768 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.769 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.769 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.769 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.769 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.769 281956 DEBUG 
oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.770 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.770 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.770 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.770 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.770 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.771 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.771 281956 DEBUG oslo_service.service 
[None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.771 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.771 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.771 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.771 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.772 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.772 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.772 281956 DEBUG oslo_service.service 
[None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.772 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.773 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.773 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.773 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.773 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.773 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.774 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 
- - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.774 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.774 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.774 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.774 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.775 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.775 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.775 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.775 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.775 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.776 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.776 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.776 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.776 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.776 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.memcache_username = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.777 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.777 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.777 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.777 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.777 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.778 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.778 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 
04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.778 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.778 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.778 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.778 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.779 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.779 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.779 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.779 281956 DEBUG oslo_service.service 
[None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.779 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.780 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.780 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.780 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.780 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.780 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.781 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] 
cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.781 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.781 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.781 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.781 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.782 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.782 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.782 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.cpu_shared_set = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.782 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.782 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.782 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.783 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.783 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.783 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.783 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] 
compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.783 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.784 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.784 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.784 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.784 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.784 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.785 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] 
consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.785 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.785 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.785 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.785 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.786 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.786 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.786 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.insecure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.786 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.786 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.786 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.787 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.787 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.788 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.788 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 
localhost nova_compute[281952]: 2025-11-23 09:42:59.789 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.789 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.789 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.789 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.790 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.790 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.790 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.791 
281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.791 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.791 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.791 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.791 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service 
[None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.792 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - 
-] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.793 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.794 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.connection_parameters = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.794 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.794 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.794 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.794 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.794 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.794 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.max_overflow = 50 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.795 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.retry_interval = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.796 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.796 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.796 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.796 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.796 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.796 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.796 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.api_servers = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 
04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.797 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.798 
281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.798 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.799 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.799 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.799 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.799 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.799 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.799 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] 
glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.799 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] glance.version = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.800 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.801 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.801 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.801 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.801 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.801 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.instances_path_share = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.801 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.801 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.802 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.803 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.803 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.803 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.803 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.803 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.803 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.804 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.804 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.804 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.804 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.804 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] 
ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.804 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.804 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.connect_retries = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.805 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 
localhost nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.806 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.807 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.807 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.807 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.807 281956 
DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.807 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.807 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.807 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None 
req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.808 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.809 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.809 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.809 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - 
- - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.809 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.809 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.809 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.809 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.810 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.810 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.810 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.verify_ssl = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.810 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.810 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.810 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.810 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.insecure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.811 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.812 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.813 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost 
nova_compute[281952]: 2025-11-23 09:42:59.813 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.813 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.813 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.813 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.813 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.813 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None 
req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.814 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] 
keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.815 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.816 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.816 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.816 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.816 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.816 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.816 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.817 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.817 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.817 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.817 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.817 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.817 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.817 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.disk_prefix = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.818 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.819 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.819 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.820 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.820 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.821 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.821 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.821 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.inject_partition = -2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.822 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.822 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.822 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.823 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.823 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.823 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.823 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.823 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.824 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.824 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.824 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.824 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.824 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.824 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] 
libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.825 281956 WARNING oslo_config.cfg [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Nov 23 04:42:59 localhost nova_compute[281952]: live_migration_uri is deprecated for removal in favor of two other options that Nov 23 04:42:59 localhost nova_compute[281952]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Nov 23 04:42:59 localhost nova_compute[281952]: and ``live_migration_inbound_addr`` respectively. Nov 23 04:42:59 localhost nova_compute[281952]: ). Its value may be silently ignored in the future.#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.825 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.825 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.825 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.825 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.825 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.826 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.826 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.826 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.826 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.826 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.826 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.826 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.827 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.827 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.827 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.827 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.827 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.827 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.827 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rbd_secret_uuid = 46550e70-79cb-5f55-bf6d-1204b97e083b log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.828 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.828 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.828 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.828 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.828 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.828 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.828 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.829 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.829 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.829 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.829 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.829 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.829 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.829 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.830 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.830 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.830 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.830 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.830 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.830 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.uid_maps = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.830 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.831 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.831 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.831 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.831 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.831 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.831 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.832 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.832 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.832 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.832 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.832 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.832 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.832 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.auth_section = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.833 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.833 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.833 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.833 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.833 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.833 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.833 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 
23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 
09:42:59.834 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.835 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.835 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.835 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.835 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.835 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.835 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.835 281956 DEBUG oslo_service.service [None 
req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.836 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.836 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.836 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.836 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.836 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.836 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.836 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - 
- - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.837 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.837 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.837 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.837 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.837 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.837 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.837 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - 
-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.838 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.838 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.838 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.838 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.838 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.838 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.838 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.connect_retry_delay = 
None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.keyfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.839 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.840 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.840 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.840 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.840 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.840 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.840 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.project_name = service log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.840 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.841 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.841 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.841 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.841 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.841 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.841 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.system_scope = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.841 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.842 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.843 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.843 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.843 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.843 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.843 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 
2025-11-23 09:42:59.843 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.843 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.844 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.844 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.844 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.844 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.844 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.844 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 
- - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.845 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.845 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.845 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.845 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.845 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.845 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.845 281956 DEBUG oslo_service.service [None 
req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.846 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.846 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.846 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.846 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.846 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.846 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 
localhost nova_compute[281952]: 2025-11-23 09:42:59.846 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.847 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.847 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.847 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.847 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.847 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.847 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] 
filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.848 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.848 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.848 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.848 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.848 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.848 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 
23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.848 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.849 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.850 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.850 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.850 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.850 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.850 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] 
metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.850 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.851 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.851 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.851 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.851 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.851 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.851 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] 
service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.851 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.send_service_user_token = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.852 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.853 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.853 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.853 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.853 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.853 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.html5proxy_port = 6082 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.853 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.854 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.854 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.854 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.854 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.854 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.854 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] spice.zlib_compression = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.854 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.855 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.855 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.855 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.855 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.855 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.855 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.855 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.856 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.857 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.857 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.857 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.857 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.857 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.857 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.857 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.858 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.858 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.858 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.858 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.858 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.858 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.858 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.859 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.859 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.859 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.859 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.859 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.859 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.859 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.860 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.860 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.860 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.860 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.860 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.860 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.861 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.861 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.861 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.861 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.861 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.861 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.861 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.862 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.862 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.862 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.862 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.862 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.862 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.862 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.863 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.864 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.864 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.864 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.864 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.864 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.864 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.864 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.865 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.866 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.866 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.866 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.866 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.866 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.866 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.866 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.867 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.867 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.867 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.867 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.867 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.867 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.867 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.868 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.869 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.869 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.869 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.869 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.869 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.869 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.869 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.870 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.870 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.870 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.870 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.870 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.870 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.870 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.871 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.871 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.871 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.871 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.871 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.871 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.871 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.872 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.872 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.872 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.872 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.872 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.872 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.872 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.873 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.873 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.873 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.873 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.873 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.873 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.873 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.874 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.875 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.875 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.875 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.875 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.875 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.875 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.875 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.876 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.876 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.876 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.876 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.876 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.876 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.876 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 
2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.877 281956 DEBUG 
oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.878 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.878 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.878 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.878 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.878 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.878 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.878 281956 DEBUG oslo_service.service [None 
req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.879 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.879 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.879 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.879 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.879 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.879 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost 
nova_compute[281952]: 2025-11-23 09:42:59.879 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.880 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.881 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.881 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.881 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.881 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.881 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] 
os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.881 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.881 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.882 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.882 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.882 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.882 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.882 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] 
os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.882 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.882 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.883 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.883 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.883 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.883 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.883 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] privsep_osbrick.logger_name = 
os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.883 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.883 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.884 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.884 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.884 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.884 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.884 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] 
nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.884 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.884 281956 DEBUG oslo_service.service [None req-9c7993f6-8c7b-4e6a-8a39-96158e1b25a8 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.885 281956 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.898 281956 INFO nova.virt.node [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Determined node identity dae70d62-10f4-474c-9782-8c926a3641d5 from /var/lib/nova/compute_id#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.899 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.899 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.900 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Starting connection event dispatch thread initialize 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.900 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.910 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.913 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.914 281956 INFO nova.virt.libvirt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Connection event '1' reason 'None'#033[00m Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.919 281956 INFO nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Libvirt host capabilities Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: 43895caf-e6c2-47af-84a5-6194e901da5c Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: x86_64 Nov 23 04:42:59 localhost nova_compute[281952]: EPYC-Rome-v4 Nov 23 04:42:59 localhost nova_compute[281952]: AMD Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 
23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: tcp Nov 23 04:42:59 localhost nova_compute[281952]: rdma Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost 
nova_compute[281952]: 16116612 Nov 23 04:42:59 localhost nova_compute[281952]: 4029153 Nov 23 04:42:59 localhost nova_compute[281952]: 0 Nov 23 04:42:59 localhost nova_compute[281952]: 0 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: selinux Nov 23 04:42:59 localhost nova_compute[281952]: 0 Nov 23 04:42:59 localhost 
nova_compute[281952]: system_u:system_r:svirt_t:s0 Nov 23 04:42:59 localhost nova_compute[281952]: system_u:system_r:svirt_tcg_t:s0 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: dac Nov 23 04:42:59 localhost nova_compute[281952]: 0 Nov 23 04:42:59 localhost nova_compute[281952]: +107:+107 Nov 23 04:42:59 localhost nova_compute[281952]: +107:+107 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: hvm Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: 32 Nov 23 04:42:59 localhost nova_compute[281952]: /usr/libexec/qemu-kvm Nov 23 04:42:59 localhost nova_compute[281952]: pc-i440fx-rhel7.6.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel9.8.0 Nov 23 04:42:59 localhost nova_compute[281952]: q35 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel9.6.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel8.6.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel9.4.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel8.5.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel8.3.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel7.6.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel8.4.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel9.2.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel8.2.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel9.0.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel8.0.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel8.1.0 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost 
nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: hvm Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: 64 Nov 23 04:42:59 localhost nova_compute[281952]: /usr/libexec/qemu-kvm Nov 23 04:42:59 localhost nova_compute[281952]: pc-i440fx-rhel7.6.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel9.8.0 Nov 23 04:42:59 localhost nova_compute[281952]: q35 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel9.6.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel8.6.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel9.4.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel8.5.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel8.3.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel7.6.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel8.4.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel9.2.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel8.2.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel9.0.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel8.0.0 Nov 23 04:42:59 localhost nova_compute[281952]: pc-q35-rhel8.1.0 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.925 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 23 04:42:59 localhost nova_compute[281952]: 2025-11-23 09:42:59.928 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 23 04:42:59 localhost nova_compute[281952]: [domain capabilities (XML markup stripped by the log capture): path /usr/libexec/qemu-kvm, domain kvm, machine pc-q35-rhel9.8.0, arch i686; firmware value /usr/share/OVMF/OVMF_CODE.secboot.fd, loader types rom and pflash]
Nov 23 04:42:59 localhost nova_compute[281952]: [domain capabilities, continued (markup stripped; element names lost): flag values in order yes, no, no; two on/off value pairs; CPU model EPYC-Rome, vendor AMD]
Nov 23 04:42:59 localhost nova_compute[281952]: [domain capabilities, continued (markup stripped): supported CPU models 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, Haswell, Haswell-IBRS, Haswell-noTSX, Haswell-noTSX-IBRS, Haswell-v1, Haswell-v2, Haswell-v3, Haswell-v4, Icelake-Server, Icelake-Server-noTSX, Icelake-Server-v1, Icelake-Server-v2, Icelake-Server-v3, …]
localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Icelake-Server-v4 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Icelake-Server-v5 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 
04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Icelake-Server-v6 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 
localhost nova_compute[281952]: Icelake-Server-v7 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: IvyBridge Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: IvyBridge-IBRS Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: IvyBridge-v1 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost 
nova_compute[281952]: IvyBridge-v2 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: KnightsMill Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: KnightsMill-v1 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nehalem Nov 23 04:42:59 localhost nova_compute[281952]: Nehalem-IBRS Nov 23 04:42:59 localhost nova_compute[281952]: Nehalem-v1 Nov 23 04:42:59 localhost nova_compute[281952]: Nehalem-v2 Nov 23 04:42:59 localhost nova_compute[281952]: Opteron_G1 Nov 23 04:42:59 localhost nova_compute[281952]: Opteron_G1-v1 Nov 23 04:42:59 localhost nova_compute[281952]: Opteron_G2 Nov 23 04:42:59 localhost nova_compute[281952]: Opteron_G2-v1 Nov 23 04:42:59 localhost nova_compute[281952]: Opteron_G3 Nov 23 04:42:59 localhost nova_compute[281952]: Opteron_G3-v1 Nov 23 04:42:59 localhost 
nova_compute[281952]: Opteron_G4 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Opteron_G4-v1 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Opteron_G5 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Opteron_G5-v1 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Penryn Nov 23 04:42:59 localhost nova_compute[281952]: Penryn-v1 Nov 23 04:42:59 localhost nova_compute[281952]: SandyBridge Nov 23 04:42:59 localhost nova_compute[281952]: SandyBridge-IBRS Nov 23 04:42:59 localhost nova_compute[281952]: SandyBridge-v1 Nov 23 04:42:59 localhost nova_compute[281952]: SandyBridge-v2 Nov 23 04:42:59 localhost nova_compute[281952]: SapphireRapids Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 
04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: SapphireRapids-v1 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 
localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: SapphireRapids-v2 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost 
nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: SapphireRapids-v3 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost 
nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: SierraForest Nov 23 04:42:59 localhost 
nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: SierraForest-v1 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost 
nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Skylake-Client Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Skylake-Client-IBRS Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Skylake-Client-noTSX-IBRS Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Skylake-Client-v1 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost 
nova_compute[281952]: Skylake-Client-v2 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Skylake-Client-v3 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Skylake-Client-v4 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Skylake-Server Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Skylake-Server-IBRS Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 
localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Skylake-Server-noTSX-IBRS Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Skylake-Server-v1 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Skylake-Server-v2 Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost nova_compute[281952]: Nov 23 04:42:59 localhost 
Nov 23 04:42:59 localhost nova_compute[281952]: [libvirt domain capabilities dump; per-line syslog prefixes and stripped XML markup collapsed, payload tokens kept in original order]
Nov 23 04:42:59 localhost nova_compute[281952]:   CPU models (cont.): Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Nov 23 04:43:00 localhost nova_compute[281952]:   memory backing source types: file anonymous memfd
Nov 23 04:43:00 localhost nova_compute[281952]:   disk device types: disk cdrom floppy lun; buses: fdc scsi virtio usb sata; models: virtio virtio-transitional virtio-non-transitional
Nov 23 04:43:00 localhost nova_compute[281952]:   graphics types: vnc egl-headless dbus
Nov 23 04:43:00 localhost nova_compute[281952]:   hostdev mode: subsystem; startup policies: default mandatory requisite optional; subsystem types: usb pci scsi
Nov 23 04:43:00 localhost nova_compute[281952]:   rng models: virtio virtio-transitional virtio-non-transitional; backends: random egd builtin
Nov 23 04:43:00 localhost nova_compute[281952]:   filesystem: path handle virtiofs
Nov 23 04:43:00 localhost nova_compute[281952]:   tpm models: tpm-tis tpm-crb; backends: emulator external; version: 2.0
Nov 23 04:43:00 localhost nova_compute[281952]:   redirdev bus: usb; chardev types: pty unix
Nov 23 04:43:00 localhost nova_compute[281952]:   crypto backends: qemu builtin; interface backends: default passt
Nov 23 04:43:00 localhost nova_compute[281952]:   panic models: isa hyperv
Nov 23 04:43:00 localhost nova_compute[281952]:   console/channel types: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Nov 23 04:43:00 localhost nova_compute[281952]:   Hyper-V enlightenments: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input; values: 4095 on off off "Linux KVM Hv"
Nov 23 04:43:00 localhost nova_compute[281952]:   launch security: tdx
Nov 23 04:43:00 localhost nova_compute[281952]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:42:59.931 281956 DEBUG nova.virt.libvirt.volume.mount [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:42:59.936 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 23 04:43:00 localhost nova_compute[281952]: [second domain capabilities dump, prefixes collapsed as above]
Nov 23 04:43:00 localhost nova_compute[281952]:   emulator: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-i440fx-rhel7.6.0; arch: i686
Nov 23 04:43:00 localhost openstack_network_exporter[242668]: ERROR 09:42:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:43:00 localhost nova_compute[281952]:   loader: /usr/share/OVMF/OVMF_CODE.secboot.fd
Nov 23 04:43:00 localhost openstack_network_exporter[242668]: ERROR 09:42:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:43:00 localhost nova_compute[281952]:   loader types: rom pflash; readonly: yes no; secure: no; enum values: on off / on off
Nov 23 04:43:00 localhost openstack_network_exporter[242668]: ERROR 09:42:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 04:43:00 localhost nova_compute[281952]:   host CPU model: EPYC-Rome; vendor: AMD
Nov 23 04:43:00 localhost nova_compute[281952]:   CPU models: 486 486-v1 Broadwell Broadwell-IBRS
Nov 23 04:43:00 localhost openstack_network_exporter[242668]: ERROR 09:42:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 04:43:00 localhost openstack_network_exporter[242668]: ERROR 09:42:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 04:43:00 localhost nova_compute[281952]:   CPU models (cont.): Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server Cascadelake-Server-noTSX Cascadelake-Server-v1 Cascadelake-Server-v2 Cascadelake-Server-v3 Cascadelake-Server-v4 Cascadelake-Server-v5 Conroe Conroe-v1 Cooperlake Cooperlake-v1 Cooperlake-v2 Denverton Denverton-v1 Denverton-v2 Denverton-v3 Dhyana Dhyana-v1 Dhyana-v2 EPYC EPYC-Genoa EPYC-Genoa-v1 EPYC-IBPB EPYC-Milan EPYC-Milan-v1 EPYC-Milan-v2 EPYC-Rome EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-v1 EPYC-v2 EPYC-v3 EPYC-v4 GraniteRapids
04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: GraniteRapids-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 
localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: GraniteRapids-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: 
Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell-IBRS Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell-noTSX Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell-noTSX-IBRS Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell-v3 Nov 23 04:43:00 localhost nova_compute[281952]: 
Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell-v4 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Icelake-Server Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Icelake-Server-noTSX Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Icelake-Server-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Icelake-Server-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 
localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Icelake-Server-v3 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Icelake-Server-v4 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Icelake-Server-v5 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 
localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Icelake-Server-v6 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Icelake-Server-v7 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 
04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: IvyBridge Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: IvyBridge-IBRS Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: IvyBridge-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: IvyBridge-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: KnightsMill Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: KnightsMill-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nehalem Nov 23 04:43:00 localhost nova_compute[281952]: Nehalem-IBRS Nov 23 04:43:00 localhost nova_compute[281952]: Nehalem-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nehalem-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Opteron_G1 Nov 23 04:43:00 localhost nova_compute[281952]: Opteron_G1-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Opteron_G2 Nov 23 04:43:00 localhost nova_compute[281952]: Opteron_G2-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Opteron_G3 Nov 23 04:43:00 localhost nova_compute[281952]: Opteron_G3-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Opteron_G4 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Opteron_G4-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Opteron_G5 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Opteron_G5-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Penryn Nov 23 04:43:00 localhost nova_compute[281952]: Penryn-v1 Nov 23 04:43:00 localhost nova_compute[281952]: SandyBridge Nov 23 04:43:00 localhost nova_compute[281952]: SandyBridge-IBRS Nov 23 04:43:00 localhost nova_compute[281952]: SandyBridge-v1 Nov 23 04:43:00 localhost nova_compute[281952]: SandyBridge-v2 Nov 23 04:43:00 localhost nova_compute[281952]: SapphireRapids Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 
localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: SapphireRapids-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
Nov 23 04:43:00 localhost nova_compute[281952]: [libvirt domainCapabilities dump continues; XML markup lost in log capture, recovered values follow]
Nov 23 04:43:00 localhost nova_compute[281952]: custom CPU models (cont.): SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1
Nov 23 04:43:00 localhost nova_compute[281952]: custom CPU models (cont.): Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4
Nov 23 04:43:00 localhost nova_compute[281952]: custom CPU models (cont.): Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5
Nov 23 04:43:00 localhost nova_compute[281952]: custom CPU models (cont.): Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4
Nov 23 04:43:00 localhost nova_compute[281952]: custom CPU models (cont.): Westmere Westmere-IBRS Westmere-v1 Westmere-v2
Nov 23 04:43:00 localhost nova_compute[281952]: custom CPU models (cont.): athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1
Nov 23 04:43:00 localhost nova_compute[281952]: custom CPU models (cont.): pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Nov 23 04:43:00 localhost nova_compute[281952]: memory backing source types: file anonymous memfd
Nov 23 04:43:00 localhost nova_compute[281952]: disk devices: disk cdrom floppy lun; buses: ide fdc scsi virtio usb sata; models: virtio virtio-transitional virtio-non-transitional
Nov 23 04:43:00 localhost nova_compute[281952]: graphics types: vnc egl-headless dbus
Nov 23 04:43:00 localhost nova_compute[281952]: hostdev mode: subsystem; startupPolicy: default mandatory requisite optional; subsys types: usb pci scsi
Nov 23 04:43:00 localhost nova_compute[281952]: rng models: virtio virtio-transitional virtio-non-transitional; backend models: random egd builtin
Nov 23 04:43:00 localhost nova_compute[281952]: filesystem driver types: path handle virtiofs
Nov 23 04:43:00 localhost nova_compute[281952]: tpm models: tpm-tis tpm-crb; backends: emulator external; backend version: 2.0
Nov 23 04:43:00 localhost nova_compute[281952]: redirdev bus: usb; channel types: pty unix; crypto: qemu builtin
Nov 23 04:43:00 localhost nova_compute[281952]: interface backends: default passt; panic models: isa hyperv
Nov 23 04:43:00 localhost nova_compute[281952]: character device types: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Nov 23 04:43:00 localhost nova_compute[281952]: hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input; other values: 4095 on off off Linux KVM Hv tdx
Nov 23 04:43:00 localhost nova_compute[281952]: [end of capabilities dump] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:42:59.978 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:42:59.983 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 23 04:43:00 localhost nova_compute[281952]: [libvirt domainCapabilities for q35; XML markup lost in log capture, recovered values follow]
Nov 23 04:43:00 localhost nova_compute[281952]: path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-q35-rhel9.8.0; arch: x86_64
Nov 23 04:43:00 localhost nova_compute[281952]: os firmware: efi; loaders: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd /usr/share/edk2/ovmf/OVMF_CODE.fd /usr/share/edk2/ovmf/OVMF.amdsev.fd /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd; loader types: rom pflash; readonly: yes no; secure: yes no
Nov 23 04:43:00 localhost nova_compute[281952]: cpu host-passthrough migratable: on off; maximum migratable: on off; host-model: EPYC-Rome (vendor AMD)
Nov 23 04:43:00 localhost nova_compute[281952]: custom CPU models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4
Nov 23 04:43:00 localhost nova_compute[281952]: custom CPU models (cont.): Cascadelake-Server Cascadelake-Server-noTSX Cascadelake-Server-v1 Cascadelake-Server-v2 Cascadelake-Server-v3
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Cascadelake-Server-v4 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Cascadelake-Server-v5 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Conroe Nov 23 04:43:00 localhost nova_compute[281952]: Conroe-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Cooperlake Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Cooperlake-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Cooperlake-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Denverton Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Denverton-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Denverton-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Denverton-v3 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Dhyana Nov 23 04:43:00 localhost nova_compute[281952]: Dhyana-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Dhyana-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Genoa Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 
04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Genoa-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 
localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-IBPB Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Milan Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Milan-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Milan-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Rome Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Rome-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Rome-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Rome-v3 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Rome-v4 Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-v1 Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-v2 Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-v3 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-v4 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: GraniteRapids Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 
04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: GraniteRapids-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 
localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: GraniteRapids-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell-IBRS Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Haswell-noTSX Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell-noTSX-IBRS Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell-v3 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Haswell-v4 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Icelake-Server Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Icelake-Server-noTSX Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 
localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Icelake-Server-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Icelake-Server-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 
Nov 23 04:43:00 localhost nova_compute[281952]: [libvirt domain-capabilities XML reported by nova_compute; the XML markup was lost in this capture, leaving only element values. Recoverable values, deduplicated and grouped by their apparent domain-capabilities enum:]
Nov 23 04:43:00 localhost nova_compute[281952]: CPU models (custom mode, continued): Icelake-Server-v3, Icelake-Server-v4, Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Nov 23 04:43:00 localhost nova_compute[281952]: memory backing source types: file, anonymous, memfd
Nov 23 04:43:00 localhost nova_compute[281952]: disk device types: disk, cdrom, floppy, lun; disk buses: fdc, scsi, virtio, usb, sata; disk models: virtio, virtio-transitional, virtio-non-transitional
Nov 23 04:43:00 localhost nova_compute[281952]: graphics types: vnc, egl-headless, dbus
Nov 23 04:43:00 localhost nova_compute[281952]: hostdev: mode subsystem; startup policies: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi
Nov 23 04:43:00 localhost nova_compute[281952]: rng models: virtio, virtio-transitional, virtio-non-transitional; rng backend models: random, egd, builtin
Nov 23 04:43:00 localhost nova_compute[281952]: filesystem driver types: path, handle, virtiofs
Nov 23 04:43:00 localhost nova_compute[281952]: tpm models: tpm-tis, tpm-crb; tpm backend models: emulator, external; tpm backend version: 2.0
Nov 23 04:43:00 localhost nova_compute[281952]: redirdev bus: usb; channel types: pty, unix; crypto backends: qemu, builtin; interface backends: default, passt; panic models: isa, hyperv
Nov 23 04:43:00 localhost nova_compute[281952]: character device types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, 
nova_compute[281952]: unix Nov 23 04:43:00 localhost nova_compute[281952]: qemu-vdagent Nov 23 04:43:00 localhost nova_compute[281952]: dbus Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: relaxed Nov 23 04:43:00 localhost nova_compute[281952]: vapic Nov 23 04:43:00 localhost nova_compute[281952]: spinlocks Nov 23 04:43:00 localhost nova_compute[281952]: vpindex Nov 23 04:43:00 localhost nova_compute[281952]: runtime Nov 23 04:43:00 localhost nova_compute[281952]: synic Nov 23 04:43:00 localhost nova_compute[281952]: stimer Nov 23 04:43:00 localhost nova_compute[281952]: reset Nov 23 04:43:00 localhost nova_compute[281952]: vendor_id Nov 23 04:43:00 localhost nova_compute[281952]: frequencies Nov 23 04:43:00 localhost nova_compute[281952]: reenlightenment Nov 23 04:43:00 localhost nova_compute[281952]: tlbflush Nov 23 04:43:00 localhost nova_compute[281952]: ipi Nov 23 04:43:00 localhost nova_compute[281952]: avic Nov 23 04:43:00 localhost nova_compute[281952]: emsr_bitmap Nov 23 04:43:00 localhost nova_compute[281952]: xmm_input Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: 4095 Nov 23 04:43:00 localhost nova_compute[281952]: on Nov 23 04:43:00 localhost 
nova_compute[281952]: off Nov 23 04:43:00 localhost nova_compute[281952]: off Nov 23 04:43:00 localhost nova_compute[281952]: Linux KVM Hv Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: tdx Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.044 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: /usr/libexec/qemu-kvm Nov 23 04:43:00 localhost nova_compute[281952]: kvm Nov 23 04:43:00 localhost nova_compute[281952]: pc-i440fx-rhel7.6.0 Nov 23 04:43:00 localhost nova_compute[281952]: x86_64 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: /usr/share/OVMF/OVMF_CODE.secboot.fd Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: rom Nov 23 04:43:00 localhost nova_compute[281952]: pflash Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: yes Nov 23 04:43:00 localhost nova_compute[281952]: no Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 
localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: no Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: on Nov 23 04:43:00 localhost nova_compute[281952]: off Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: on Nov 23 04:43:00 localhost nova_compute[281952]: off Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Rome Nov 23 04:43:00 localhost nova_compute[281952]: AMD Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 
localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: 486 Nov 23 04:43:00 localhost nova_compute[281952]: 486-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Broadwell Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Broadwell-IBRS Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Broadwell-noTSX Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Broadwell-noTSX-IBRS Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Broadwell-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 
04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Broadwell-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Broadwell-v3 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Broadwell-v4 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Cascadelake-Server Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Cascadelake-Server-noTSX Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 
23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Cascadelake-Server-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Cascadelake-Server-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Cascadelake-Server-v3 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Cascadelake-Server-v4 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Cascadelake-Server-v5 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Conroe Nov 23 04:43:00 localhost nova_compute[281952]: Conroe-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Cooperlake Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Cooperlake-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 
localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Cooperlake-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Denverton Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Denverton-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Denverton-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Denverton-v3 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Dhyana Nov 23 04:43:00 localhost nova_compute[281952]: Dhyana-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Dhyana-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Genoa Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Genoa-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: 
Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-IBPB Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Milan Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Milan-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Milan-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Rome Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Rome-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Rome-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Rome-v3 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-Rome-v4 Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-v1 Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-v2 Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-v3 Nov 
23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: EPYC-v4 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: GraniteRapids Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 
Nov 23 04:43:00 localhost nova_compute[281952]: [libvirt capabilities XML; markup and repeated log prefixes lost in capture — recoverable CPU model names, in order:] GraniteRapids-v1 GraniteRapids-v2 Haswell Haswell-IBRS Haswell-noTSX Haswell-noTSX-IBRS Haswell-v1 Haswell-v2 Haswell-v3 Haswell-v4 Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5
Snowridge Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Snowridge-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Snowridge-v2 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Snowridge-v3 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Snowridge-v4 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Westmere Nov 23 04:43:00 localhost nova_compute[281952]: Westmere-IBRS Nov 23 04:43:00 localhost nova_compute[281952]: Westmere-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Westmere-v2 Nov 23 04:43:00 localhost nova_compute[281952]: athlon Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: athlon-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: core2duo Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: core2duo-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: coreduo Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: coreduo-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: kvm32 Nov 23 04:43:00 localhost nova_compute[281952]: kvm32-v1 Nov 23 04:43:00 localhost nova_compute[281952]: kvm64 Nov 23 04:43:00 localhost nova_compute[281952]: kvm64-v1 Nov 23 04:43:00 localhost nova_compute[281952]: n270 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: n270-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: pentium Nov 23 04:43:00 localhost nova_compute[281952]: pentium-v1 Nov 23 04:43:00 localhost nova_compute[281952]: pentium2 Nov 23 04:43:00 localhost nova_compute[281952]: pentium2-v1 Nov 23 04:43:00 localhost nova_compute[281952]: pentium3 Nov 23 04:43:00 localhost nova_compute[281952]: pentium3-v1 Nov 23 04:43:00 localhost nova_compute[281952]: phenom Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: phenom-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: qemu32 Nov 23 04:43:00 localhost nova_compute[281952]: qemu32-v1 Nov 23 04:43:00 localhost nova_compute[281952]: qemu64 Nov 23 04:43:00 localhost nova_compute[281952]: qemu64-v1 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: file Nov 23 04:43:00 localhost nova_compute[281952]: anonymous Nov 23 04:43:00 localhost nova_compute[281952]: memfd Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: disk Nov 23 04:43:00 localhost nova_compute[281952]: cdrom Nov 23 04:43:00 localhost nova_compute[281952]: floppy Nov 23 04:43:00 localhost nova_compute[281952]: lun Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: ide Nov 23 04:43:00 localhost nova_compute[281952]: fdc Nov 23 04:43:00 localhost nova_compute[281952]: scsi Nov 23 04:43:00 localhost nova_compute[281952]: virtio Nov 23 04:43:00 localhost nova_compute[281952]: usb Nov 23 04:43:00 localhost nova_compute[281952]: sata Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: virtio Nov 23 04:43:00 localhost nova_compute[281952]: virtio-transitional Nov 23 04:43:00 localhost nova_compute[281952]: virtio-non-transitional Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: vnc Nov 23 04:43:00 localhost nova_compute[281952]: egl-headless Nov 23 04:43:00 localhost nova_compute[281952]: dbus Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: subsystem Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: default Nov 23 04:43:00 localhost nova_compute[281952]: mandatory Nov 23 04:43:00 localhost nova_compute[281952]: requisite Nov 23 04:43:00 localhost nova_compute[281952]: optional Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: usb Nov 23 04:43:00 localhost nova_compute[281952]: pci Nov 23 04:43:00 localhost nova_compute[281952]: scsi Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: virtio Nov 23 04:43:00 localhost nova_compute[281952]: virtio-transitional Nov 23 04:43:00 localhost nova_compute[281952]: virtio-non-transitional Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: random Nov 23 04:43:00 localhost nova_compute[281952]: egd Nov 23 04:43:00 localhost nova_compute[281952]: builtin Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: path Nov 23 04:43:00 localhost nova_compute[281952]: handle Nov 23 04:43:00 localhost nova_compute[281952]: virtiofs Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: tpm-tis Nov 23 04:43:00 localhost nova_compute[281952]: tpm-crb Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: emulator Nov 23 04:43:00 localhost nova_compute[281952]: external Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: 2.0 Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: usb Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: pty Nov 23 04:43:00 localhost nova_compute[281952]: unix Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: qemu Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: builtin Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: default Nov 23 04:43:00 localhost nova_compute[281952]: passt Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: isa Nov 23 04:43:00 localhost nova_compute[281952]: hyperv Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost 
nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: null Nov 23 04:43:00 localhost nova_compute[281952]: vc Nov 23 04:43:00 localhost nova_compute[281952]: pty Nov 23 04:43:00 localhost nova_compute[281952]: dev Nov 23 04:43:00 localhost nova_compute[281952]: file Nov 23 04:43:00 localhost nova_compute[281952]: pipe Nov 23 04:43:00 localhost nova_compute[281952]: stdio Nov 23 04:43:00 localhost nova_compute[281952]: udp Nov 23 04:43:00 localhost nova_compute[281952]: tcp Nov 23 04:43:00 localhost nova_compute[281952]: unix Nov 23 04:43:00 localhost nova_compute[281952]: qemu-vdagent Nov 23 04:43:00 localhost nova_compute[281952]: dbus Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: relaxed Nov 23 04:43:00 localhost nova_compute[281952]: vapic Nov 23 04:43:00 localhost nova_compute[281952]: spinlocks Nov 23 04:43:00 localhost nova_compute[281952]: vpindex Nov 23 04:43:00 localhost nova_compute[281952]: runtime Nov 23 04:43:00 localhost nova_compute[281952]: synic Nov 23 04:43:00 localhost nova_compute[281952]: stimer Nov 23 04:43:00 localhost nova_compute[281952]: reset Nov 23 04:43:00 localhost nova_compute[281952]: vendor_id Nov 23 04:43:00 
localhost nova_compute[281952]: frequencies Nov 23 04:43:00 localhost nova_compute[281952]: reenlightenment Nov 23 04:43:00 localhost nova_compute[281952]: tlbflush Nov 23 04:43:00 localhost nova_compute[281952]: ipi Nov 23 04:43:00 localhost nova_compute[281952]: avic Nov 23 04:43:00 localhost nova_compute[281952]: emsr_bitmap Nov 23 04:43:00 localhost nova_compute[281952]: xmm_input Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: 4095 Nov 23 04:43:00 localhost nova_compute[281952]: on Nov 23 04:43:00 localhost nova_compute[281952]: off Nov 23 04:43:00 localhost nova_compute[281952]: off Nov 23 04:43:00 localhost nova_compute[281952]: Linux KVM Hv Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: tdx Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: Nov 23 04:43:00 localhost nova_compute[281952]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.100 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.100 281956 INFO nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Secure Boot support detected#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.103 281956 INFO nova.virt.libvirt.driver [None 
req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.113 281956 DEBUG nova.virt.libvirt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.165 281956 INFO nova.virt.node [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Determined node identity dae70d62-10f4-474c-9782-8c926a3641d5 from /var/lib/nova/compute_id#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.185 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Verified node dae70d62-10f4-474c-9782-8c926a3641d5 matches my host np0005532585.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.221 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.226 281956 DEBUG nova.virt.libvirt.vif [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T08:25:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005532585.localdomain',hostname='test',id=2,image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-23T08:25:43Z,launched_on='np0005532585.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005532585.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='1915d3e5d4254231a0517e2dcf35848f',ramdisk_id='',reservation_id='r-i8g0t7xr',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-11-23T08:25:43Z,user_data=None,user_id='7e40ee99e6034be7be796ae12095c154',uuid=355032bc-9946-4f6d-817c-2bfc8694d41d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": 
false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.226 281956 DEBUG nova.network.os_vif_util [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Converting VIF {"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 
09:43:00.227 281956 DEBUG nova.network.os_vif_util [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.227 281956 DEBUG os_vif [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.256 281956 DEBUG ovsdbapp.backend.ovs_idl [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.257 281956 DEBUG ovsdbapp.backend.ovs_idl [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.257 281956 DEBUG ovsdbapp.backend.ovs_idl [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 23 04:43:00 localhost 
nova_compute[281952]: 2025-11-23 09:43:00.257 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.257 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.257 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.258 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.259 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.262 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.273 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.273 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.273 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.274 281956 INFO oslo.privsep.daemon [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpkwoipk_5/privsep.sock']#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.909 281956 INFO oslo.privsep.daemon [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.793 282209 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.796 282209 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.798 282209 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Nov 23 04:43:00 localhost nova_compute[281952]: 2025-11-23 09:43:00.798 282209 INFO oslo.privsep.daemon [-] privsep daemon running as pid 282209#033[00m Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.173 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.174 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3912d14-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.175 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3912d14-a3, col_values=(('external_ids', {'iface-id': 'd3912d14-a3e0-4df9-b811-f3bd90f44559', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:aa:3b', 'vm-uuid': '355032bc-9946-4f6d-817c-2bfc8694d41d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.176 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.177 281956 INFO os_vif [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3')#033[00m Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.178 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:43:01 localhost 
nova_compute[281952]: 2025-11-23 09:43:01.182 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.182 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.332 281956 DEBUG oslo_concurrency.lockutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.332 281956 DEBUG oslo_concurrency.lockutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.333 281956 DEBUG oslo_concurrency.lockutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.333 281956 DEBUG nova.compute.resource_tracker [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) 
update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.334 281956 DEBUG oslo_concurrency.processutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:43:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35668 DF PROTO=TCP SPT=60284 DPT=9102 SEQ=4122196956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DBCE210000000001030307) Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.794 281956 DEBUG oslo_concurrency.processutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.854 281956 DEBUG nova.virt.libvirt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:43:01 localhost nova_compute[281952]: 2025-11-23 09:43:01.855 281956 DEBUG nova.virt.libvirt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.054 281956 WARNING nova.virt.libvirt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - 
- -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.055 281956 DEBUG nova.compute.resource_tracker [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12139MB free_disk=41.837242126464844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.056 281956 DEBUG oslo_concurrency.lockutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.056 281956 DEBUG oslo_concurrency.lockutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.211 281956 DEBUG nova.compute.resource_tracker [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.212 281956 DEBUG nova.compute.resource_tracker [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.212 281956 DEBUG nova.compute.resource_tracker [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.262 281956 DEBUG nova.scheduler.client.report [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.283 281956 DEBUG nova.scheduler.client.report [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 23 
04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.284 281956 DEBUG nova.compute.provider_tree [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.299 281956 DEBUG nova.scheduler.client.report [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.326 281956 DEBUG nova.scheduler.client.report [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: 
COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_SSSE3,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.367 281956 DEBUG oslo_concurrency.processutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.696 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.815 281956 DEBUG oslo_concurrency.processutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.821 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Nov 23 04:43:02 localhost nova_compute[281952]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.821 281956 INFO nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] kernel doesn't support AMD SEV#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.823 281956 DEBUG nova.compute.provider_tree [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.823 281956 DEBUG nova.virt.libvirt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.846 281956 DEBUG nova.scheduler.client.report [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 
'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.873 281956 DEBUG nova.compute.resource_tracker [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.874 281956 DEBUG oslo_concurrency.lockutils [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.874 281956 DEBUG nova.service [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.916 281956 DEBUG nova.service [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Nov 23 04:43:02 localhost nova_compute[281952]: 2025-11-23 09:43:02.917 281956 DEBUG nova.servicegroup.drivers.db [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] DB_Driver: join new ServiceGroup member np0005532585.localdomain to the compute group, service = join 
/usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Nov 23 04:43:05 localhost nova_compute[281952]: 2025-11-23 09:43:05.260 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:07 localhost nova_compute[281952]: 2025-11-23 09:43:07.697 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:43:09 localhost podman[282257]: 2025-11-23 09:43:09.023694281 +0000 UTC m=+0.078549543 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:43:09 localhost podman[282257]: 2025-11-23 09:43:09.059395074 +0000 UTC m=+0.114250336 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:43:09 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:43:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:09.278 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:43:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:09.279 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:43:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:09.280 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:43:10 localhost nova_compute[281952]: 2025-11-23 09:43:10.263 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.805 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': 
'8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.806 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.846 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1347736452 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.846 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 205057051 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4361a4ac-d1f6-4808-ab2b-4597871f5210', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1347736452, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.806740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2d2f1c6-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': 'a269661f0ee2858a54e0ecbf8ebc0b6d90876bb1274e52f94f0dc3a9b6774648'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 205057051, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.806740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2d306a2-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': '4a000ddad1e31ce03a61bda0127cbe512825519da9ec936d7fd3ba08a4dbfd13'}]}, 'timestamp': '2025-11-23 09:43:10.847203', '_unique_id': '93806cc4d323418da332008aae43b952'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.849 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.850 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.863 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.864 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ce40652-eb4e-4642-a9ed-696d8f1a6da0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.850752', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2d5a6b4-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.028390642, 'message_signature': 'c01688a41477b1e96ff18ebe34cfa5835994b204ec1b291b52d9ffe7ad9b8162'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': 
'2025-11-23T09:43:10.850752', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2d5b802-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.028390642, 'message_signature': 'be1bf89ce744fa444230ea5a61209d1541c77989c16a95979e45cc4ec73df90f'}]}, 'timestamp': '2025-11-23 09:43:10.864796', '_unique_id': '4d7fc9067855439e8a34e8b8f428c882'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.865 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.867 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.867 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.867 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 4300800 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5323a5f8-32da-45b9-aea8-92cb3bc66b7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.867198', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2d6272e-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': 'cdab8db7192a6e45e1c4ba9b6a17d83bb4aa195272a798eca95fbefeff365a5a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': 
'1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.867198', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2d6375a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': 'b6f3c09d9025af838fa4cc1729c28f2c48b61ac5ed9049177a04dc67065296eb'}]}, 'timestamp': '2025-11-23 09:43:10.868092', '_unique_id': '2fb610ffd94040339e486f7e02fefc17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.869 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.870 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.892 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '272dce7d-8c2e-4708-8e41-1af8e80374e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:43:10.870277', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'd2d9f156-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.069545347, 'message_signature': '494d1bb3299b32c549bc1b7410190ceda1000d5969e7bffce4397d35c2cf4612'}]}, 'timestamp': '2025-11-23 09:43:10.892489', '_unique_id': '867b7e2eeefe4a9997295558410dea44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.893 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.894 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.894 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90a23bf2-ddac-4254-abbe-2f22e4d1a6af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.894807', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2dad60c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': '313a0f7138823e7fd065b1224341842ee42ca6543e2f1384f4461aa89ed1643f'}]}, 'timestamp': '2025-11-23 09:43:10.898361', '_unique_id': 'da0e8d81d13d40e7833f15ab6c17c145'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.902 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.903 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44a9fec5-3d73-4e83-b7c7-bec1174bcc7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.903743', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2dbbeb4-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': '7d7f2cdcce7f17e31be264720b0b22afa331205aecbe8628d1da339502acd227'}]}, 'timestamp': '2025-11-23 09:43:10.904377', '_unique_id': 'bc37623302a54bfdbdccac1f76ddb6ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.905 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.906 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 73900032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.907 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20595a1c-f0ec-4137-895d-102d7e4086b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73900032, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.906699', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2dc2f34-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': '291a74733e966047c4bae18d39f4dd21c92d92872bd022bc04cb6933fb0dcbb9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.906699', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2dc3f42-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': 'ef66d200807c30836e79e3eca407f6147c535ce19285bdce44eed4ace0b467e8'}]}, 'timestamp': '2025-11-23 09:43:10.907572', '_unique_id': 'e3786650428f4b7cb972fb06608b98f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]:
2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.908 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f50885d6-8ae2-4ec9-97a6-61893f31a241', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.909740', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2dca61c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': '3496f07dc720f06479f3c3d06dd6fe9501190f49447482adc6954ccf8e848624'}]}, 'timestamp': '2025-11-23 09:43:10.910237', '_unique_id': '398163b1296546d88f0abc3c79f60500'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12
ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.911 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'eeeaf314-8038-485a-968a-51e26589aabf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.912314', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2dd092c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': '16b5e4eb609294ac5c3c39f440ffaed9dcd395d7c883215ad23a8cac993635f3'}]}, 'timestamp': '2025-11-23 09:43:10.912770', '_unique_id': 'c54b75cf934648da9490e1e363d8048c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:43:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.913 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.914 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97cdaf7a-631d-4de3-940f-d033a0f16b3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.915028', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2dd7344-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': '76de5e55d3b3427913a93fc019c6bd46be519df415c92aeb1f00f81aa83565cf'}]}, 'timestamp': '2025-11-23 09:43:10.915485', '_unique_id': '40eca1e0810248c0984cbba9d6480ba9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.916 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.917 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '237cb085-3816-4101-b36b-706ec51dfde9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.917603', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2ddd7ee-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': 'd012e57d5372642e75ec7578658c74393de784723bc671270d29504af0fe0e90'}]}, 'timestamp': '2025-11-23 09:43:10.918093', '_unique_id': '5262ff0f656b42bea01b0105444073eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.918 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.920 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.920 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 9228 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca8633a6-99c2-4a2c-a176-a5ba6dd83f36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9228, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.920416', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2de45c6-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': 'c4eb61756bc0e4fda8ad358fa7eac580a38f8e97befb0e00df8b12b789d6f051'}]}, 'timestamp': '2025-11-23 09:43:10.920875', '_unique_id': 'fe3794814c7247b38340bd8cad77a393'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.921 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7751415b-e7af-4579-85cd-ef43e85c8de6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.922982', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2dea9f8-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': '3101fcb5f752857e882f2d31eb6f2f18e9acd20e6bbd42dd8cc70115486ab9d3'}]}, 'timestamp': '2025-11-23 09:43:10.923438', '_unique_id': 'aafeda26c2ef4e8fae29711450922fbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.924 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.926 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bc00876-a49a-4221-aec8-1606a1d5891d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.925679', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2df144c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.028390642, 'message_signature': '92a3f9a0a2533f8f43d3e597fa72f12aab4a7ec68b7bab28857149f2f23798da'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.925679', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2df2464-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.028390642, 'message_signature': '1dbe35117817771416e3219350573833d95ca127e9c9a306bf8df05b15009f8b'}]}, 'timestamp': '2025-11-23 09:43:10.926548', '_unique_id': 'b3d3e100ca7941f18b944b3115c0053d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.927 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.928 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e81be2e8-871a-44d0-a2e4-47e41b763b49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.928689', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2df8a58-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': 'b97fe498954d841c1f993b8fb20c9f0e1d18f34e293b93f02cb60851a351dac0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.928689', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2df9a66-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': 'e7d14cd4daa47b2abe0f5476cb8bffad0888ec6f54a42b1819bf40603ccb71dd'}]}, 'timestamp': '2025-11-23 09:43:10.929566', '_unique_id': '3095d66a4e51402b883fbf8f0d722d8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.932 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '669abfae-1a43-4114-bcbe-aa7bfdd9cedc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.931660', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2dffcb8-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': '2220bd0d26ccb22ad510ef1a94f8d67e9b7e341488af15fe9428de7b9394d18c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 
'timestamp': '2025-11-23T09:43:10.931660', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2e00e10-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': 'd943988a2479e0e01eb96ff0a8d10cf838ad0d62bf09ec02dbf7b424da3b5a67'}]}, 'timestamp': '2025-11-23 09:43:10.932524', '_unique_id': '22093e80f134447aaff8984f3c86b0aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.933 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 165450591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.935 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 35057587 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f65cc3b-bf46-44db-8edd-577eadee27e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165450591, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.934617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2e0703a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': '2b841e6bba26f330c6360cb810b3c85ef55d99d08dacc21f363d9a39c927683b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 35057587, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': 
'1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.934617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2e0814c-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10933.984370925, 'message_signature': '0adc945062d8d5ed83099f1fb10c5af702046d78ed8f97672772b7ec40a7a42f'}]}, 'timestamp': '2025-11-23 09:43:10.935476', '_unique_id': '3c4d4e67581644a68d42d8aca6bc0448'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.936 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.937 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.937 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4fb7379-5f59-4ab1-ac1c-01d96d862f79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.937582', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2e0e100-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': 'f1fae702f9f8258a5c5463b89bddec1adc9ba8b816bcfe74064a84c7631ac91b'}]}, 'timestamp': '2025-11-23 09:43:10.937869', '_unique_id': 'c8254b63173a497d8d0913201c07264e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.938 12 ERROR oslo_messaging.notify.messaging 
Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.939 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.939 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3d7e32f-a1cb-42e5-b896-a81a66713352', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:43:10.939312', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64',
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'd2e1248a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.072435978, 'message_signature': '41d0cf7b07a3a96b0a5f47f98fc7ea611da666956314e033dd7093e2267f5e6e'}]}, 'timestamp': '2025-11-23 09:43:10.939599', '_unique_id': 'af4bef20eb644841ae5f6c5f652029ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.940 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 56830000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '49dafec8-6a67-4859-aee0-235dfcdcc8b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 56830000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:43:10.941060', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'd2e16a3a-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.069545347, 'message_signature': '80ef20397d813304505d806fefe6f7598282fbf459f8c598e72fc273c6e9d78c'}]}, 'timestamp': '2025-11-23 09:43:10.941376', '_unique_id': 'a86bceec592541d9b03d3aad745ca634'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 
04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.941 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.942 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle 
poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.942 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.942 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0462217f-29ee-4ef2-88b8-668963483d1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:43:10.942787', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2e1acf2-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.028390642, 'message_signature': '0342ff63d73a69c3e8201d32315202878cfabe7d974ab68cf706350b3f24211a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:43:10.942787', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd2e1b6fc-c850-11f0-bde4-fa163e72a351', 'monotonic_time': 10934.028390642, 'message_signature': 'ca8359ca883818e3440fc68c4dd49f36ac9ec4e4d427dfbc09c5fd093824ac8a'}]}, 'timestamp': '2025-11-23 09:43:10.943330', '_unique_id': '3f4aa4aff7c144cf93817fe09bd28c12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:43:10.943 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:43:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:43:10.943 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:43:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:43:10.943 12 ERROR oslo_messaging.notify.messaging Nov 23 04:43:11 localhost podman[240668]: time="2025-11-23T09:43:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:43:11 localhost podman[240668]: @ - - [23/Nov/2025:09:43:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 23 04:43:11 localhost podman[240668]: @ - - [23/Nov/2025:09:43:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17225 "" "Go-http-client/1.1" Nov 23 04:43:12 localhost nova_compute[281952]: 2025-11-23 09:43:12.700 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:43:14 localhost systemd[1]: tmp-crun.ZcvpVX.mount: Deactivated successfully. Nov 23 04:43:14 localhost podman[282275]: 2025-11-23 09:43:14.013460124 +0000 UTC m=+0.067216544 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:43:14 localhost podman[282275]: 2025-11-23 09:43:14.021550421 +0000 UTC m=+0.075306871 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 
'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:43:14 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:43:15 localhost nova_compute[281952]: 2025-11-23 09:43:15.265 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61215 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=891157253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC07BB0000000001030307) Nov 23 04:43:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61216 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=891157253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC0BE00000000001030307) Nov 23 04:43:17 localhost nova_compute[281952]: 2025-11-23 09:43:17.702 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35669 DF PROTO=TCP SPT=60284 DPT=9102 SEQ=4122196956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC0E200000000001030307) Nov 23 04:43:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:43:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:43:19 localhost podman[282298]: 2025-11-23 09:43:19.0263548 +0000 UTC m=+0.081836617 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true) Nov 23 04:43:19 localhost podman[282299]: 2025-11-23 09:43:19.060989249 +0000 UTC m=+0.118596414 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 04:43:19 localhost podman[282298]: 2025-11-23 09:43:19.06447521 +0000 UTC m=+0.119956997 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 04:43:19 localhost podman[282299]: 2025-11-23 
09:43:19.075002583 +0000 UTC m=+0.132609848 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc.) Nov 23 04:43:19 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:43:19 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:43:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61217 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=891157253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC13E00000000001030307) Nov 23 04:43:20 localhost nova_compute[281952]: 2025-11-23 09:43:20.267 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16625 DF PROTO=TCP SPT=44832 DPT=9102 SEQ=715820687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC18200000000001030307) Nov 23 04:43:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:43:21 localhost nova_compute[281952]: 2025-11-23 09:43:21.019 281956 DEBUG nova.compute.manager [None req-3fb1259b-0712-46f4-b2a6-c039d9b9743a 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:43:21 localhost nova_compute[281952]: 2025-11-23 09:43:21.025 281956 INFO nova.compute.manager [None req-3fb1259b-0712-46f4-b2a6-c039d9b9743a 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Retrieving diagnostics#033[00m Nov 23 04:43:21 localhost systemd[1]: tmp-crun.0nbALV.mount: Deactivated successfully. Nov 23 04:43:21 localhost podman[282342]: 2025-11-23 09:43:21.039504513 +0000 UTC m=+0.096502153 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 04:43:21 localhost podman[282342]: 2025-11-23 09:43:21.045217135 +0000 UTC m=+0.102214775 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 23 04:43:21 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:43:22 localhost nova_compute[281952]: 2025-11-23 09:43:22.705 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61218 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=891157253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC23A00000000001030307) Nov 23 04:43:25 localhost nova_compute[281952]: 2025-11-23 09:43:25.269 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:43:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:43:25 localhost systemd[1]: tmp-crun.p0IRj8.mount: Deactivated successfully. 
Nov 23 04:43:25 localhost podman[282396]: 2025-11-23 09:43:25.527170475 +0000 UTC m=+0.094022504 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 23 04:43:25 localhost podman[282397]: 2025-11-23 09:43:25.572959588 +0000 UTC m=+0.137732771 container health_status 
a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 04:43:25 localhost podman[282397]: 2025-11-23 09:43:25.585269398 +0000 UTC m=+0.150042661 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 04:43:25 localhost podman[282396]: 2025-11-23 09:43:25.592758046 +0000 UTC m=+0.159610125 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 23 04:43:25 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 04:43:25 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:43:26 localhost nova_compute[281952]: 2025-11-23 09:43:26.950 281956 DEBUG oslo_concurrency.lockutils [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:43:26 localhost nova_compute[281952]: 2025-11-23 09:43:26.950 281956 DEBUG oslo_concurrency.lockutils [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" acquired by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:43:26 localhost nova_compute[281952]: 2025-11-23 09:43:26.951 281956 DEBUG nova.compute.manager [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:43:26 localhost nova_compute[281952]: 2025-11-23 09:43:26.958 281956 DEBUG nova.compute.manager [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m Nov 23 04:43:26 localhost nova_compute[281952]: 2025-11-23 09:43:26.963 281956 DEBUG nova.objects.instance [None req-110bc03d-370d-4353-b9af-ededf4a3795c 
7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'flavor' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:43:27 localhost nova_compute[281952]: 2025-11-23 09:43:27.010 281956 DEBUG nova.virt.libvirt.driver [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m Nov 23 04:43:27 localhost podman[282547]: Nov 23 04:43:27 localhost podman[282547]: 2025-11-23 09:43:27.094063579 +0000 UTC m=+0.082529200 container create f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bell, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, ceph=True, RELEASE=main, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux ) Nov 23 04:43:27 
localhost systemd[1]: Started libpod-conmon-f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b.scope. Nov 23 04:43:27 localhost systemd[1]: Started libcrun container. Nov 23 04:43:27 localhost podman[282547]: 2025-11-23 09:43:27.060789703 +0000 UTC m=+0.049255354 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:43:27 localhost podman[282547]: 2025-11-23 09:43:27.165317359 +0000 UTC m=+0.153782970 container init f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bell, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, architecture=x86_64, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, distribution-scope=public, release=553, ceph=True, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:43:27 localhost podman[282547]: 2025-11-23 09:43:27.173569241 +0000 UTC m=+0.162034862 container start f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bell, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, architecture=x86_64, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=553, GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc.) Nov 23 04:43:27 localhost podman[282547]: 2025-11-23 09:43:27.173806178 +0000 UTC m=+0.162271829 container attach f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bell, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, GIT_BRANCH=main, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, 
distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 04:43:27 localhost interesting_bell[282563]: 167 167 Nov 23 04:43:27 localhost systemd[1]: libpod-f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b.scope: Deactivated successfully. Nov 23 04:43:27 localhost podman[282547]: 2025-11-23 09:43:27.178830088 +0000 UTC m=+0.167295699 container died f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bell, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, version=7, release=553, GIT_CLEAN=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55) Nov 23 04:43:27 localhost podman[282568]: 2025-11-23 09:43:27.279965817 +0000 UTC m=+0.089483120 container remove f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_bell, vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph 
ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.openshift.expose-services=, RELEASE=main, name=rhceph, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, architecture=x86_64, vcs-type=git) Nov 23 04:43:27 localhost systemd[1]: libpod-conmon-f621b37985675285fe708b80cadefbab1d9c8a64fcdc17909d1360230605f18b.scope: Deactivated successfully. 
Nov 23 04:43:27 localhost podman[282588]: Nov 23 04:43:27 localhost podman[282588]: 2025-11-23 09:43:27.476879175 +0000 UTC m=+0.070956343 container create c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_lovelace, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, name=rhceph, distribution-scope=public, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , architecture=x86_64) Nov 23 04:43:27 localhost systemd[1]: Started libpod-conmon-c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5.scope. Nov 23 04:43:27 localhost podman[282588]: 2025-11-23 09:43:27.440760679 +0000 UTC m=+0.034837907 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:43:27 localhost systemd[1]: Started libcrun container. 
Nov 23 04:43:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/076e663a7bea42517dcd1c9453df2377380d6649bb581eb4e9f2e0361f10b820/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 23 04:43:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/076e663a7bea42517dcd1c9453df2377380d6649bb581eb4e9f2e0361f10b820/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 23 04:43:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/076e663a7bea42517dcd1c9453df2377380d6649bb581eb4e9f2e0361f10b820/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 04:43:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/076e663a7bea42517dcd1c9453df2377380d6649bb581eb4e9f2e0361f10b820/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 23 04:43:27 localhost podman[282588]: 2025-11-23 09:43:27.557606395 +0000 UTC m=+0.151683533 container init c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_lovelace, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, 
name=rhceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 23 04:43:27 localhost podman[282588]: 2025-11-23 09:43:27.566592461 +0000 UTC m=+0.160669589 container start c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_lovelace, name=rhceph, build-date=2025-09-24T08:57:55, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.33.12, version=7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux , release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph) Nov 23 04:43:27 localhost podman[282588]: 2025-11-23 09:43:27.566718445 +0000 UTC m=+0.160795573 container attach c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_lovelace, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vcs-type=git, version=7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, name=rhceph, GIT_CLEAN=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 23 04:43:27 localhost nova_compute[281952]: 2025-11-23 09:43:27.745 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:28 localhost systemd[1]: tmp-crun.wL4Dfp.mount: Deactivated successfully. Nov 23 04:43:28 localhost systemd[1]: var-lib-containers-storage-overlay-824838b0f865a8cf79c7f6d113ee63e6ec310f4f437e24dd14bd36cff738817e-merged.mount: Deactivated successfully. 
Nov 23 04:43:28 localhost reverent_lovelace[282603]: [ Nov 23 04:43:28 localhost reverent_lovelace[282603]: { Nov 23 04:43:28 localhost reverent_lovelace[282603]: "available": false, Nov 23 04:43:28 localhost reverent_lovelace[282603]: "ceph_device": false, Nov 23 04:43:28 localhost reverent_lovelace[282603]: "device_id": "QEMU_DVD-ROM_QM00001", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "lsm_data": {}, Nov 23 04:43:28 localhost reverent_lovelace[282603]: "lvs": [], Nov 23 04:43:28 localhost reverent_lovelace[282603]: "path": "/dev/sr0", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "rejected_reasons": [ Nov 23 04:43:28 localhost reverent_lovelace[282603]: "Has a FileSystem", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "Insufficient space (<5GB)" Nov 23 04:43:28 localhost reverent_lovelace[282603]: ], Nov 23 04:43:28 localhost reverent_lovelace[282603]: "sys_api": { Nov 23 04:43:28 localhost reverent_lovelace[282603]: "actuators": null, Nov 23 04:43:28 localhost reverent_lovelace[282603]: "device_nodes": "sr0", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "human_readable_size": "482.00 KB", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "id_bus": "ata", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "model": "QEMU DVD-ROM", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "nr_requests": "2", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "partitions": {}, Nov 23 04:43:28 localhost reverent_lovelace[282603]: "path": "/dev/sr0", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "removable": "1", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "rev": "2.5+", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "ro": "0", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "rotational": "1", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "sas_address": "", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "sas_device_handle": "", Nov 23 04:43:28 localhost reverent_lovelace[282603]: 
"scheduler_mode": "mq-deadline", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "sectors": 0, Nov 23 04:43:28 localhost reverent_lovelace[282603]: "sectorsize": "2048", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "size": 493568.0, Nov 23 04:43:28 localhost reverent_lovelace[282603]: "support_discard": "0", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "type": "disk", Nov 23 04:43:28 localhost reverent_lovelace[282603]: "vendor": "QEMU" Nov 23 04:43:28 localhost reverent_lovelace[282603]: } Nov 23 04:43:28 localhost reverent_lovelace[282603]: } Nov 23 04:43:28 localhost reverent_lovelace[282603]: ] Nov 23 04:43:28 localhost systemd[1]: libpod-c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5.scope: Deactivated successfully. Nov 23 04:43:28 localhost systemd[1]: libpod-c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5.scope: Consumed 1.111s CPU time. Nov 23 04:43:28 localhost podman[282588]: 2025-11-23 09:43:28.638953714 +0000 UTC m=+1.233030882 container died c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_lovelace, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.buildah.version=1.33.12, name=rhceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, release=553, io.openshift.tags=rhceph ceph) Nov 23 04:43:28 localhost systemd[1]: tmp-crun.rtO6lp.mount: Deactivated successfully. Nov 23 04:43:28 localhost systemd[1]: var-lib-containers-storage-overlay-076e663a7bea42517dcd1c9453df2377380d6649bb581eb4e9f2e0361f10b820-merged.mount: Deactivated successfully. Nov 23 04:43:28 localhost podman[284596]: 2025-11-23 09:43:28.726258534 +0000 UTC m=+0.076785787 container remove c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_lovelace, io.buildah.version=1.33.12, name=rhceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, RELEASE=main, release=553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, ceph=True, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container) Nov 23 04:43:28 localhost systemd[1]: libpod-conmon-c23ecde4a00036f511e75ae6e02ab341fe029f814fe1658f9f07ddb703afadb5.scope: Deactivated successfully. 
Nov 23 04:43:29 localhost kernel: device tapd3912d14-a3 left promiscuous mode Nov 23 04:43:29 localhost NetworkManager[5975]: [1763891009.5415] device (tapd3912d14-a3): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Nov 23 04:43:29 localhost nova_compute[281952]: 2025-11-23 09:43:29.555 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:29 localhost ovn_controller[154788]: 2025-11-23T09:43:29Z|00049|binding|INFO|Releasing lport d3912d14-a3e0-4df9-b811-f3bd90f44559 from this chassis (sb_readonly=0) Nov 23 04:43:29 localhost ovn_controller[154788]: 2025-11-23T09:43:29Z|00050|binding|INFO|Setting lport d3912d14-a3e0-4df9-b811-f3bd90f44559 down in Southbound Nov 23 04:43:29 localhost ovn_controller[154788]: 2025-11-23T09:43:29Z|00051|binding|INFO|Removing iface tapd3912d14-a3 ovn-installed in OVS Nov 23 04:43:29 localhost nova_compute[281952]: 2025-11-23 09:43:29.559 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:29 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully. Nov 23 04:43:29 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 3min 58.825s CPU time. Nov 23 04:43:29 localhost systemd-machined[84275]: Machine qemu-1-instance-00000002 terminated. 
Nov 23 04:43:29 localhost nova_compute[281952]: 2025-11-23 09:43:29.570 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:29 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:29.572 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:aa:3b 192.168.0.77'], port_security=['fa:16:3e:cf:aa:3b 192.168.0.77'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.77/24', 'neutron:device_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005532585.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '1915d3e5d4254231a0517e2dcf35848f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '4fe931b0-155c-49e2-b5a5-44d74fa72e9e 6afcee2e-50ee-4b3c-9d1f-24ea7a5a850b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70ca41f3-3e94-4959-b1b5-1e81bd2c9bc1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=d3912d14-a3e0-4df9-b811-f3bd90f44559) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:43:29 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:29.573 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d3912d14-a3e0-4df9-b811-f3bd90f44559 in 
datapath bcac49fc-c589-475a-91a8-00a0ba9c2b33 unbound from our chassis#033[00m Nov 23 04:43:29 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:29.574 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bcac49fc-c589-475a-91a8-00a0ba9c2b33, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 04:43:29 localhost ovn_controller[154788]: 2025-11-23T09:43:29Z|00052|ovn_bfd|INFO|Disabled BFD on interface ovn-c237ed-0 Nov 23 04:43:29 localhost ovn_controller[154788]: 2025-11-23T09:43:29Z|00053|ovn_bfd|INFO|Disabled BFD on interface ovn-49b8a0-0 Nov 23 04:43:29 localhost ovn_controller[154788]: 2025-11-23T09:43:29Z|00054|ovn_bfd|INFO|Disabled BFD on interface ovn-b7d5b3-0 Nov 23 04:43:29 localhost nova_compute[281952]: 2025-11-23 09:43:29.578 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:29 localhost ovn_controller[154788]: 2025-11-23T09:43:29Z|00055|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:43:29 localhost nova_compute[281952]: 2025-11-23 09:43:29.584 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:29 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:29.587 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cfa74992-c96b-4a6b-a9d0-f2713d799b6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:43:29 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:29.588 160439 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33 namespace which is not needed anymore#033[00m Nov 23 04:43:29 localhost nova_compute[281952]: 2025-11-23 09:43:29.616 281956 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:29 localhost ovn_controller[154788]: 2025-11-23T09:43:29Z|00056|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:43:29 localhost nova_compute[281952]: 2025-11-23 09:43:29.624 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:29 localhost systemd[1]: libpod-6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799.scope: Deactivated successfully. Nov 23 04:43:29 localhost podman[284660]: 2025-11-23 09:43:29.756156139 +0000 UTC m=+0.061684268 container died 6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 23 04:43:29 localhost NetworkManager[5975]: [1763891009.7709] manager: (tapd3912d14-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/15) Nov 23 04:43:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799-userdata-shm.mount: Deactivated successfully. Nov 23 04:43:29 localhost systemd[1]: var-lib-containers-storage-overlay-086324888e3aef5fa52615eb760fec66e6f8cc0743a0416438a6f968d544d4e3-merged.mount: Deactivated successfully. Nov 23 04:43:29 localhost podman[284660]: 2025-11-23 09:43:29.895769959 +0000 UTC m=+0.201297978 container cleanup 6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 23 04:43:29 localhost podman[284674]: 2025-11-23 09:43:29.90837378 +0000 UTC m=+0.140128827 container cleanup 6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 23 04:43:29 localhost systemd[1]: 
libpod-conmon-6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799.scope: Deactivated successfully. Nov 23 04:43:29 localhost nova_compute[281952]: 2025-11-23 09:43:29.919 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:43:29 localhost nova_compute[281952]: 2025-11-23 09:43:29.946 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Triggering sync for uuid 355032bc-9946-4f6d-817c-2bfc8694d41d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Nov 23 04:43:29 localhost nova_compute[281952]: 2025-11-23 09:43:29.947 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:43:29 localhost nova_compute[281952]: 2025-11-23 09:43:29.948 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:43:29 localhost podman[284702]: 2025-11-23 09:43:29.984372131 +0000 UTC m=+0.058138956 container remove 6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, tcib_managed=true, 
io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Nov 23 04:43:29 localhost openstack_network_exporter[242668]: ERROR 09:43:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:43:29 localhost openstack_network_exporter[242668]: ERROR 09:43:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:43:29 localhost openstack_network_exporter[242668]: ERROR 09:43:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:43:29 localhost openstack_network_exporter[242668]: ERROR 09:43:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:43:29 localhost openstack_network_exporter[242668]: Nov 23 04:43:29 localhost openstack_network_exporter[242668]: ERROR 09:43:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an 
existing datapath Nov 23 04:43:29 localhost openstack_network_exporter[242668]: Nov 23 04:43:29 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:29.993 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e176f012-1286-4a0e-80f6-5a1e970efbe1]: (4, ('Sun Nov 23 09:43:29 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33 (6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799)\n6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799\nSun Nov 23 09:43:29 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33 (6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799)\n6baea6c360ebb97c21fb4dc7b3d3e2d86de3db19ea6ea9f305323ad11fa29799\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:43:29 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:29.995 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[594b1121-f757-4c13-ae31-1ce59239cc7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:43:29 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:29.996 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbcac49fc-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:43:29 localhost nova_compute[281952]: 2025-11-23 09:43:29.998 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:29 localhost kernel: device tapbcac49fc-c0 left promiscuous mode Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.011 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:30 localhost ovn_metadata_agent[160434]: 2025-11-23 
09:43:30.015 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[faf44cef-00e9-46d1-82dd-49279abfae56]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:43:30 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:30.030 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[36c0fc65-c876-41c4-93fa-f1ce7c339b94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:43:30 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:30.031 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[47f6f33d-f049-4c6d-b2e9-4ab181c93a42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.031 281956 INFO nova.virt.libvirt.driver [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Instance shutdown successfully after 3 seconds.#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.039 281956 INFO nova.virt.libvirt.driver [-] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Instance destroyed successfully.#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.039 281956 DEBUG nova.objects.instance [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'numa_topology' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:43:30 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:30.044 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c2c49a33-6f16-454b-acfc-708faa0fb3cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], 
['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 
0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 628638, 'reachable_time': 28165, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 
'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284722, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.054 281956 DEBUG nova.compute.manager [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:43:30 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:30.058 160573 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Nov 23 04:43:30 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:30.058 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d37ffa-cdeb-432d-8530-503ee71aea89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.149 281956 DEBUG oslo_concurrency.lockutils [None req-110bc03d-370d-4353-b9af-ededf4a3795c 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" "released" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: held 3.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.151 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock 
"355032bc-9946-4f6d-817c-2bfc8694d41d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.151 281956 INFO nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] During sync_power_state the instance has a pending task (powering-off). Skip.#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.152 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.226 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:30 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:30.228 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:43:30 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:30.229 160439 DEBUG 
neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.246 281956 DEBUG nova.compute.manager [req-c32259b2-46f4-4814-9e6e-11bccd2fb90a req-94523dc9-1def-4c49-88a9-8cb2dc708b9e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received event network-vif-unplugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.247 281956 DEBUG oslo_concurrency.lockutils [req-c32259b2-46f4-4814-9e6e-11bccd2fb90a req-94523dc9-1def-4c49-88a9-8cb2dc708b9e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.247 281956 DEBUG oslo_concurrency.lockutils [req-c32259b2-46f4-4814-9e6e-11bccd2fb90a req-94523dc9-1def-4c49-88a9-8cb2dc708b9e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.248 281956 DEBUG oslo_concurrency.lockutils [req-c32259b2-46f4-4814-9e6e-11bccd2fb90a req-94523dc9-1def-4c49-88a9-8cb2dc708b9e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock 
"355032bc-9946-4f6d-817c-2bfc8694d41d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.249 281956 DEBUG nova.compute.manager [req-c32259b2-46f4-4814-9e6e-11bccd2fb90a req-94523dc9-1def-4c49-88a9-8cb2dc708b9e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] No waiting events found dispatching network-vif-unplugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.249 281956 WARNING nova.compute.manager [req-c32259b2-46f4-4814-9e6e-11bccd2fb90a req-94523dc9-1def-4c49-88a9-8cb2dc708b9e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received unexpected event network-vif-unplugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 for instance with vm_state stopped and task_state None.#033[00m Nov 23 04:43:30 localhost nova_compute[281952]: 2025-11-23 09:43:30.271 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:30 localhost systemd[1]: run-netns-ovnmeta\x2dbcac49fc\x2dc589\x2d475a\x2d91a8\x2d00a0ba9c2b33.mount: Deactivated successfully. 
Nov 23 04:43:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61219 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=891157253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC44200000000001030307) Nov 23 04:43:32 localhost nova_compute[281952]: 2025-11-23 09:43:32.307 281956 DEBUG nova.compute.manager [req-7aff7542-404f-4928-8cd0-9bcf2dfeb3bd req-fe2912aa-58e6-41d4-9fc0-b91b2f1d0ac7 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received event network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 23 04:43:32 localhost nova_compute[281952]: 2025-11-23 09:43:32.308 281956 DEBUG oslo_concurrency.lockutils [req-7aff7542-404f-4928-8cd0-9bcf2dfeb3bd req-fe2912aa-58e6-41d4-9fc0-b91b2f1d0ac7 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:43:32 localhost nova_compute[281952]: 2025-11-23 09:43:32.309 281956 DEBUG oslo_concurrency.lockutils [req-7aff7542-404f-4928-8cd0-9bcf2dfeb3bd req-fe2912aa-58e6-41d4-9fc0-b91b2f1d0ac7 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:43:32 localhost nova_compute[281952]: 2025-11-23 09:43:32.309 281956 DEBUG oslo_concurrency.lockutils [req-7aff7542-404f-4928-8cd0-9bcf2dfeb3bd 
req-fe2912aa-58e6-41d4-9fc0-b91b2f1d0ac7 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:43:32 localhost nova_compute[281952]: 2025-11-23 09:43:32.310 281956 DEBUG nova.compute.manager [req-7aff7542-404f-4928-8cd0-9bcf2dfeb3bd req-fe2912aa-58e6-41d4-9fc0-b91b2f1d0ac7 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] No waiting events found dispatching network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 23 04:43:32 localhost nova_compute[281952]: 2025-11-23 09:43:32.311 281956 WARNING nova.compute.manager [req-7aff7542-404f-4928-8cd0-9bcf2dfeb3bd req-fe2912aa-58e6-41d4-9fc0-b91b2f1d0ac7 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received unexpected event network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 for instance with vm_state stopped and task_state None.#033[00m Nov 23 04:43:32 localhost nova_compute[281952]: 2025-11-23 09:43:32.792 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.164 281956 DEBUG nova.compute.manager [None req-d184e85c-1645-4f7d-8584-cc69d8ebefd8 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 
09:43:33.190 281956 ERROR oslo_messaging.rpc.server [None req-d184e85c-1645-4f7d-8584-cc69d8ebefd8 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 355032bc-9946-4f6d-817c-2bfc8694d41d in power state shutdown. Cannot get_diagnostics while the instance is in this state. Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Nov 23 04:43:33 localhost 
nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server self.force_reraise() Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server raise self.value Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server self.force_reraise() Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server raise 
self.value Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance 355032bc-9946-4f6d-817c-2bfc8694d41d in power state shutdown. Cannot get_diagnostics while the instance is in this state. 
Nov 23 04:43:33 localhost nova_compute[281952]: 2025-11-23 09:43:33.190 281956 ERROR oslo_messaging.rpc.server #033[00m Nov 23 04:43:33 localhost ovn_metadata_agent[160434]: 2025-11-23 09:43:33.231 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:43:35 localhost nova_compute[281952]: 2025-11-23 09:43:35.273 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:37 localhost nova_compute[281952]: 2025-11-23 09:43:37.830 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:38 localhost sshd[284725]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:43:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:43:39 localhost systemd[1]: tmp-crun.tRgasO.mount: Deactivated successfully. 
Nov 23 04:43:39 localhost podman[284727]: 2025-11-23 09:43:39.324158978 +0000 UTC m=+0.100223221 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:43:39 localhost podman[284727]: 2025-11-23 09:43:39.334460715 +0000 UTC m=+0.110524948 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 04:43:39 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 04:43:40 localhost nova_compute[281952]: 2025-11-23 09:43:40.275 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:41 localhost podman[240668]: time="2025-11-23T09:43:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:43:41 localhost podman[240668]: @ - - [23/Nov/2025:09:43:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146336 "" "Go-http-client/1.1" Nov 23 04:43:41 localhost podman[240668]: @ - - [23/Nov/2025:09:43:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16751 "" "Go-http-client/1.1" Nov 23 04:43:42 localhost nova_compute[281952]: 2025-11-23 09:43:42.875 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:44 localhost nova_compute[281952]: 2025-11-23 09:43:44.794 281956 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 23 04:43:44 localhost nova_compute[281952]: 2025-11-23 09:43:44.795 281956 INFO nova.compute.manager [-] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] VM Stopped (Lifecycle Event)#033[00m Nov 23 04:43:44 localhost nova_compute[281952]: 2025-11-23 09:43:44.823 281956 DEBUG nova.compute.manager [None req-f4f1784e-c02a-443d-89f6-786bfa75742a - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:43:44 localhost nova_compute[281952]: 2025-11-23 09:43:44.827 281956 DEBUG nova.compute.manager [None req-f4f1784e-c02a-443d-89f6-786bfa75742a - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Synchronizing instance power state after lifecycle event "Stopped"; 
current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 23 04:43:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:43:45 localhost podman[284748]: 2025-11-23 09:43:45.031651992 +0000 UTC m=+0.086176726 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:43:45 localhost podman[284748]: 2025-11-23 09:43:45.041246626 +0000 UTC m=+0.095771340 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:43:45 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:43:45 localhost nova_compute[281952]: 2025-11-23 09:43:45.277 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23202 DF PROTO=TCP SPT=54030 DPT=9102 SEQ=1871829459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC7CEB0000000001030307) Nov 23 04:43:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23203 DF PROTO=TCP SPT=54030 DPT=9102 SEQ=1871829459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC80E00000000001030307) Nov 23 04:43:47 localhost nova_compute[281952]: 2025-11-23 09:43:47.907 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61220 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=891157253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2DC84210000000001030307) Nov 23 04:43:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23204 DF PROTO=TCP SPT=54030 DPT=9102 SEQ=1871829459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC88E00000000001030307) Nov 23 04:43:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:43:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:43:50 localhost systemd[1]: tmp-crun.LYLlw0.mount: Deactivated successfully. Nov 23 04:43:50 localhost podman[284771]: 2025-11-23 09:43:50.046364655 +0000 UTC m=+0.101909583 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 04:43:50 localhost podman[284771]: 2025-11-23 09:43:50.123423411 +0000 UTC m=+0.178968419 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 04:43:50 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:43:50 localhost podman[284772]: 2025-11-23 09:43:50.149221209 +0000 UTC m=+0.201858985 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 04:43:50 localhost podman[284772]: 2025-11-23 09:43:50.165132613 +0000 UTC m=+0.217770379 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.component=ubi9-minimal-container, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 23 04:43:50 localhost systemd[1]: 
ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:43:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35670 DF PROTO=TCP SPT=60284 DPT=9102 SEQ=4122196956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC8C200000000001030307) Nov 23 04:43:50 localhost nova_compute[281952]: 2025-11-23 09:43:50.279 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:51 localhost systemd[1]: tmp-crun.Ga6aAJ.mount: Deactivated successfully. Nov 23 04:43:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:43:52 localhost podman[284817]: 2025-11-23 09:43:52.024706493 +0000 UTC m=+0.078380548 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:43:52 localhost podman[284817]: 2025-11-23 09:43:52.058292439 +0000 UTC m=+0.111966554 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:43:52 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:43:52 localhost nova_compute[281952]: 2025-11-23 09:43:52.944 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.083 281956 DEBUG nova.compute.manager [None req-404a0de6-47df-4c5b-8b06-aa6d86925514 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server [None req-404a0de6-47df-4c5b-8b06-aa6d86925514 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 355032bc-9946-4f6d-817c-2bfc8694d41d in power state shutdown. Cannot get_diagnostics while the instance is in this state. 
Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server self.force_reraise() Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR 
oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server raise self.value Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server self.force_reraise() Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server raise self.value Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Nov 23 
04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance 355032bc-9946-4f6d-817c-2bfc8694d41d in power state shutdown. Cannot get_diagnostics while the instance is in this state. Nov 23 04:43:53 localhost nova_compute[281952]: 2025-11-23 09:43:53.107 281956 ERROR oslo_messaging.rpc.server #033[00m Nov 23 04:43:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23205 DF PROTO=TCP SPT=54030 DPT=9102 SEQ=1871829459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DC98A00000000001030307) Nov 23 04:43:55 localhost nova_compute[281952]: 2025-11-23 09:43:55.280 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:43:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:43:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:43:56 localhost podman[284835]: 2025-11-23 09:43:56.037568411 +0000 UTC m=+0.081827957 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 04:43:56 localhost podman[284835]: 2025-11-23 09:43:56.081135263 +0000 UTC m=+0.125394809 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:43:56 localhost systemd[1]: tmp-crun.f3qqTu.mount: Deactivated successfully. Nov 23 04:43:56 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:43:56 localhost podman[284834]: 2025-11-23 09:43:56.100679903 +0000 UTC m=+0.146853810 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 04:43:56 localhost podman[284834]: 2025-11-23 09:43:56.111885819 +0000 UTC m=+0.158059736 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 04:43:56 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 04:43:57 localhost nova_compute[281952]: 2025-11-23 09:43:57.988 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:43:59 localhost nova_compute[281952]: 2025-11-23 09:43:59.253 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:43:59 localhost nova_compute[281952]: 2025-11-23 09:43:59.253 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:43:59 localhost nova_compute[281952]: 2025-11-23 09:43:59.254 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 04:43:59 localhost nova_compute[281952]: 2025-11-23 09:43:59.254 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 04:43:59 localhost ovn_controller[154788]: 2025-11-23T09:43:59Z|00057|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Nov 23 04:43:59 localhost openstack_network_exporter[242668]: ERROR 09:43:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:43:59 localhost openstack_network_exporter[242668]: ERROR 09:43:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 04:43:59 localhost openstack_network_exporter[242668]: ERROR 09:43:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:43:59 localhost openstack_network_exporter[242668]: ERROR 09:43:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 04:43:59 localhost openstack_network_exporter[242668]:
Nov 23 04:43:59 localhost openstack_network_exporter[242668]: ERROR 09:43:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 04:43:59 localhost openstack_network_exporter[242668]:
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.065 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.066 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.066 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.066 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.138 281956 DEBUG nova.objects.instance [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'flavor' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.176 281956 DEBUG oslo_concurrency.lockutils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.283 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.914 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.944 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.944 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.945 281956 DEBUG oslo_concurrency.lockutils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.945 281956 DEBUG nova.network.neutron [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.945 281956 DEBUG nova.objects.instance [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.948 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.948 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.949 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.949 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.950 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.951 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.951 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.951 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.980 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.981 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.981 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.981 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 04:44:00 localhost nova_compute[281952]: 2025-11-23 09:44:00.982 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.415 281956 DEBUG nova.network.neutron [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.432 281956 DEBUG oslo_concurrency.lockutils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.457 281956 INFO nova.virt.libvirt.driver [-] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Instance destroyed successfully.#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.458 281956 DEBUG nova.objects.instance [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'numa_topology' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 04:44:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23206 DF PROTO=TCP SPT=54030 DPT=9102 SEQ=1871829459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DCB8200000000001030307)
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.466 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.472 281956 DEBUG nova.objects.instance [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'resources' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.498 281956 DEBUG nova.virt.libvirt.vif [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T08:25:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005532585.localdomain',hostname='test',id=2,image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-23T08:25:43Z,launched_on='np0005532585.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005532585.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='1915d3e5d4254231a0517e2dcf35848f',ramdisk_id='',reservation_id='r-i8g0t7xr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-11-23T09:43:30Z,user_data=None,user_id='7e40ee99e6034be7be796ae12095c154',uuid=355032bc-9946-4f6d-817c-2bfc8694d41d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.499 281956 DEBUG nova.network.os_vif_util [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Converting VIF {"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.499 281956 DEBUG nova.network.os_vif_util [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.500 281956 DEBUG os_vif [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.502 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.502 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3912d14-a3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.551 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.554 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.556 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.556 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.556 281956 INFO os_vif [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3')#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.559 281956 DEBUG nova.virt.libvirt.host [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.559 281956 INFO nova.virt.libvirt.host [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] UEFI support detected#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.565 281956 DEBUG nova.virt.libvirt.driver [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Start _get_guest_xml network_info=[{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=be9a09b1-b916-4d06-9bcd-d8b8afdf9284,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'image_id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}], 'ephemerals': [{'encryption_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vdb', 'size': 1, 'guest_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.568 281956 WARNING nova.virt.libvirt.driver [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.571 281956 DEBUG nova.virt.libvirt.host [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Searching host: 'np0005532585.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.571 281956 DEBUG nova.virt.libvirt.host [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.573 281956 DEBUG nova.virt.libvirt.host [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Searching host: 'np0005532585.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.573 281956 DEBUG nova.virt.libvirt.host [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.574 281956 DEBUG nova.virt.libvirt.driver [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.574 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-23T08:24:36Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='8c32de12-b44b-4285-8afc-2a1d7f236d32',id=2,is_public=True,memory_mb=512,name='m1.small',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=be9a09b1-b916-4d06-9bcd-d8b8afdf9284,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.574 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.575 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.575 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.575 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.575 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.576 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.576 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.576 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.576 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.577 281956 DEBUG nova.virt.hardware [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.577 281956 DEBUG nova.objects.instance [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'vcpu_model' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.592 281956 DEBUG nova.privsep.utils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.592 281956 DEBUG oslo_concurrency.processutils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.761 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.763 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12558MB free_disk=41.83708190917969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": 
"7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.763 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.763 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.837 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.838 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.838 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:44:01 localhost nova_compute[281952]: 2025-11-23 09:44:01.876 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.056 281956 DEBUG oslo_concurrency.processutils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.059 281956 DEBUG oslo_concurrency.processutils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf 
/etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.337 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.343 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.357 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.498 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.499 281956 DEBUG 
oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.736s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.511 281956 DEBUG oslo_concurrency.processutils [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.513 281956 DEBUG nova.virt.libvirt.vif [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T08:25:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005532585.localdomain',hostname='test',id=2,image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-23T08:25:43Z,launched_on='np0005532585.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005532585.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='1915d3e5d4254231a0517e2dcf35848f',ramdis
k_id='',reservation_id='r-i8g0t7xr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-11-23T09:43:30Z,user_data=None,user_id='7e40ee99e6034be7be796ae12095c154',uuid=355032bc-9946-4f6d-817c-2bfc8694d41d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, 
"vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.513 281956 DEBUG nova.network.os_vif_util [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Converting VIF {"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.515 281956 DEBUG nova.network.os_vif_util [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.517 281956 DEBUG nova.objects.instance [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Lazy-loading 'pci_devices' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.540 281956 DEBUG nova.virt.libvirt.driver [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] End _get_guest_xml xml= Nov 23 04:44:02 localhost nova_compute[281952]: 355032bc-9946-4f6d-817c-2bfc8694d41d Nov 23 04:44:02 localhost nova_compute[281952]: instance-00000002 Nov 23 04:44:02 localhost nova_compute[281952]: 524288 Nov 23 04:44:02 localhost nova_compute[281952]: 1 Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: test Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:01 Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: 512 Nov 23 04:44:02 localhost nova_compute[281952]: 1 Nov 23 04:44:02 localhost nova_compute[281952]: 0 Nov 23 04:44:02 localhost nova_compute[281952]: 1 Nov 23 04:44:02 localhost nova_compute[281952]: 1 Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 
localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: admin Nov 23 04:44:02 localhost nova_compute[281952]: admin Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: RDO Nov 23 04:44:02 localhost nova_compute[281952]: OpenStack Compute Nov 23 04:44:02 localhost nova_compute[281952]: 27.5.2-0.20250829104910.6f8decf.el9 Nov 23 04:44:02 localhost nova_compute[281952]: 355032bc-9946-4f6d-817c-2bfc8694d41d Nov 23 04:44:02 localhost nova_compute[281952]: 355032bc-9946-4f6d-817c-2bfc8694d41d Nov 23 04:44:02 localhost nova_compute[281952]: Virtual Machine Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: hvm Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost 
nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 
04:44:02 localhost nova_compute[281952]: /dev/urandom Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: Nov 23 04:44:02 localhost nova_compute[281952]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.543 281956 DEBUG nova.virt.libvirt.driver [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] skipping disk for 
instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.543 281956 DEBUG nova.virt.libvirt.driver [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.545 281956 DEBUG nova.virt.libvirt.vif [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T08:25:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005532585.localdomain',hostname='test',id=2,image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-23T08:25:43Z,launched_on='np0005532585.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005532585.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=,power_state=4,progress=0,project_id='1915d3e5d4254231a0517e2dcf35848f',ramdisk_id='',reservation_id='r-i8g0t7xr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,sy
stem_metadata={boot_roles='admin,member,reader',image_base_image_ref='be9a09b1-b916-4d06-9bcd-d8b8afdf9284',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-11-23T09:43:30Z,user_data=None,user_id='7e40ee99e6034be7be796ae12095c154',uuid=355032bc-9946-4f6d-817c-2bfc8694d41d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.546 281956 DEBUG nova.network.os_vif_util [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Converting VIF {"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.547 281956 DEBUG nova.network.os_vif_util [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.547 281956 DEBUG os_vif [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.548 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.549 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.550 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.553 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.554 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3912d14-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.555 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3912d14-a3, col_values=(('external_ids', {'iface-id': 'd3912d14-a3e0-4df9-b811-f3bd90f44559', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:aa:3b', 'vm-uuid': '355032bc-9946-4f6d-817c-2bfc8694d41d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.595 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.598 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.603 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.604 281956 INFO os_vif [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:cf:aa:3b,bridge_name='br-int',has_traffic_filtering=True,id=d3912d14-a3e0-4df9-b811-f3bd90f44559,network=Network(bcac49fc-c589-475a-91a8-00a0ba9c2b33),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3912d14-a3')#033[00m Nov 23 04:44:02 localhost systemd[1]: Started libvirt secret daemon. Nov 23 04:44:02 localhost kernel: device tapd3912d14-a3 entered promiscuous mode Nov 23 04:44:02 localhost NetworkManager[5975]: [1763891042.7148] manager: (tapd3912d14-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/16) Nov 23 04:44:02 localhost ovn_controller[154788]: 2025-11-23T09:44:02Z|00058|binding|INFO|Claiming lport d3912d14-a3e0-4df9-b811-f3bd90f44559 for this chassis. Nov 23 04:44:02 localhost ovn_controller[154788]: 2025-11-23T09:44:02Z|00059|binding|INFO|d3912d14-a3e0-4df9-b811-f3bd90f44559: Claiming fa:16:3e:cf:aa:3b 192.168.0.77 Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.718 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.723 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:02 localhost systemd-udevd[284995]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.728 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:02 localhost NetworkManager[5975]: [1763891042.7442] device (tapd3912d14-a3): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Nov 23 04:44:02 localhost NetworkManager[5975]: [1763891042.7452] device (tapd3912d14-a3): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Nov 23 04:44:02 localhost ovn_controller[154788]: 2025-11-23T09:44:02Z|00060|ovn_bfd|INFO|Enabled BFD on interface ovn-c237ed-0 Nov 23 04:44:02 localhost ovn_controller[154788]: 2025-11-23T09:44:02Z|00061|ovn_bfd|INFO|Enabled BFD on interface ovn-49b8a0-0 Nov 23 04:44:02 localhost ovn_controller[154788]: 2025-11-23T09:44:02Z|00062|ovn_bfd|INFO|Enabled BFD on interface ovn-b7d5b3-0 Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.754 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:aa:3b 192.168.0.77'], port_security=['fa:16:3e:cf:aa:3b 192.168.0.77'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.77/24', 'neutron:device_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '1915d3e5d4254231a0517e2dcf35848f', 'neutron:revision_number': '8', 'neutron:security_group_ids': 
'4fe931b0-155c-49e2-b5a5-44d74fa72e9e 6afcee2e-50ee-4b3c-9d1f-24ea7a5a850b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70ca41f3-3e94-4959-b1b5-1e81bd2c9bc1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=d3912d14-a3e0-4df9-b811-f3bd90f44559) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.756 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d3912d14-a3e0-4df9-b811-f3bd90f44559 in datapath bcac49fc-c589-475a-91a8-00a0ba9c2b33 bound to our chassis#033[00m Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.758 160439 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bcac49fc-c589-475a-91a8-00a0ba9c2b33#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.762 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.768 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.772 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[89b4bc42-32af-4e20-99c9-fce330d9c74c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.773 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbcac49fc-c1 in ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Nov 
23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.776 160542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbcac49fc-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.776 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f2d043fc-51df-4719-85bb-ec34cce5872e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.779 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[95cccfcf-2ea0-492b-8aec-0fa00525775d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:02 localhost ovn_controller[154788]: 2025-11-23T09:44:02Z|00063|binding|INFO|Setting lport d3912d14-a3e0-4df9-b811-f3bd90f44559 up in Southbound Nov 23 04:44:02 localhost ovn_controller[154788]: 2025-11-23T09:44:02Z|00064|binding|INFO|Setting lport d3912d14-a3e0-4df9-b811-f3bd90f44559 ovn-installed in OVS Nov 23 04:44:02 localhost systemd-machined[84275]: New machine qemu-2-instance-00000002. Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.789 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[e78fff9a-5566-47f7-b09f-cd7fcc162f58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.792 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.806 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:02 localhost systemd[1]: Started Virtual Machine qemu-2-instance-00000002. 
Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.821 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[082065d0-ee28-4f90-8d12-9838ee506690]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.849 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.853 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.858 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[78abc1ad-e494-4f82-b974-4ef0b83d13f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:02 localhost NetworkManager[5975]: [1763891042.8651] manager: (tapbcac49fc-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/17) Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.863 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[336c127b-fbb3-4e8b-9a3e-0cc831c49941]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.902 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[bc800149-cc74-4ac1-b303-ee061ad4ee5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.906 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[934fe60d-bab2-4e06-a032-557f4081d776]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:02 localhost NetworkManager[5975]: [1763891042.9295] device 
(tapbcac49fc-c0): carrier: link connected Nov 23 04:44:02 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapbcac49fc-c1: link becomes ready Nov 23 04:44:02 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapbcac49fc-c0: link becomes ready Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.935 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[4c493eaf-ade0-4737-b409-5ad4d361e3c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.955 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[293fbf38-0cc7-4810-8b30-05eb8731c6ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbcac49fc-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:b4:b2:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1098603, 'reachable_time': 42896, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 
'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285040, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.973 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4c5024f5-b189-4646-b3e0-9f213e408f12]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb4:b28b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1098603, 'tstamp': 1098603}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285049, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:02 localhost nova_compute[281952]: 2025-11-23 09:44:02.993 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:02.994 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[66efae39-f2b5-4c63-8701-aceae1a65e66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbcac49fc-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:b4:b2:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 
'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1098603, 'reachable_time': 42896, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 
'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 285053, 'error': None, 'target': 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:03.032 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c3681f4e-b86a-48c8-a278-9ed4f10c10c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.052 281956 DEBUG nova.compute.manager [req-d20387df-5b58-44b9-9b6d-0b93c1dc7b1d req-36bd7256-6915-41cd-880b-63a5264be254 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received event network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.053 281956 DEBUG oslo_concurrency.lockutils 
[req-d20387df-5b58-44b9-9b6d-0b93c1dc7b1d req-36bd7256-6915-41cd-880b-63a5264be254 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.053 281956 DEBUG oslo_concurrency.lockutils [req-d20387df-5b58-44b9-9b6d-0b93c1dc7b1d req-36bd7256-6915-41cd-880b-63a5264be254 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.054 281956 DEBUG oslo_concurrency.lockutils [req-d20387df-5b58-44b9-9b6d-0b93c1dc7b1d req-36bd7256-6915-41cd-880b-63a5264be254 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.055 281956 DEBUG nova.compute.manager [req-d20387df-5b58-44b9-9b6d-0b93c1dc7b1d req-36bd7256-6915-41cd-880b-63a5264be254 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] No waiting events found dispatching network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.056 281956 
WARNING nova.compute.manager [req-d20387df-5b58-44b9-9b6d-0b93c1dc7b1d req-36bd7256-6915-41cd-880b-63a5264be254 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received unexpected event network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 for instance with vm_state stopped and task_state powering-on.#033[00m Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:03.113 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[75b9b2d7-fa30-4304-8642-05dcf7c3cbb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:03.115 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbcac49fc-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:03.116 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:03.117 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbcac49fc-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:44:03 localhost kernel: device tapbcac49fc-c0 entered promiscuous mode Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:03.124 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbcac49fc-c0, col_values=(('external_ids', {'iface-id': '98ef2da5-f5cb-44e8-a4b2-f6178c6c8332'}),)) 
do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.122 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:03 localhost ovn_controller[154788]: 2025-11-23T09:44:03Z|00065|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.131 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:03.134 160439 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bcac49fc-c589-475a-91a8-00a0ba9c2b33.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bcac49fc-c589-475a-91a8-00a0ba9c2b33.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:03.136 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e2f16c-580d-48ca-a98c-3862f2f96fe3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.139 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:03.141 160439 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: global Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: log /dev/log local0 debug Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: log-tag 
haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33 Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: user root Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: group root Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: maxconn 1024 Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: pidfile /var/lib/neutron/external/pids/bcac49fc-c589-475a-91a8-00a0ba9c2b33.pid.haproxy Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: daemon Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: defaults Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: log global Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: mode http Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: option httplog Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: option dontlognull Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: option http-server-close Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: option forwardfor Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: retries 3 Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: timeout http-request 30s Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: timeout connect 30s Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: timeout client 32s Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: timeout server 32s Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: timeout http-keep-alive 30s Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: listen listener Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: bind 169.254.169.254:80 Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: server metadata /var/lib/neutron/metadata_proxy Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: http-request add-header X-OVN-Network-ID bcac49fc-c589-475a-91a8-00a0ba9c2b33 Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: 
create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Nov 23 04:44:03 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:03.142 160439 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'env', 'PROCESS_TAG=haproxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bcac49fc-c589-475a-91a8-00a0ba9c2b33.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.220 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.221 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] VM Resumed (Lifecycle Event)#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.228 281956 DEBUG nova.compute.manager [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.243 281956 INFO nova.virt.libvirt.driver [-] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Instance rebooted successfully.#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.245 281956 DEBUG nova.compute.manager [None req-f03f20ad-fab9-4aeb-8113-ce45da633d63 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default 
default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.253 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.260 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.281 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] During sync_power_state the instance has a pending task (powering-on). 
Skip.#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.282 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.282 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] VM Started (Lifecycle Event)#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.297 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.301 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 23 04:44:03 localhost podman[285109]: Nov 23 04:44:03 localhost podman[285109]: 2025-11-23 09:44:03.595124683 +0000 UTC m=+0.077458149 container create 91106d8390b152133b9b13d85685182bf6c320178fad71dacabe8d49b7cdbde9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2) Nov 23 04:44:03 localhost systemd[1]: Started libpod-conmon-91106d8390b152133b9b13d85685182bf6c320178fad71dacabe8d49b7cdbde9.scope. Nov 23 04:44:03 localhost systemd[1]: Started libcrun container. Nov 23 04:44:03 localhost podman[285109]: 2025-11-23 09:44:03.561140984 +0000 UTC m=+0.043474440 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 23 04:44:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c2424dde758dbd432a1bbe6bb85652b170f03c84b9e9dce074322bc539b635/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:44:03 localhost podman[285109]: 2025-11-23 09:44:03.707616982 +0000 UTC m=+0.189950458 container init 91106d8390b152133b9b13d85685182bf6c320178fad71dacabe8d49b7cdbde9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118) Nov 23 04:44:03 localhost ovn_controller[154788]: 2025-11-23T09:44:03Z|00066|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.708 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:03 localhost podman[285109]: 2025-11-23 09:44:03.71638044 +0000 UTC m=+0.198713916 container start 91106d8390b152133b9b13d85685182bf6c320178fad71dacabe8d49b7cdbde9 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 04:44:03 localhost nova_compute[281952]: 2025-11-23 09:44:03.716 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:03 localhost ovn_controller[154788]: 2025-11-23T09:44:03Z|00067|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:44:03 localhost neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285123]: [NOTICE] (285127) : New worker (285129) forked Nov 23 04:44:03 localhost neutron-haproxy-ovnmeta-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285123]: [NOTICE] (285127) : Loading success. Nov 23 04:44:03 localhost snmpd[67457]: IfIndex of an interface changed. Such interfaces will appear multiple times in IF-MIB. 
Nov 23 04:44:04 localhost ovn_controller[154788]: 2025-11-23T09:44:04Z|00068|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:44:04 localhost nova_compute[281952]: 2025-11-23 09:44:04.521 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:05 localhost nova_compute[281952]: 2025-11-23 09:44:05.160 281956 DEBUG nova.compute.manager [req-cca5709a-6979-4718-9ebf-b13d0dd40bd1 req-e470480f-3aa7-4bd8-aafc-d4800f7422f5 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received event network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 23 04:44:05 localhost nova_compute[281952]: 2025-11-23 09:44:05.161 281956 DEBUG oslo_concurrency.lockutils [req-cca5709a-6979-4718-9ebf-b13d0dd40bd1 req-e470480f-3aa7-4bd8-aafc-d4800f7422f5 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:44:05 localhost nova_compute[281952]: 2025-11-23 09:44:05.161 281956 DEBUG oslo_concurrency.lockutils [req-cca5709a-6979-4718-9ebf-b13d0dd40bd1 req-e470480f-3aa7-4bd8-aafc-d4800f7422f5 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:44:05 localhost nova_compute[281952]: 2025-11-23 09:44:05.162 281956 DEBUG oslo_concurrency.lockutils 
[req-cca5709a-6979-4718-9ebf-b13d0dd40bd1 req-e470480f-3aa7-4bd8-aafc-d4800f7422f5 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:44:05 localhost nova_compute[281952]: 2025-11-23 09:44:05.162 281956 DEBUG nova.compute.manager [req-cca5709a-6979-4718-9ebf-b13d0dd40bd1 req-e470480f-3aa7-4bd8-aafc-d4800f7422f5 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] No waiting events found dispatching network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 23 04:44:05 localhost nova_compute[281952]: 2025-11-23 09:44:05.163 281956 WARNING nova.compute.manager [req-cca5709a-6979-4718-9ebf-b13d0dd40bd1 req-e470480f-3aa7-4bd8-aafc-d4800f7422f5 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Received unexpected event network-vif-plugged-d3912d14-a3e0-4df9-b811-f3bd90f44559 for instance with vm_state active and task_state None.#033[00m Nov 23 04:44:07 localhost nova_compute[281952]: 2025-11-23 09:44:07.644 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:07 localhost nova_compute[281952]: 2025-11-23 09:44:07.993 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:09.280 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:44:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:09.280 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:44:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:09.281 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:44:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:44:10 localhost systemd[1]: tmp-crun.Djnpsp.mount: Deactivated successfully. 
Nov 23 04:44:10 localhost podman[285138]: 2025-11-23 09:44:10.067814036 +0000 UTC m=+0.116531891 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3) Nov 23 04:44:10 localhost podman[285138]: 2025-11-23 09:44:10.083216423 +0000 UTC m=+0.131934268 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 04:44:10 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 04:44:11 localhost podman[240668]: time="2025-11-23T09:44:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:44:11 localhost podman[240668]: @ - - [23/Nov/2025:09:44:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1" Nov 23 04:44:11 localhost podman[240668]: @ - - [23/Nov/2025:09:44:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17216 "" "Go-http-client/1.1" Nov 23 04:44:12 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:72:a3:51 MACPROTO=0800 SRC=89.248.163.200 DST=38.102.83.198 LEN=40 TOS=0x08 PREC=0x40 TTL=245 ID=57327 PROTO=TCP SPT=48651 DPT=9090 SEQ=642428843 ACK=0 WINDOW=1024 RES=0x00 SYN URGP=0 Nov 23 04:44:12 localhost nova_compute[281952]: 2025-11-23 09:44:12.680 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:12 localhost nova_compute[281952]: 2025-11-23 09:44:12.995 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:44:16 localhost systemd[1]: tmp-crun.fp7U7E.mount: Deactivated successfully. 
Nov 23 04:44:16 localhost podman[285155]: 2025-11-23 09:44:16.047669772 +0000 UTC m=+0.097963826 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:44:16 localhost podman[285155]: 2025-11-23 09:44:16.05857975 +0000 UTC m=+0.108873764 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:44:16 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:44:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36853 DF PROTO=TCP SPT=44982 DPT=9102 SEQ=3049309574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DCF21B0000000001030307) Nov 23 04:44:17 localhost ovn_controller[154788]: 2025-11-23T09:44:17Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cf:aa:3b 192.168.0.77 Nov 23 04:44:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36854 DF PROTO=TCP SPT=44982 DPT=9102 SEQ=3049309574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DCF6200000000001030307) Nov 23 04:44:17 localhost nova_compute[281952]: 2025-11-23 09:44:17.708 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23207 DF PROTO=TCP SPT=54030 DPT=9102 SEQ=1871829459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DCF8200000000001030307) Nov 23 04:44:17 localhost nova_compute[281952]: 2025-11-23 09:44:17.998 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36855 DF PROTO=TCP SPT=44982 DPT=9102 SEQ=3049309574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DCFE200000000001030307) Nov 23 04:44:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61221 DF PROTO=TCP SPT=52974 DPT=9102 SEQ=891157253 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD02200000000001030307) Nov 23 04:44:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:44:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:44:21 localhost podman[285179]: 2025-11-23 09:44:21.027221626 +0000 UTC m=+0.082932779 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 04:44:21 localhost podman[285179]: 2025-11-23 09:44:21.090153875 +0000 UTC m=+0.145865028 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:44:21 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:44:21 localhost podman[285180]: 2025-11-23 09:44:21.096004787 +0000 UTC m=+0.149527063 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 23 04:44:21 localhost podman[285180]: 2025-11-23 09:44:21.176405956 +0000 UTC m=+0.229928212 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9) Nov 23 04:44:21 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:44:22 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:22.695 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:22 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:22.698 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Nov 23 04:44:22 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 04:44:22 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:22 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:22 localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:22 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:22 localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:22 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:22 localhost nova_compute[281952]: 2025-11-23 09:44:22.760 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:44:23 localhost nova_compute[281952]: 2025-11-23 09:44:23.001 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:23 localhost podman[285223]: 2025-11-23 09:44:23.036911689 +0000 UTC m=+0.086961784 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0) Nov 23 04:44:23 localhost podman[285223]: 2025-11-23 09:44:23.041633675 +0000 UTC m=+0.091683720 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 04:44:23 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:44:23 localhost nova_compute[281952]: 2025-11-23 09:44:23.157 281956 DEBUG nova.compute.manager [None req-66e56ac4-2fc6-426f-bfb2-48c090186a1a 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:44:23 localhost nova_compute[281952]: 2025-11-23 09:44:23.163 281956 INFO nova.compute.manager [None req-66e56ac4-2fc6-426f-bfb2-48c090186a1a 7e40ee99e6034be7be796ae12095c154 1915d3e5d4254231a0517e2dcf35848f - - default default] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Retrieving diagnostics#033[00m Nov 23 04:44:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36856 DF PROTO=TCP SPT=44982 DPT=9102 SEQ=3049309574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD0DE00000000001030307) Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.007 160537 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.007 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 1.3097534#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42604 [23/Nov/2025:09:44:22.694] listener listener/metadata 0/0/0/1313/1313 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Nov 23 04:44:24 localhost 
ovn_metadata_agent[160434]: 2025-11-23 09:44:24.026 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.027 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42608 [23/Nov/2025:09:44:24.025] listener listener/metadata 0/0/0/32/32 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.058 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404 len: 297 time: 0.0310731#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.072 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.074 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 
Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.096 160537 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.097 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 0.0235972#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42612 [23/Nov/2025:09:44:24.072] listener listener/metadata 0/0/0/25/25 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.104 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.105 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 localhost 
ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.123 160537 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42620 [23/Nov/2025:09:44:24.104] listener listener/metadata 0/0/0/19/19 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.123 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200 len: 136 time: 0.0181267#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.131 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.132 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 
X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.150 160537 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42622 [23/Nov/2025:09:44:24.130] listener listener/metadata 0/0/0/20/20 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.151 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200 len: 143 time: 0.0192010#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.158 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.159 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.179 160537 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42626 [23/Nov/2025:09:44:24.158] listener listener/metadata 0/0/0/22/22 200 132 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.180 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200 len: 148 time: 0.0209279#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.187 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.188 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.203 160537 DEBUG neutron.agent.ovn.metadata.server [-] 
_proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42634 [23/Nov/2025:09:44:24.187] listener listener/metadata 0/0/0/17/17 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.204 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200 len: 150 time: 0.0159786#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.211 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.212 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.225 160537 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.226 160537 INFO eventlet.wsgi.server [-] 
192.168.0.77, "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200 len: 139 time: 0.0140369#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42644 [23/Nov/2025:09:44:24.210] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.232 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.233 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.250 160537 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42648 [23/Nov/2025:09:44:24.232] listener listener/metadata 0/0/0/18/18 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 
09:44:24.250 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200 len: 139 time: 0.0171552#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.256 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.257 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42658 [23/Nov/2025:09:44:24.255] listener listener/metadata 0/0/0/19/19 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.275 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/user-data HTTP/1.1" status: 404 len: 297 time: 0.0183260#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.292 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.293 160537 DEBUG 
neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.307 160537 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42664 [23/Nov/2025:09:44:24.292] listener listener/metadata 0/0/0/16/16 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.308 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200 len: 155 time: 0.0153022#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.312 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.313 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 
04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.332 160537 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42678 [23/Nov/2025:09:44:24.311] listener listener/metadata 0/0/0/20/20 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.332 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200 len: 138 time: 0.0197785#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.337 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.338 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 
localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.353 160537 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42688 [23/Nov/2025:09:44:24.337] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.353 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200 len: 143 time: 0.0151033#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.358 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.359 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 
localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.373 160537 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42694 [23/Nov/2025:09:44:24.358] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.373 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200 len: 143 time: 0.0145631#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.380 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.380 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 
__call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.397 160537 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42702 [23/Nov/2025:09:44:24.379] listener listener/metadata 0/0/0/18/18 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.398 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200 len: 139 time: 0.0175052#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.404 160537 DEBUG eventlet.wsgi.server [-] (160537) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.405 160537 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Accept: */*#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Connection: close#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Content-Type: text/plain#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: Host: 169.254.169.254#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: User-Agent: curl/7.84.0#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Forwarded-For: 192.168.0.77#015 Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: X-Ovn-Network-Id: bcac49fc-c589-475a-91a8-00a0ba9c2b33 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.424 160537 DEBUG 
neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 23 04:44:24 localhost haproxy-metadata-proxy-bcac49fc-c589-475a-91a8-00a0ba9c2b33[285129]: 192.168.0.77:42710 [23/Nov/2025:09:44:24.404] listener listener/metadata 0/0/0/20/20 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Nov 23 04:44:24 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:24.425 160537 INFO eventlet.wsgi.server [-] 192.168.0.77, "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200 len: 139 time: 0.0197296#033[00m Nov 23 04:44:26 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:26.724 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:44:26 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:26.725 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 04:44:26 localhost nova_compute[281952]: 2025-11-23 09:44:26.753 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. 
Nov 23 04:44:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:44:27 localhost podman[285242]: 2025-11-23 09:44:27.034069387 +0000 UTC m=+0.087213272 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true) Nov 23 
04:44:27 localhost podman[285243]: 2025-11-23 09:44:27.086994976 +0000 UTC m=+0.136377424 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 04:44:27 localhost podman[285243]: 2025-11-23 09:44:27.099487033 +0000 UTC m=+0.148869471 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 04:44:27 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:44:27 localhost podman[285242]: 2025-11-23 09:44:27.150934656 +0000 UTC m=+0.204078581 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:44:27 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:44:27 localhost nova_compute[281952]: 2025-11-23 09:44:27.763 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:28 localhost nova_compute[281952]: 2025-11-23 09:44:28.030 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:29 localhost openstack_network_exporter[242668]: ERROR 09:44:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:44:29 localhost openstack_network_exporter[242668]: ERROR 09:44:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:44:29 localhost openstack_network_exporter[242668]: ERROR 09:44:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:44:29 localhost openstack_network_exporter[242668]: ERROR 09:44:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:44:29 localhost openstack_network_exporter[242668]: Nov 23 04:44:29 localhost openstack_network_exporter[242668]: ERROR 09:44:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:44:29 localhost openstack_network_exporter[242668]: Nov 23 04:44:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36857 DF PROTO=TCP SPT=44982 DPT=9102 SEQ=3049309574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD2E200000000001030307) Nov 23 04:44:31 localhost ovn_metadata_agent[160434]: 2025-11-23 09:44:31.728 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, 
col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:44:32 localhost nova_compute[281952]: 2025-11-23 09:44:32.793 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:32 localhost ovn_controller[154788]: 2025-11-23T09:44:32Z|00069|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory Nov 23 04:44:33 localhost nova_compute[281952]: 2025-11-23 09:44:33.032 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:37 localhost nova_compute[281952]: 2025-11-23 09:44:37.825 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:38 localhost nova_compute[281952]: 2025-11-23 09:44:38.035 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:39 localhost sshd[285371]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:44:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 04:44:41 localhost podman[285372]: 2025-11-23 09:44:41.049153493 +0000 UTC m=+0.095418846 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 23 04:44:41 localhost podman[285372]: 2025-11-23 09:44:41.084628382 +0000 UTC m=+0.130893685 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:44:41 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 04:44:41 localhost podman[240668]: time="2025-11-23T09:44:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:44:41 localhost podman[240668]: @ - - [23/Nov/2025:09:44:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1" Nov 23 04:44:41 localhost podman[240668]: @ - - [23/Nov/2025:09:44:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17229 "" "Go-http-client/1.1" Nov 23 04:44:42 localhost nova_compute[281952]: 2025-11-23 09:44:42.865 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:43 localhost nova_compute[281952]: 2025-11-23 09:44:43.037 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44461 DF PROTO=TCP SPT=42310 DPT=9102 SEQ=3190978265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD674B0000000001030307) Nov 23 04:44:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 04:44:47 localhost sshd[285407]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:44:47 localhost podman[285391]: 2025-11-23 09:44:47.031704911 +0000 UTC m=+0.084748136 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:44:47 localhost podman[285391]: 2025-11-23 09:44:47.040440331 +0000 UTC m=+0.093483516 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:44:47 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:44:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44462 DF PROTO=TCP SPT=42310 DPT=9102 SEQ=3190978265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD6B600000000001030307) Nov 23 04:44:47 localhost nova_compute[281952]: 2025-11-23 09:44:47.914 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:48 localhost nova_compute[281952]: 2025-11-23 09:44:48.041 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36858 DF PROTO=TCP SPT=44982 DPT=9102 SEQ=3049309574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD6E200000000001030307) Nov 23 04:44:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44463 DF PROTO=TCP SPT=42310 DPT=9102 SEQ=3190978265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD73600000000001030307) Nov 23 04:44:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23208 
DF PROTO=TCP SPT=54030 DPT=9102 SEQ=1871829459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD76200000000001030307) Nov 23 04:44:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:44:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:44:52 localhost systemd[1]: tmp-crun.TN81PO.mount: Deactivated successfully. Nov 23 04:44:52 localhost podman[285415]: 2025-11-23 09:44:52.038307363 +0000 UTC m=+0.090611297 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Nov 23 
04:44:52 localhost podman[285416]: 2025-11-23 09:44:52.118341512 +0000 UTC m=+0.165500206 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter) Nov 23 04:44:52 localhost podman[285415]: 2025-11-23 09:44:52.127783945 +0000 UTC m=+0.180087899 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 23 04:44:52 localhost podman[285416]: 2025-11-23 09:44:52.139364223 +0000 UTC m=+0.186522977 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Nov 23 04:44:52 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:44:52 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:44:52 localhost nova_compute[281952]: 2025-11-23 09:44:52.916 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:53 localhost nova_compute[281952]: 2025-11-23 09:44:53.043 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44464 DF PROTO=TCP SPT=42310 DPT=9102 SEQ=3190978265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DD83200000000001030307) Nov 23 04:44:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:44:54 localhost podman[285458]: 2025-11-23 09:44:54.031990181 +0000 UTC m=+0.083723134 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:44:54 localhost podman[285458]: 2025-11-23 09:44:54.036471269 +0000 UTC m=+0.088204252 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 04:44:54 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:44:54 localhost snmpd[67457]: empty variable list in _query Nov 23 04:44:54 localhost snmpd[67457]: empty variable list in _query Nov 23 04:44:57 localhost nova_compute[281952]: 2025-11-23 09:44:57.950 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:44:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:44:58 localhost nova_compute[281952]: 2025-11-23 09:44:58.050 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:44:58 localhost podman[285477]: 2025-11-23 09:44:58.0585496 +0000 UTC m=+0.090766702 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 23 04:44:58 localhost podman[285477]: 2025-11-23 09:44:58.070535681 +0000 UTC m=+0.102752813 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 23 04:44:58 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: 
Deactivated successfully. Nov 23 04:44:58 localhost podman[285478]: 2025-11-23 09:44:58.153656696 +0000 UTC m=+0.181428681 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 04:44:58 localhost podman[285478]: 2025-11-23 09:44:58.161829439 +0000 UTC m=+0.189601404 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 04:44:58 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:44:59 localhost openstack_network_exporter[242668]: ERROR 09:44:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:44:59 localhost openstack_network_exporter[242668]: ERROR 09:44:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:44:59 localhost openstack_network_exporter[242668]: ERROR 09:44:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:44:59 localhost openstack_network_exporter[242668]: ERROR 09:44:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:44:59 localhost openstack_network_exporter[242668]: Nov 23 04:44:59 localhost openstack_network_exporter[242668]: ERROR 09:44:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:44:59 localhost openstack_network_exporter[242668]: Nov 23 04:45:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44465 DF PROTO=TCP SPT=42310 DPT=9102 SEQ=3190978265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DDA4200000000001030307) Nov 23 04:45:02 localhost nova_compute[281952]: 2025-11-23 09:45:02.456 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:45:02 localhost nova_compute[281952]: 2025-11-23 09:45:02.457 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:45:02 localhost 
nova_compute[281952]: 2025-11-23 09:45:02.480 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:45:02 localhost nova_compute[281952]: 2025-11-23 09:45:02.481 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:45:02 localhost nova_compute[281952]: 2025-11-23 09:45:02.481 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:45:02 localhost nova_compute[281952]: 2025-11-23 09:45:02.841 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:45:02 localhost nova_compute[281952]: 2025-11-23 09:45:02.842 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:45:02 localhost nova_compute[281952]: 2025-11-23 09:45:02.842 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:45:02 localhost nova_compute[281952]: 2025-11-23 09:45:02.843 281956 DEBUG 
nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:45:02 localhost nova_compute[281952]: 2025-11-23 09:45:02.986 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.050 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.364 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.383 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.383 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.384 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.385 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.385 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.385 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task 
ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.386 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.386 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.386 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.387 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.409 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.410 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.410 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.411 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.411 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.876 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.947 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 
04:45:03 localhost nova_compute[281952]: 2025-11-23 09:45:03.948 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:45:04 localhost nova_compute[281952]: 2025-11-23 09:45:04.110 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:45:04 localhost nova_compute[281952]: 2025-11-23 09:45:04.111 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12293MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:45:04 localhost nova_compute[281952]: 2025-11-23 09:45:04.111 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:45:04 localhost nova_compute[281952]: 2025-11-23 09:45:04.111 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:45:04 localhost nova_compute[281952]: 2025-11-23 09:45:04.223 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in 
placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:45:04 localhost nova_compute[281952]: 2025-11-23 09:45:04.223 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:45:04 localhost nova_compute[281952]: 2025-11-23 09:45:04.223 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:45:04 localhost nova_compute[281952]: 2025-11-23 09:45:04.275 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:45:04 localhost nova_compute[281952]: 2025-11-23 09:45:04.736 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:45:04 localhost nova_compute[281952]: 2025-11-23 09:45:04.744 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory 
/usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:45:04 localhost nova_compute[281952]: 2025-11-23 09:45:04.772 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:45:04 localhost nova_compute[281952]: 2025-11-23 09:45:04.812 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:45:04 localhost nova_compute[281952]: 2025-11-23 09:45:04.813 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:45:08 localhost nova_compute[281952]: 2025-11-23 09:45:08.030 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:08 localhost nova_compute[281952]: 2025-11-23 09:45:08.053 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:09 localhost 
ovn_metadata_agent[160434]: 2025-11-23 09:45:09.280 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:45:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:45:09.281 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:45:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:45:09.282 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.804 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.805 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.write.bytes in the context of pollsters Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.832 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 327680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.833 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4871c7c-50eb-4c4c-be0f-6bfac2029e1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 327680, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.806022', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a577bac-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': 'dd482b2c9a80b9016108197aeb6a2b7715f5728866168c3c71bc7dd462bc3bbc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.806022', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a57970e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '528b90a3aeb0f09ef62fb5aac2a52fba5f8f1512480bbd0dc18ac9c13e5b400c'}]}, 'timestamp': '2025-11-23 09:45:10.834182', '_unique_id': '65ec9eaaf2f049bf951f05392df43bfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:45:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.835 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.837 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.859 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ec032e4-2736-404a-a661-571dfcb99c60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:45:10.837474', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '1a5b7c98-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.036686784, 
'message_signature': '5df2bd9226d2a7e8dfa19fc5bd8c17b21ab1ad7568a801c6c8a51ff7977e7c8e'}]}, 'timestamp': '2025-11-23 09:45:10.859740', '_unique_id': '0ef07482614449e38546e7c23fe5b945'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.862 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.865 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e1287506-c58d-4074-a375-222647e095d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.862254', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a5c6a7c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': '0c574ec46ea776c379d32928c9ff9bde0c4765dc4ce5c1b6c2620a43fda1cbfa'}]}, 'timestamp': '2025-11-23 09:45:10.865813', '_unique_id': '28196d48e5c24b40ab18dfa03cf70789'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.867 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.867 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f76681f8-15d7-4e8d-976e-db150cc21209', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.868094', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a5e8960-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.045718524, 'message_signature': '4c83a78632da016e93f51277e01aaf9e3c474750f48f617b9da0e21e8f2ca054'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.868094', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a5e9e50-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.045718524, 'message_signature': '24cb8d35065e72678870fde0b515b7c548b26b3718043599d94a1d86ac7e9d94'}]}, 'timestamp': '2025-11-23 09:45:10.880431', '_unique_id': 'fad264f531ba48f28b07f5042a0e0e10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:45:10.881 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:45:10.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.882 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:45:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bea516e-70a7-4893-a45a-64206a73f479', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.882857', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a5f1b3c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.045718524, 'message_signature': 'c157ef704fdf06bbb33cbb66615d5e4511163a51a9ecb9e1377fa0837d6113a1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.882857', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a5f2e9c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.045718524, 'message_signature': 'c9b06920a7cd1123b1a2a0f1334d307ad9ff4f6598a7763c3f5a26a85310b37d'}]}, 'timestamp': '2025-11-23 09:45:10.883957', '_unique_id': '9c5c319d3e294e97b02b1ebeee5612be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.884 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.886 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f624aedd-674a-4926-a4ee-7fe4bff047b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.886300', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a5f9ce2-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.045718524, 'message_signature': '25dbf1e1d678c9bfec0cb1bd811809477565e26cc0f55f2d2ad5054bd58303cf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.886300', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a5fadea-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.045718524, 'message_signature': 'c7774c1516c1aa5c8f81dc2d39e2834d32e7ae7b3f87d40f7fd1b7c62990ebee'}]}, 'timestamp': '2025-11-23 09:45:10.887165', '_unique_id': 'e4a94aaf372341ef8fccfc6c95d93c72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.888 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.889 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.889 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd58534dd-deb8-40a2-99d3-f5536007f3bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.889295', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a6012da-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '2382eece137fbe9521b2ec6f51b3c6b53c82fb4c75addeeb5c03f07dca2cc302'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.889295', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a60240a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '2ca35d18d5ffc4129347a713f1f90166a01b2193fe6a6fbbfc9995f88f72fb22'}]}, 'timestamp': '2025-11-23 09:45:10.890193', '_unique_id': '4831ffafb50442a5a98399c72a6ee99a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.891 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.892 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.892 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb5a7104-a2ca-46bb-80d8-670567bda1dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.892297', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a6086fc-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': 'd2a7e1b90ba3c2800a5ae350622d70cb2db0b23c4bd6f1a20abfde6f1aa1ccf7'}]}, 'timestamp': '2025-11-23 09:45:10.892751', '_unique_id': '34aeb942c9114131bfa9a7fed4f5a221'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23
04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.893 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.894 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.894 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 11650000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a6b5c5b-aca4-466e-9d37-af948fe859f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11650000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:45:10.894843', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '1a60eb56-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.036686784, 'message_signature': '0e6846a1045bc1e1a8f6974e7defd811ad874ed3b68daefacdba0bbb6a0781a0'}]}, 'timestamp': '2025-11-23 09:45:10.895301', '_unique_id': '6741cb3511ae4d699728c69c5a2057d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging 
self._connection = self._establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.896 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'dbcf95b5-82c1-4008-8a8c-8bee0e6dd711', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.897412', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a614e5c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': 'dc22cd0829d05ab9fab22269b5d7ca044198ee44e9e04681cded026b5b071c48'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.897412', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a615f14-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '9f87002a9a21bd5dfd2af6e0a1eefb1e7de2cb25f1f15df4f8dd2a45cbb990c1'}]}, 'timestamp': '2025-11-23 09:45:10.898249', '_unique_id': '23db7d68b2dd4fab90bc48c3cb55fdd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.899 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.900 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1112cd82-d488-4ba6-b473-8165056feffd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.900330', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a61c062-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': 'c0995ffe9c937d8612e0cf95251f449e1958142fe326dddaf7f21376d73da26b'}]}, 'timestamp': '2025-11-23 09:45:10.900765', '_unique_id': 'aa4ef75d9a8d4300a417f6fe53245af4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.901 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2acd2947-5dfb-4d8a-a89e-88d03461753a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.902963', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a622782-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': '99fc1663dc105c1f380fe31d68c7d61bee8df41ec8a2fe391bf9da917d62ac2c'}]}, 'timestamp': '2025-11-23 09:45:10.903410', '_unique_id': '95cab1b6ca82441d9a5f93a5eebe00d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.904 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.905 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.905 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 36 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.905 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb877ac6-849a-4e73-a458-ab451260f7b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 36, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.905461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a6288b2-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': 'baf5cf10d4c7fc3f6da6b0c8a7e8fb0bc1c8101c1f53548ca05a15c9ac4271a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.905461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a629960-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '4ee382b31bf1372ff0f96916e715374082f06fbce1f51fd262d4d3100c686ddd'}]}, 'timestamp': '2025-11-23 09:45:10.906333', '_unique_id': '4f6b8e38162946dabf87eaa7968fc360'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.907 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.908 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.908 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '2bb6578b-d7e6-46d4-b439-8607defed2fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.908773', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a630d78-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': 'e7b9f5a79908938ae98e86d2054866ea0e2a484d0bba91b477a139d6abe0f5d2'}]}, 'timestamp': '2025-11-23 09:45:10.909369', '_unique_id': '06303a20846746bbb389bfd70b8a0363'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:45:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.910 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.911 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.911 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e09aa88-c596-41c8-9a58-e7619d436061', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.911544', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a637664-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': 'ddc99b179786aebef0e14c533d18cab531e1150132a1f217f4ad392f68ca2b85'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.911544', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a63871c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '677cef864b73832a1db8de8ee5e831cf8d5b2b41c4fd3ec01cbf96685a662051'}]}, 'timestamp': '2025-11-23 09:45:10.912380', '_unique_id': 'dfe46d621799418a9fe7850deccc1ff2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.913 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.914 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 950875641 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a740f129-5510-44b4-b42f-e9bc2dcd01b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 950875641, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:45:10.914603', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1a63edc4-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '183bfd62c4918910fb44e8924f3c177c7c23c6beb9d5354ad5e540c33b1c87d7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:45:10.914603', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1a63fe68-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11053.983644751, 'message_signature': '2ccac08766caebc06699a756186a1b612f39d3186fd5c60b922d539933a0b062'}]}, 'timestamp': '2025-11-23 09:45:10.915433', '_unique_id': '0127799b9a8041a88923d1e082f290dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.916 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.917 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc717cf6-d5d6-4d68-9289-f419a73c957c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.917907', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a646cf4-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': '631a03efd47083efb7805d6d56359d7d26ca80226c22033ddcd23aee7e952fbb'}]}, 'timestamp': '2025-11-23 09:45:10.918222', '_unique_id': '5bf14446c50c462d91b4dfcddd6a2910'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.918 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fbd3a55-71f3-430d-b7da-bda4e09e31ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.919607', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a64b6dc-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': '0649a0306addb19b77c319338e06e341fb5a4ca3c1c0b5a6eedef1953300245b'}]}, 'timestamp': '2025-11-23 09:45:10.920113', '_unique_id': '2ffafb4dfd514d849250b01d1d2eff7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging   File
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.920 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c54c07a7-4eb4-4c48-83df-d6a8ae359172', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.921968', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a650b1e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': '04b6e8a2792fea08522442d1721c3686fb8a24bbc51c4b4f01a06395b3bb4204'}]}, 'timestamp': '2025-11-23 09:45:10.922269', '_unique_id': '7d34fdfbfad044058919db3b3b909f85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:45:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '432e8f63-9537-4c4a-8375-946e363d5aec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.923620', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a654cb4-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': '4607573db8cca7788b48b414d0b6b2413bea2bc47373193c65b0882c17543c6d'}]}, 'timestamp': '2025-11-23 09:45:10.923970', '_unique_id': '2e1b17fb6a2f437aac31406e774e1c1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:45:10.924 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:45:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.924 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1c5a1cfd-0e97-43cb-9111-e66d275d66f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:45:10.925618', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '1a659994-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11054.039884523, 'message_signature': 'a903ff11750fe9b9e90bd2c06d8d3587a392fa3b923ffba0eb0246035f392f4d'}]}, 'timestamp': '2025-11-23 09:45:10.925937', '_unique_id': 'cf7455c4264444ca9430a1961dd6b439'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:45:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:45:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:45:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 04:45:11 localhost podman[240668]: 
time="2025-11-23T09:45:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:45:11 localhost podman[240668]: @ - - [23/Nov/2025:09:45:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1" Nov 23 04:45:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:45:11 localhost podman[240668]: @ - - [23/Nov/2025:09:45:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17235 "" "Go-http-client/1.1" Nov 23 04:45:12 localhost systemd[1]: tmp-crun.TbRK9q.mount: Deactivated successfully. Nov 23 04:45:12 localhost podman[285564]: 2025-11-23 09:45:12.034019052 +0000 UTC m=+0.087426328 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS) Nov 23 04:45:12 localhost podman[285564]: 2025-11-23 09:45:12.048395437 +0000 UTC m=+0.101802753 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 23 04:45:12 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:45:13 localhost nova_compute[281952]: 2025-11-23 09:45:13.056 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:13 localhost nova_compute[281952]: 2025-11-23 09:45:13.058 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:13 localhost nova_compute[281952]: 2025-11-23 09:45:13.059 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:45:13 localhost nova_compute[281952]: 2025-11-23 09:45:13.059 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:13 localhost nova_compute[281952]: 2025-11-23 09:45:13.072 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:13 localhost nova_compute[281952]: 2025-11-23 09:45:13.072 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59928 DF PROTO=TCP SPT=32880 DPT=9102 SEQ=1257914380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DDDC7B0000000001030307) Nov 23 04:45:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59929 DF PROTO=TCP SPT=32880 DPT=9102 SEQ=1257914380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DDE0A10000000001030307) Nov 23 04:45:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:45:18 localhost podman[285583]: 2025-11-23 09:45:18.034072802 +0000 UTC m=+0.078943977 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:45:18 localhost podman[285583]: 
2025-11-23 09:45:18.042220925 +0000 UTC m=+0.087092110 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:45:18 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. 
Nov 23 04:45:18 localhost nova_compute[281952]: 2025-11-23 09:45:18.073 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:18 localhost nova_compute[281952]: 2025-11-23 09:45:18.075 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:18 localhost nova_compute[281952]: 2025-11-23 09:45:18.075 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:45:18 localhost nova_compute[281952]: 2025-11-23 09:45:18.076 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:18 localhost nova_compute[281952]: 2025-11-23 09:45:18.107 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:18 localhost nova_compute[281952]: 2025-11-23 09:45:18.108 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44466 DF PROTO=TCP SPT=42310 DPT=9102 SEQ=3190978265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DDE4200000000001030307) Nov 23 04:45:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59930 DF PROTO=TCP SPT=32880 DPT=9102 SEQ=1257914380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080AD2DDE8A00000000001030307) Nov 23 04:45:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36859 DF PROTO=TCP SPT=44982 DPT=9102 SEQ=3049309574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DDEC210000000001030307) Nov 23 04:45:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:45:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:45:23 localhost systemd[1]: tmp-crun.XyNBx3.mount: Deactivated successfully. Nov 23 04:45:23 localhost podman[285608]: 2025-11-23 09:45:23.022350217 +0000 UTC m=+0.077695467 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251118, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 23 04:45:23 localhost podman[285608]: 2025-11-23 09:45:23.101304492 +0000 UTC m=+0.156649802 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:45:23 localhost nova_compute[281952]: 2025-11-23 09:45:23.109 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:23 localhost nova_compute[281952]: 2025-11-23 09:45:23.109 
281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:23 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:45:23 localhost podman[285609]: 2025-11-23 09:45:23.107153764 +0000 UTC m=+0.158968905 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter) Nov 23 04:45:23 localhost podman[285609]: 2025-11-23 09:45:23.190449243 +0000 UTC m=+0.242264394 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, config_id=edpm, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible) Nov 23 04:45:23 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:45:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59931 DF PROTO=TCP SPT=32880 DPT=9102 SEQ=1257914380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DDF8600000000001030307) Nov 23 04:45:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:45:25 localhost podman[285653]: 2025-11-23 09:45:25.013917619 +0000 UTC m=+0.070694551 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Nov 23 04:45:25 localhost podman[285653]: 2025-11-23 09:45:25.043801014 +0000 UTC m=+0.100577986 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 04:45:25 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:45:28 localhost nova_compute[281952]: 2025-11-23 09:45:28.110 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:28 localhost nova_compute[281952]: 2025-11-23 09:45:28.112 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:28 localhost nova_compute[281952]: 2025-11-23 09:45:28.112 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:45:28 localhost nova_compute[281952]: 2025-11-23 09:45:28.112 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:28 localhost nova_compute[281952]: 2025-11-23 09:45:28.151 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:28 localhost nova_compute[281952]: 2025-11-23 09:45:28.152 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. 
Nov 23 04:45:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:45:29 localhost systemd[1]: tmp-crun.ZW2OIM.mount: Deactivated successfully. Nov 23 04:45:29 localhost podman[285672]: 2025-11-23 09:45:29.035653138 +0000 UTC m=+0.089664118 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd) Nov 23 04:45:29 localhost podman[285672]: 2025-11-23 09:45:29.054476332 +0000 UTC m=+0.108487312 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd) Nov 23 04:45:29 localhost systemd[1]: tmp-crun.G8T98a.mount: Deactivated successfully. 
Nov 23 04:45:29 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:45:29 localhost podman[285673]: 2025-11-23 09:45:29.072098027 +0000 UTC m=+0.124132416 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 04:45:29 localhost podman[285673]: 2025-11-23 09:45:29.079500046 +0000 UTC m=+0.131534445 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, 
config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:45:29 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:45:29 localhost openstack_network_exporter[242668]: ERROR 09:45:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:45:29 localhost openstack_network_exporter[242668]: ERROR 09:45:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:45:29 localhost openstack_network_exporter[242668]: ERROR 09:45:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:45:29 localhost openstack_network_exporter[242668]: ERROR 09:45:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:45:29 localhost openstack_network_exporter[242668]: Nov 23 04:45:29 localhost openstack_network_exporter[242668]: ERROR 09:45:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:45:29 localhost openstack_network_exporter[242668]: Nov 23 04:45:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59932 DF PROTO=TCP SPT=32880 DPT=9102 SEQ=1257914380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE18200000000001030307) Nov 23 04:45:33 localhost nova_compute[281952]: 2025-11-23 09:45:33.153 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:33 localhost nova_compute[281952]: 2025-11-23 09:45:33.155 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:33 localhost nova_compute[281952]: 2025-11-23 09:45:33.155 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:45:33 localhost 
nova_compute[281952]: 2025-11-23 09:45:33.155 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:33 localhost nova_compute[281952]: 2025-11-23 09:45:33.197 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:33 localhost nova_compute[281952]: 2025-11-23 09:45:33.198 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:38 localhost nova_compute[281952]: 2025-11-23 09:45:38.198 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:38 localhost nova_compute[281952]: 2025-11-23 09:45:38.200 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:38 localhost nova_compute[281952]: 2025-11-23 09:45:38.200 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:45:38 localhost nova_compute[281952]: 2025-11-23 09:45:38.200 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:38 localhost nova_compute[281952]: 2025-11-23 09:45:38.240 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:38 localhost nova_compute[281952]: 2025-11-23 09:45:38.241 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:39 
localhost sshd[285799]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:45:41 localhost podman[240668]: time="2025-11-23T09:45:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:45:41 localhost podman[240668]: @ - - [23/Nov/2025:09:45:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1" Nov 23 04:45:41 localhost podman[240668]: @ - - [23/Nov/2025:09:45:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17225 "" "Go-http-client/1.1" Nov 23 04:45:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:45:43 localhost podman[285801]: 2025-11-23 09:45:43.018026882 +0000 UTC m=+0.070996080 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118) Nov 23 04:45:43 localhost podman[285801]: 2025-11-23 09:45:43.052332294 +0000 UTC m=+0.105301542 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 04:45:43 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:45:43 localhost nova_compute[281952]: 2025-11-23 09:45:43.242 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:43 localhost nova_compute[281952]: 2025-11-23 09:45:43.244 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:43 localhost nova_compute[281952]: 2025-11-23 09:45:43.244 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:45:43 localhost nova_compute[281952]: 2025-11-23 09:45:43.244 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:43 localhost nova_compute[281952]: 2025-11-23 09:45:43.290 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:43 localhost nova_compute[281952]: 2025-11-23 09:45:43.291 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63390 DF PROTO=TCP SPT=55650 DPT=9102 SEQ=673330066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE51AB0000000001030307) Nov 23 04:45:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63391 DF PROTO=TCP SPT=55650 DPT=9102 SEQ=673330066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE55A10000000001030307) Nov 23 04:45:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59933 DF PROTO=TCP SPT=32880 DPT=9102 SEQ=1257914380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE58200000000001030307) Nov 23 04:45:48 localhost nova_compute[281952]: 2025-11-23 09:45:48.313 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:48 localhost nova_compute[281952]: 2025-11-23 09:45:48.315 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:48 localhost nova_compute[281952]: 2025-11-23 09:45:48.315 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5024 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:45:48 localhost nova_compute[281952]: 2025-11-23 09:45:48.315 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:48 localhost 
nova_compute[281952]: 2025-11-23 09:45:48.316 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:48 localhost nova_compute[281952]: 2025-11-23 09:45:48.319 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:45:48 localhost sshd[285820]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:45:49 localhost systemd[1]: tmp-crun.tsWFYc.mount: Deactivated successfully. Nov 23 04:45:49 localhost podman[285821]: 2025-11-23 09:45:49.039847786 +0000 UTC m=+0.089084440 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:45:49 localhost podman[285821]: 2025-11-23 09:45:49.050511175 +0000 UTC m=+0.099747779 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:45:49 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:45:49 localhost systemd-logind[761]: New session 61 of user zuul. Nov 23 04:45:49 localhost systemd[1]: Started Session 61 of User zuul. 
Nov 23 04:45:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63392 DF PROTO=TCP SPT=55650 DPT=9102 SEQ=673330066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE5DA00000000001030307) Nov 23 04:45:49 localhost python3[285866]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:45:49 localhost subscription-manager[285867]: Unregistered machine with identity: 805f8986-28c6-49b9-9c1d-56700caa6ca2 Nov 23 04:45:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44467 DF PROTO=TCP SPT=42310 DPT=9102 SEQ=3190978265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE62200000000001030307) Nov 23 04:45:53 localhost nova_compute[281952]: 2025-11-23 09:45:53.320 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:53 localhost nova_compute[281952]: 2025-11-23 09:45:53.323 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:53 localhost nova_compute[281952]: 2025-11-23 09:45:53.323 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:45:53 localhost nova_compute[281952]: 2025-11-23 09:45:53.324 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:53 localhost nova_compute[281952]: 2025-11-23 09:45:53.359 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:53 localhost nova_compute[281952]: 2025-11-23 09:45:53.360 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63393 DF PROTO=TCP SPT=55650 DPT=9102 SEQ=673330066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE6D600000000001030307) Nov 23 04:45:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:45:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:45:54 localhost podman[285870]: 2025-11-23 09:45:54.040405871 +0000 UTC m=+0.091243167 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.component=ubi9-minimal-container, architecture=x86_64) Nov 23 04:45:54 localhost podman[285870]: 2025-11-23 09:45:54.051942028 +0000 UTC m=+0.102779364 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 04:45:54 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:45:54 localhost podman[285869]: 2025-11-23 09:45:54.017105719 +0000 UTC m=+0.072028681 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Nov 23 04:45:54 localhost podman[285869]: 2025-11-23 09:45:54.09785327 +0000 UTC m=+0.152776252 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 04:45:54 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:45:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:45:56 localhost podman[285912]: 2025-11-23 09:45:56.03576616 +0000 UTC m=+0.087058728 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent) Nov 23 04:45:56 localhost podman[285912]: 2025-11-23 09:45:56.041436575 +0000 UTC 
m=+0.092729203 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:45:56 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:45:58 localhost nova_compute[281952]: 2025-11-23 09:45:58.361 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:58 localhost nova_compute[281952]: 2025-11-23 09:45:58.362 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:45:58 localhost nova_compute[281952]: 2025-11-23 09:45:58.363 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:45:58 localhost nova_compute[281952]: 2025-11-23 09:45:58.363 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:58 localhost nova_compute[281952]: 2025-11-23 09:45:58.394 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:45:58 localhost nova_compute[281952]: 2025-11-23 09:45:58.395 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:45:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:45:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:45:59 localhost openstack_network_exporter[242668]: ERROR 09:45:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:45:59 localhost openstack_network_exporter[242668]: ERROR 09:45:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:45:59 localhost openstack_network_exporter[242668]: ERROR 09:45:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:45:59 localhost openstack_network_exporter[242668]: ERROR 09:45:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:45:59 localhost openstack_network_exporter[242668]: Nov 23 04:45:59 localhost openstack_network_exporter[242668]: ERROR 09:45:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:45:59 localhost openstack_network_exporter[242668]: Nov 23 04:46:00 localhost podman[285930]: 2025-11-23 09:46:00.027884393 +0000 UTC m=+0.083616351 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 23 04:46:00 localhost systemd[1]: tmp-crun.1t92JS.mount: Deactivated successfully. Nov 23 04:46:00 localhost podman[285931]: 2025-11-23 09:46:00.0487949 +0000 UTC m=+0.099843003 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 
'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:46:00 localhost podman[285931]: 2025-11-23 09:46:00.083948529 +0000 UTC m=+0.134996712 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 04:46:00 localhost systemd[1]: 
a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 04:46:00 localhost podman[285930]: 2025-11-23 09:46:00.100563474 +0000 UTC m=+0.156295462 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3) Nov 23 04:46:00 localhost systemd[1]: 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:46:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63394 DF PROTO=TCP SPT=55650 DPT=9102 SEQ=673330066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DE8E210000000001030307) Nov 23 04:46:02 localhost systemd[1]: virtsecretd.service: Deactivated successfully. Nov 23 04:46:03 localhost nova_compute[281952]: 2025-11-23 09:46:03.396 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:46:03 localhost nova_compute[281952]: 2025-11-23 09:46:03.397 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:46:03 localhost nova_compute[281952]: 2025-11-23 09:46:03.398 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:46:03 localhost nova_compute[281952]: 2025-11-23 09:46:03.398 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:03 localhost nova_compute[281952]: 2025-11-23 09:46:03.422 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:46:03 localhost nova_compute[281952]: 2025-11-23 09:46:03.423 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:04 localhost nova_compute[281952]: 2025-11-23 09:46:04.815 281956 DEBUG oslo_service.periodic_task [None 
req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:46:04 localhost nova_compute[281952]: 2025-11-23 09:46:04.816 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:46:04 localhost nova_compute[281952]: 2025-11-23 09:46:04.817 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:46:04 localhost nova_compute[281952]: 2025-11-23 09:46:04.817 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:46:04 localhost nova_compute[281952]: 2025-11-23 09:46:04.961 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:46:04 localhost nova_compute[281952]: 2025-11-23 09:46:04.961 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:46:04 localhost nova_compute[281952]: 2025-11-23 09:46:04.962 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] 
Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:46:04 localhost nova_compute[281952]: 2025-11-23 09:46:04.962 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.464 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 
09:46:05.477 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.477 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.478 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.479 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.479 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.480 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.480 
281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.481 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.481 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.482 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.502 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.502 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:46:05 localhost 
nova_compute[281952]: 2025-11-23 09:46:05.503 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.503 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.504 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:46:05 localhost nova_compute[281952]: 2025-11-23 09:46:05.991 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 2025-11-23 09:46:06.054 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 2025-11-23 09:46:06.055 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it 
does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 2025-11-23 09:46:06.277 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 2025-11-23 09:46:06.279 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12291MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 2025-11-23 09:46:06.279 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 2025-11-23 09:46:06.280 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 2025-11-23 09:46:06.376 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 2025-11-23 09:46:06.376 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 2025-11-23 09:46:06.377 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 2025-11-23 09:46:06.476 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 2025-11-23 09:46:06.942 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 2025-11-23 09:46:06.950 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 
2025-11-23 09:46:06.972 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 2025-11-23 09:46:06.975 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:46:06 localhost nova_compute[281952]: 2025-11-23 09:46:06.975 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:46:08 localhost nova_compute[281952]: 2025-11-23 09:46:08.424 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:46:08 localhost nova_compute[281952]: 2025-11-23 09:46:08.425 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:46:08 localhost nova_compute[281952]: 2025-11-23 09:46:08.426 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending 
inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:46:08 localhost nova_compute[281952]: 2025-11-23 09:46:08.426 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:08 localhost nova_compute[281952]: 2025-11-23 09:46:08.459 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:46:08 localhost nova_compute[281952]: 2025-11-23 09:46:08.460 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:46:09.281 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:46:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:46:09.282 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:46:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:46:09.283 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:46:11 localhost podman[240668]: time="2025-11-23T09:46:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:46:11 localhost podman[240668]: @ - - [23/Nov/2025:09:46:11 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1" Nov 23 04:46:11 localhost podman[240668]: @ - - [23/Nov/2025:09:46:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17237 "" "Go-http-client/1.1" Nov 23 04:46:13 localhost nova_compute[281952]: 2025-11-23 09:46:13.459 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:46:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:46:14 localhost podman[286019]: 2025-11-23 09:46:14.029690901 +0000 UTC m=+0.086335725 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 04:46:14 localhost podman[286019]: 2025-11-23 09:46:14.041411254 +0000 UTC m=+0.098056108 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:46:14 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:46:16 localhost sshd[286038]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:46:16 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18246 DF PROTO=TCP SPT=40056 DPT=9102 SEQ=95169938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DEC6DB0000000001030307) Nov 23 04:46:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18247 DF PROTO=TCP SPT=40056 DPT=9102 SEQ=95169938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DECAE00000000001030307) Nov 23 04:46:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63395 DF PROTO=TCP SPT=55650 DPT=9102 SEQ=673330066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DECE200000000001030307) Nov 23 04:46:18 localhost nova_compute[281952]: 2025-11-23 09:46:18.461 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 
04:46:18 localhost nova_compute[281952]: 2025-11-23 09:46:18.505 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:46:18 localhost nova_compute[281952]: 2025-11-23 09:46:18.506 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5045 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:46:18 localhost nova_compute[281952]: 2025-11-23 09:46:18.506 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:18 localhost nova_compute[281952]: 2025-11-23 09:46:18.507 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:18 localhost nova_compute[281952]: 2025-11-23 09:46:18.510 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:46:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:46:19 localhost systemd[1]: tmp-crun.YPLUEv.mount: Deactivated successfully. 
Nov 23 04:46:19 localhost podman[286057]: 2025-11-23 09:46:19.275947545 +0000 UTC m=+0.082788700 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:46:19 localhost podman[286057]: 2025-11-23 09:46:19.310550407 +0000 UTC m=+0.117391572 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:46:19 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:46:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18248 DF PROTO=TCP SPT=40056 DPT=9102 SEQ=95169938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DED2E00000000001030307) Nov 23 04:46:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59934 DF PROTO=TCP SPT=32880 DPT=9102 SEQ=1257914380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DED6200000000001030307) Nov 23 04:46:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:60:fc:17 MACDST=fa:16:3e:15:fb:34 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18249 DF PROTO=TCP SPT=40056 DPT=9102 SEQ=95169938 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AD2DEE2A00000000001030307) Nov 23 04:46:23 localhost nova_compute[281952]: 2025-11-23 09:46:23.511 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:46:23 localhost nova_compute[281952]: 2025-11-23 09:46:23.513 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:46:23 localhost nova_compute[281952]: 2025-11-23 09:46:23.514 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:46:23 localhost nova_compute[281952]: 2025-11-23 
09:46:23.514 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:23 localhost nova_compute[281952]: 2025-11-23 09:46:23.550 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:46:23 localhost nova_compute[281952]: 2025-11-23 09:46:23.551 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:46:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:46:25 localhost systemd[1]: tmp-crun.9JYCKb.mount: Deactivated successfully. Nov 23 04:46:25 localhost podman[286175]: 2025-11-23 09:46:25.05576343 +0000 UTC m=+0.106566594 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:46:25 localhost podman[286176]: 2025-11-23 09:46:25.022671765 +0000 UTC m=+0.074328845 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git) Nov 23 04:46:25 localhost podman[286176]: 2025-11-23 09:46:25.117423288 +0000 UTC m=+0.169080388 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, 
url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.) 
Nov 23 04:46:25 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:46:25 localhost podman[286175]: 2025-11-23 09:46:25.157374057 +0000 UTC m=+0.208177211 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 23 04:46:25 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:46:25 localhost sshd[286219]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:46:25 localhost systemd[1]: Created slice User Slice of UID 1003. Nov 23 04:46:25 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... 
Nov 23 04:46:25 localhost systemd-logind[761]: New session 62 of user tripleo-admin. Nov 23 04:46:25 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Nov 23 04:46:25 localhost systemd[1]: Starting User Manager for UID 1003... Nov 23 04:46:26 localhost systemd[286223]: Queued start job for default target Main User Target. Nov 23 04:46:26 localhost systemd[286223]: Created slice User Application Slice. Nov 23 04:46:26 localhost systemd[286223]: Started Mark boot as successful after the user session has run 2 minutes. Nov 23 04:46:26 localhost systemd-journald[48157]: Field hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. Nov 23 04:46:26 localhost systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 23 04:46:26 localhost systemd[286223]: Started Daily Cleanup of User's Temporary Directories. Nov 23 04:46:26 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 04:46:26 localhost systemd[286223]: Reached target Paths. Nov 23 04:46:26 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 04:46:26 localhost systemd[286223]: Reached target Timers. Nov 23 04:46:26 localhost systemd[286223]: Starting D-Bus User Message Bus Socket... Nov 23 04:46:26 localhost systemd[286223]: Starting Create User's Volatile Files and Directories... Nov 23 04:46:26 localhost systemd[286223]: Listening on D-Bus User Message Bus Socket. Nov 23 04:46:26 localhost systemd[286223]: Finished Create User's Volatile Files and Directories. Nov 23 04:46:26 localhost systemd[286223]: Reached target Sockets. Nov 23 04:46:26 localhost systemd[286223]: Reached target Basic System. 
Nov 23 04:46:26 localhost systemd[286223]: Reached target Main User Target. Nov 23 04:46:26 localhost systemd[286223]: Startup finished in 164ms. Nov 23 04:46:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:46:26 localhost systemd[1]: Started User Manager for UID 1003. Nov 23 04:46:26 localhost systemd[1]: Started Session 62 of User tripleo-admin. Nov 23 04:46:26 localhost rsyslogd[760]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 04:46:26 localhost podman[286239]: 2025-11-23 09:46:26.28218024 +0000 UTC m=+0.076050569 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Nov 23 04:46:26 localhost podman[286239]: 2025-11-23 09:46:26.31450501 +0000 UTC m=+0.108375309 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 04:46:26 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:46:26 localhost python3[286384]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)#012add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"#012# 100 ceph_dashboard (8443)#012add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"#012# 100 ceph_grafana (3100)#012add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"#012# 100 ceph_prometheus (9092)#012add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"#012# 100 ceph_rgw (8080)#012add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"#012# 110 ceph_mon (6789, 3300, 9100)#012add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"#012# 112 ceph_mds (6800-7300, 9100)#012add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"#012# 113 ceph_mgr (6800-7300, 8444)#012add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"#012# 120 ceph_nfs (2049, 12049)#012add rule inet filter 
EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"#012# 123 ceph_dashboard (9090, 9094, 9283)#012add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"#012 insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 04:46:27 localhost python3[286528]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 04:46:27 localhost systemd[1]: Stopping Netfilter Tables...
Nov 23 04:46:27 localhost systemd[1]: nftables.service: Deactivated successfully.
Nov 23 04:46:27 localhost systemd[1]: Stopped Netfilter Tables.
Nov 23 04:46:27 localhost systemd[1]: Starting Netfilter Tables...
Nov 23 04:46:28 localhost systemd[1]: Finished Netfilter Tables.
Nov 23 04:46:28 localhost nova_compute[281952]: 2025-11-23 09:46:28.552 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:46:28 localhost nova_compute[281952]: 2025-11-23 09:46:28.554 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:46:28 localhost nova_compute[281952]: 2025-11-23 09:46:28.554 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 23 04:46:28 localhost nova_compute[281952]: 2025-11-23 09:46:28.554 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:46:28 localhost nova_compute[281952]: 2025-11-23 09:46:28.597 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:46:28 localhost nova_compute[281952]: 2025-11-23 09:46:28.598 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:46:29 localhost openstack_network_exporter[242668]: ERROR 09:46:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:46:29 localhost openstack_network_exporter[242668]: ERROR 09:46:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:46:29 localhost openstack_network_exporter[242668]: ERROR 09:46:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 04:46:29 localhost openstack_network_exporter[242668]: ERROR 09:46:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify
an existing datapath Nov 23 04:46:29 localhost openstack_network_exporter[242668]: Nov 23 04:46:29 localhost openstack_network_exporter[242668]: ERROR 09:46:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:46:29 localhost openstack_network_exporter[242668]: Nov 23 04:46:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:46:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:46:31 localhost systemd[1]: tmp-crun.T3WxNd.mount: Deactivated successfully. Nov 23 04:46:31 localhost podman[286553]: 2025-11-23 09:46:31.024595925 +0000 UTC m=+0.071019202 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:46:31 localhost systemd[1]: tmp-crun.dMw8XN.mount: Deactivated successfully. Nov 23 04:46:31 localhost podman[286553]: 2025-11-23 09:46:31.042292889 +0000 UTC m=+0.088716166 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 23 04:46:31 localhost podman[286554]: 2025-11-23 09:46:31.045727356 +0000 UTC m=+0.087769296 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, 
maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:46:31 localhost podman[286554]: 2025-11-23 09:46:31.057327468 +0000 UTC m=+0.099369408 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:46:31 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 04:46:31 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:46:33 localhost nova_compute[281952]: 2025-11-23 09:46:33.628 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:46:33 localhost nova_compute[281952]: 2025-11-23 09:46:33.631 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:46:33 localhost nova_compute[281952]: 2025-11-23 09:46:33.631 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 23 04:46:33 localhost nova_compute[281952]: 2025-11-23 09:46:33.632 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:46:33 localhost nova_compute[281952]: 2025-11-23 09:46:33.632 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:46:33 localhost nova_compute[281952]: 2025-11-23 09:46:33.636 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:46:38 localhost nova_compute[281952]: 2025-11-23 09:46:38.668 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:46:38 localhost nova_compute[281952]: 2025-11-23 09:46:38.670 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:46:38 localhost nova_compute[281952]: 2025-11-23 09:46:38.670 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:46:38 localhost nova_compute[281952]: 2025-11-23 09:46:38.671 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:38 localhost nova_compute[281952]: 2025-11-23 09:46:38.671 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:38 localhost nova_compute[281952]: 2025-11-23 09:46:38.674 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:46:41 localhost podman[240668]: time="2025-11-23T09:46:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:46:41 localhost podman[240668]: @ - - [23/Nov/2025:09:46:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1" Nov 23 04:46:41 localhost podman[240668]: @ - - [23/Nov/2025:09:46:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17239 "" "Go-http-client/1.1" Nov 23 04:46:43 localhost nova_compute[281952]: 2025-11-23 09:46:43.674 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:46:43 localhost podman[286850]: Nov 23 04:46:43 localhost podman[286850]: 2025-11-23 09:46:43.968163434 +0000 UTC m=+0.075306006 container create ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cartwright, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=7, io.openshift.expose-services=, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, io.buildah.version=1.33.12, RELEASE=main, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, release=553) Nov 23 04:46:44 localhost systemd[1]: Started libpod-conmon-ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8.scope. Nov 23 04:46:44 localhost systemd[1]: Started libcrun container. 
Nov 23 04:46:44 localhost podman[286850]: 2025-11-23 09:46:43.9344661 +0000 UTC m=+0.041608712 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:46:44 localhost podman[286850]: 2025-11-23 09:46:44.046035249 +0000 UTC m=+0.153177781 container init ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cartwright, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.buildah.version=1.33.12, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, CEPH_POINT_RELEASE=, GIT_CLEAN=True, RELEASE=main, ceph=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph) Nov 23 04:46:44 localhost systemd[1]: tmp-crun.ByAWGl.mount: Deactivated successfully. 
Nov 23 04:46:44 localhost podman[286850]: 2025-11-23 09:46:44.059693756 +0000 UTC m=+0.166836328 container start ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cartwright, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 04:46:44 localhost podman[286850]: 2025-11-23 09:46:44.060089488 +0000 UTC m=+0.167232050 container attach ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cartwright, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.buildah.version=1.33.12, vcs-type=git, maintainer=Guillaume Abrioux , distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, version=7, 
CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, ceph=True, io.openshift.expose-services=, release=553, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7) Nov 23 04:46:44 localhost nice_cartwright[286865]: 167 167 Nov 23 04:46:44 localhost podman[286850]: 2025-11-23 09:46:44.064165395 +0000 UTC m=+0.171308007 container died ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cartwright, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, version=7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, RELEASE=main, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 23 04:46:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:46:44 localhost systemd[1]: libpod-ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8.scope: Deactivated successfully. Nov 23 04:46:44 localhost podman[286870]: 2025-11-23 09:46:44.13916087 +0000 UTC m=+0.066183860 container remove ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_cartwright, architecture=x86_64, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.buildah.version=1.33.12, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=) Nov 23 04:46:44 localhost systemd[1]: libpod-conmon-ad7fa40cc96f79e8778f0b1ff9e403b66c9b89bbcd13ab9548fa1400c33b7fb8.scope: Deactivated successfully. Nov 23 04:46:44 localhost systemd[1]: Reloading. 
Nov 23 04:46:44 localhost podman[286876]: 2025-11-23 09:46:44.228187545 +0000 UTC m=+0.138404870 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 23 04:46:44 localhost podman[286876]: 2025-11-23 09:46:44.24020417 +0000 UTC m=+0.150421455 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm) Nov 23 04:46:44 localhost systemd-rc-local-generator[286927]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:46:44 localhost systemd-sysv-generator[286933]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: var-lib-containers-storage-overlay-cc1e8b49fe197506885064802507ef8d24cd2ade0c0b1ad450dc6cb2383b9ee8-merged.mount: Deactivated successfully.
Nov 23 04:46:44 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 04:46:44 localhost systemd[1]: Reloading.
Nov 23 04:46:44 localhost systemd-sysv-generator[286975]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file.
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 04:46:44 localhost systemd-rc-local-generator[286972]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:46:44 localhost systemd[1]: Starting Ceph mds.mds.np0005532585.jcltnl for 46550e70-79cb-5f55-bf6d-1204b97e083b...
Nov 23 04:46:45 localhost podman[287034]: Nov 23 04:46:45 localhost podman[287034]: 2025-11-23 09:46:45.33773309 +0000 UTC m=+0.067965296 container create 5e5e70ecae20bc54331125f31c9a74b5166174847d74c83296a04ef3f64fa3d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mds-mds-np0005532585-jcltnl, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main) Nov 23 04:46:45 localhost systemd[1]: tmp-crun.rmcvku.mount: Deactivated successfully. 
Nov 23 04:46:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1f8e37fb7f8de7d9d21e9d243473907da004c7d2a8b0718674dd10f852820e2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 04:46:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1f8e37fb7f8de7d9d21e9d243473907da004c7d2a8b0718674dd10f852820e2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 23 04:46:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1f8e37fb7f8de7d9d21e9d243473907da004c7d2a8b0718674dd10f852820e2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 23 04:46:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1f8e37fb7f8de7d9d21e9d243473907da004c7d2a8b0718674dd10f852820e2/merged/var/lib/ceph/mds/ceph-mds.np0005532585.jcltnl supports timestamps until 2038 (0x7fffffff) Nov 23 04:46:45 localhost podman[287034]: 2025-11-23 09:46:45.401729351 +0000 UTC m=+0.131961567 container init 5e5e70ecae20bc54331125f31c9a74b5166174847d74c83296a04ef3f64fa3d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mds-mds-np0005532585-jcltnl, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, name=rhceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_BRANCH=main, ceph=True, RELEASE=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 23 04:46:45 localhost podman[287034]: 2025-11-23 09:46:45.306447282 +0000 UTC m=+0.036679518 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:46:45 localhost podman[287034]: 2025-11-23 09:46:45.411188167 +0000 UTC m=+0.141420383 container start 5e5e70ecae20bc54331125f31c9a74b5166174847d74c83296a04ef3f64fa3d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mds-mds-np0005532585-jcltnl, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, ceph=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:46:45 localhost bash[287034]: 5e5e70ecae20bc54331125f31c9a74b5166174847d74c83296a04ef3f64fa3d7 Nov 23 04:46:45 localhost systemd[1]: 
Started Ceph mds.mds.np0005532585.jcltnl for 46550e70-79cb-5f55-bf6d-1204b97e083b. Nov 23 04:46:45 localhost ceph-mds[287052]: set uid:gid to 167:167 (ceph:ceph) Nov 23 04:46:45 localhost ceph-mds[287052]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mds, pid 2 Nov 23 04:46:45 localhost ceph-mds[287052]: main not setting numa affinity Nov 23 04:46:45 localhost ceph-mds[287052]: pidfile_write: ignore empty --pid-file Nov 23 04:46:45 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mds-mds-np0005532585-jcltnl[287048]: starting mds.mds.np0005532585.jcltnl at Nov 23 04:46:45 localhost ceph-mds[287052]: mds.mds.np0005532585.jcltnl Updating MDS map to version 7 from mon.0 Nov 23 04:46:46 localhost ceph-mds[287052]: mds.mds.np0005532585.jcltnl Updating MDS map to version 8 from mon.0 Nov 23 04:46:46 localhost ceph-mds[287052]: mds.mds.np0005532585.jcltnl Monitors have assigned me to become a standby. Nov 23 04:46:48 localhost nova_compute[281952]: 2025-11-23 09:46:48.677 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:46:48 localhost nova_compute[281952]: 2025-11-23 09:46:48.679 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:46:48 localhost nova_compute[281952]: 2025-11-23 09:46:48.679 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:46:48 localhost nova_compute[281952]: 2025-11-23 09:46:48.679 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:48 localhost nova_compute[281952]: 2025-11-23 09:46:48.729 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:46:48 localhost nova_compute[281952]: 2025-11-23 09:46:48.730 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:49 localhost systemd[1]: tmp-crun.NCQ8yZ.mount: Deactivated successfully. Nov 23 04:46:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:46:49 localhost podman[287199]: 2025-11-23 09:46:49.348007312 +0000 UTC m=+0.109781113 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, ceph=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, version=7, architecture=x86_64, name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 04:46:49 localhost systemd[1]: tmp-crun.wiB9nM.mount: Deactivated successfully. 
Nov 23 04:46:49 localhost podman[287217]: 2025-11-23 09:46:49.443152047 +0000 UTC m=+0.086772024 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:46:49 localhost podman[287199]: 2025-11-23 09:46:49.47904271 +0000 UTC m=+0.240816491 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, RELEASE=main, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , ceph=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph) Nov 23 04:46:49 localhost podman[287217]: 2025-11-23 09:46:49.530381026 +0000 UTC m=+0.174001103 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:46:49 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:46:50 localhost systemd[1]: session-61.scope: Deactivated successfully. Nov 23 04:46:50 localhost systemd-logind[761]: Session 61 logged out. Waiting for processes to exit. Nov 23 04:46:50 localhost systemd-logind[761]: Removed session 61. 
Nov 23 04:46:53 localhost nova_compute[281952]: 2025-11-23 09:46:53.731 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4988-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:46:53 localhost nova_compute[281952]: 2025-11-23 09:46:53.733 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:46:53 localhost nova_compute[281952]: 2025-11-23 09:46:53.734 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:46:53 localhost nova_compute[281952]: 2025-11-23 09:46:53.734 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:53 localhost nova_compute[281952]: 2025-11-23 09:46:53.758 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:46:53 localhost nova_compute[281952]: 2025-11-23 09:46:53.758 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:46:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 04:46:56 localhost podman[287343]: 2025-11-23 09:46:56.05345408 +0000 UTC m=+0.089867491 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64) Nov 23 04:46:56 localhost systemd[1]: tmp-crun.kish34.mount: Deactivated successfully. 
Nov 23 04:46:56 localhost podman[287343]: 2025-11-23 09:46:56.099360795 +0000 UTC m=+0.135774196 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350) Nov 23 04:46:56 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:46:56 localhost podman[287342]: 2025-11-23 09:46:56.109468862 +0000 UTC m=+0.146231684 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:46:56 localhost podman[287342]: 2025-11-23 09:46:56.195523403 +0000 UTC m=+0.232286195 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 04:46:56 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:46:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:46:57 localhost podman[287388]: 2025-11-23 09:46:57.025583849 +0000 UTC m=+0.082243772 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 23 04:46:57 localhost podman[287388]: 2025-11-23 09:46:57.057224818 +0000 UTC 
m=+0.113884671 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0) Nov 23 04:46:57 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:46:58 localhost nova_compute[281952]: 2025-11-23 09:46:58.759 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:46:58 localhost nova_compute[281952]: 2025-11-23 09:46:58.761 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:46:58 localhost nova_compute[281952]: 2025-11-23 09:46:58.761 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:46:58 localhost nova_compute[281952]: 2025-11-23 09:46:58.761 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:58 localhost nova_compute[281952]: 2025-11-23 09:46:58.783 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:46:58 localhost nova_compute[281952]: 2025-11-23 09:46:58.783 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:46:59 localhost openstack_network_exporter[242668]: ERROR 09:46:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:46:59 localhost openstack_network_exporter[242668]: ERROR 09:46:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:46:59 localhost openstack_network_exporter[242668]: ERROR 09:46:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:46:59 localhost openstack_network_exporter[242668]: ERROR 09:46:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify 
an existing datapath Nov 23 04:46:59 localhost openstack_network_exporter[242668]: Nov 23 04:46:59 localhost openstack_network_exporter[242668]: ERROR 09:46:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:46:59 localhost openstack_network_exporter[242668]: Nov 23 04:47:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:47:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:47:02 localhost podman[287407]: 2025-11-23 09:47:02.025665292 +0000 UTC m=+0.077153874 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:47:02 localhost podman[287407]: 2025-11-23 09:47:02.031482814 +0000 UTC m=+0.082971346 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 04:47:02 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:47:02 localhost podman[287406]: 2025-11-23 09:47:02.072831877 +0000 UTC m=+0.126910520 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 04:47:02 localhost podman[287406]: 2025-11-23 09:47:02.082734286 +0000 UTC m=+0.136812919 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 23 04:47:02 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:47:03 localhost nova_compute[281952]: 2025-11-23 09:47:03.784 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:03 localhost nova_compute[281952]: 2025-11-23 09:47:03.787 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:03 localhost nova_compute[281952]: 2025-11-23 09:47:03.787 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:47:03 localhost nova_compute[281952]: 2025-11-23 09:47:03.787 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:03 localhost nova_compute[281952]: 2025-11-23 09:47:03.825 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:47:03 localhost nova_compute[281952]: 2025-11-23 09:47:03.826 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:04 localhost nova_compute[281952]: 2025-11-23 09:47:04.368 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:47:04 localhost nova_compute[281952]: 2025-11-23 09:47:04.369 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 
23 04:47:04 localhost nova_compute[281952]: 2025-11-23 09:47:04.389 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:47:04 localhost nova_compute[281952]: 2025-11-23 09:47:04.390 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:47:04 localhost nova_compute[281952]: 2025-11-23 09:47:04.390 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:47:04 localhost nova_compute[281952]: 2025-11-23 09:47:04.949 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:47:04 localhost nova_compute[281952]: 2025-11-23 09:47:04.950 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:47:04 localhost nova_compute[281952]: 2025-11-23 09:47:04.950 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:47:04 localhost nova_compute[281952]: 2025-11-23 09:47:04.950 
281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.290 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.305 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.305 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.306 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.306 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.306 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.307 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.307 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.308 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.308 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.308 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.322 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.322 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.322 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.323 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.324 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.758 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.864 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:47:05 localhost nova_compute[281952]: 2025-11-23 09:47:05.865 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:47:06 localhost nova_compute[281952]: 
2025-11-23 09:47:06.096 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:47:06 localhost nova_compute[281952]: 2025-11-23 09:47:06.098 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12258MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", 
"product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:47:06 localhost nova_compute[281952]: 2025-11-23 09:47:06.098 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:47:06 localhost nova_compute[281952]: 2025-11-23 09:47:06.099 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:47:06 localhost nova_compute[281952]: 2025-11-23 09:47:06.177 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:47:06 localhost nova_compute[281952]: 2025-11-23 09:47:06.177 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:47:06 localhost nova_compute[281952]: 2025-11-23 09:47:06.178 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:47:06 localhost nova_compute[281952]: 2025-11-23 09:47:06.242 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:47:06 localhost nova_compute[281952]: 2025-11-23 09:47:06.730 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:47:06 localhost nova_compute[281952]: 2025-11-23 09:47:06.737 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:47:06 localhost nova_compute[281952]: 
2025-11-23 09:47:06.755 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:47:06 localhost nova_compute[281952]: 2025-11-23 09:47:06.758 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:47:06 localhost nova_compute[281952]: 2025-11-23 09:47:06.759 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:47:08 localhost nova_compute[281952]: 2025-11-23 09:47:08.827 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:08 localhost nova_compute[281952]: 2025-11-23 09:47:08.829 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:08 localhost nova_compute[281952]: 2025-11-23 09:47:08.829 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending 
inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:47:08 localhost nova_compute[281952]: 2025-11-23 09:47:08.829 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:08 localhost nova_compute[281952]: 2025-11-23 09:47:08.856 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:47:08 localhost nova_compute[281952]: 2025-11-23 09:47:08.857 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:47:09.283 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:47:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:47:09.283 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:47:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:47:09.284 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.806 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': 
'8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.807 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.807 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.811 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'dc4d7f9e-60dd-4572-9c5a-753e98c9e72d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.807997', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61dad960-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': '153046d01261987070cb728de764f01b2d08d01e23aed81696329a8e0bbd0fd6'}]}, 'timestamp': '2025-11-23 09:47:10.812682', '_unique_id': 'a1f97662b7aa44e4a6294ead547a12e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.814 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.815 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.842 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.843 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd68c03f8-7bf9-40a3-9bc3-0c9d55c02eb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.815637', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61df8e56-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '86c673bb84eafca19b8c0c032e01cea23a8507061039a59adff0f819903e9de5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.815637', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61dfa3f0-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '57c2f1bf6e8d5178b5dcda3e842a3c4707cc9956c7f378063cb264f1ca8d7d2c'}]}, 'timestamp': '2025-11-23 09:47:10.844054', '_unique_id': 'd85a36c13dad44d3844868b06fbf8190'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.845 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.846 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.846 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4260166b-c13e-4ff0-ad72-186a3222914a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.846480', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61e0170e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': '8b24e470b00749410d28c51bc2fd35e5a77ea0d1d78c02332ae834c152efeee2'}]}, 'timestamp': '2025-11-23 09:47:10.847019', '_unique_id': '297337742640457783612de1a01fd73c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.847 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.849 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.860 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.860 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99d96acc-d82c-46e6-ba19-6c5765aac249', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.849156', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61e23232-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.026796897, 'message_signature': '98ab590cf7d23d3c211d4d71cdb79a59032c87a278cf77dc519d566a978119dd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.849156', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d',
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61e2448e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.026796897, 'message_signature': 'ca49b57a14b6052aa752b472040bdcea24790ddaf920b2a5aa6cb47b7472d74f'}]}, 'timestamp': '2025-11-23 09:47:10.861225', '_unique_id': '496b329fc17c468c90541f160d3e613a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:47:10.862 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:47:10.862 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.862 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.863 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a5d72883-6e78-40ba-a9e2-491abb0b11eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:47:10.863951', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '61e53cac-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.057662842, 'message_signature': '679f239ea21f233a91e8f91079a954c3a8efcd9bc29c9fa97ce5033f1be8bd6e'}]}, 'timestamp': '2025-11-23 09:47:10.880720', '_unique_id': '95cf2a96ebf846fe9f2cc8ca4d2f4bbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 
04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f761a3e-7a1e-444c-b13c-d322fb5792da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.883108', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61e5ac8c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.026796897, 'message_signature': '4342d2a4c0c16110ffcfee919457f1c7476c0023adf47467f65eb338ba771bae'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.883108', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61e5c082-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.026796897, 'message_signature': '9710c1209b349c199dfd40a365e90fe1f862d191f9164fd333ffd83d292c078c'}]}, 'timestamp': '2025-11-23 09:47:10.884091', '_unique_id': 'e90304783c6b4b4e82e72d6934d3e141'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:47:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.885 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.886 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.read.latency in the context of pollsters Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8491bf1b-8cd4-4caa-ae89-4f0c7689a8aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.886226', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': 
None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61e62626-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '5e13bf1227d8f4b6024cc4283bfde79f30f72b27b5e7b702cb71fd55d8a575e2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.886226', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61e63878-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': 'd167241da8f7dc2b28dfac293c247484216085f74223e34a831fcb550cf29454'}]}, 'timestamp': '2025-11-23 09:47:10.887130', '_unique_id': 'e300482d2985481a8b3d4fb50b8b594c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:47:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.888 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.889 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.889 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10a0242c-cbd2-4d89-88af-8c0445e63993', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.889228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61e69b4c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '254638100d0a020764115ed6e8702dabaaaed5b9a6d51dca9fd2df8725866a13'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.889228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61e6aace-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '8558273ff6765667f5365165c6d31073de4eceb1d93cb2939bdc2d9127f5fb83'}]}, 'timestamp': '2025-11-23 09:47:10.890084', '_unique_id': '22e0c66dcbe0423aac806ee5319027ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.891 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.892 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.892 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5a1afc9-ede7-4b79-b897-96be4e2501e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.892206', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61e70ffa-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': '4d7cb9dcb96cd45a6b733483b30b50d13b02d6b11ee06bf98aef5c5b187f9fa7'}]}, 'timestamp': '2025-11-23 09:47:10.892844', '_unique_id': '44796cc61ebd4525b2c41eca2d75adf0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.893 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.894 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 12260000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26c6ac29-3994-494e-91d6-2edad82bd220', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12260000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:47:10.895054', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '61e77eb8-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.057662842, 'message_signature': '145086982d46d066ff050cf7adcf304a4cfc3fe0515f338e7e49f78f01f48c3d'}]}, 'timestamp': '2025-11-23 09:47:10.895489', '_unique_id': '67acf687a9ea466ca0e2a11c9ba75ece'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.898 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b691d083-03b5-4bd3-8450-0e1cb7cb8e6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.898020', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61e7f4b0-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': '9fbc1246c075b5485fe803ea5ef9e0c1c1e9aacba2b9f33c7ef98d42d6de2b35'}]}, 'timestamp': '2025-11-23 09:47:10.898527', '_unique_id': '72b7cc3f93e248efa9e217656aa9e000'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:47:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.899 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.900 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.901 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '12c70c5b-7ac4-48c3-93dd-fb65168546db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.900614', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61e85806-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.026796897, 'message_signature': 'c3dab2fec701e0986424bbb1365c6a5e31b9ece3ff7789f7730a5407feabbe47'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.900614', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61e86918-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11174.026796897, 'message_signature': '1c8f01564471c88c7f5bd35e487e4e5169a1731334574fb40889c2209cea70e5'}]}, 'timestamp': '2025-11-23 09:47:10.901472', '_unique_id': '851e90c2809746008ee1f56124c8f721'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:47:10.902 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:47:10.902 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.902 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:47:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.903 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.904 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.904 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '678b2ed3-f49f-421f-9eb9-11df3e6a4fe4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.904059', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61e8de7a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '73c7d911c792986671ee21a713529438f4b7162aa3d46ac912c200d7b36fc36e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.904059', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61e8edde-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '9f2637535b122e0ce2aa4fe58eca8286888adab63681ad69102a78074cde263d'}]}, 'timestamp': '2025-11-23 09:47:10.905050', '_unique_id': 'e8064fe8655f40ff8e75d246239575a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.905 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.907 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.907 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd85aca62-69df-4952-8929-56b042b025c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.907158', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61e95c6a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': '64715d82f5f63dbd0982227e8a5d543c38c7eee384b92587ce4dbb3d8cbc1bb1'}]}, 'timestamp': '2025-11-23 09:47:10.907734', '_unique_id': '3d0ccc921c2d4380b8b02d8a36f3104d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.908 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.909 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.910 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7205eb91-c94f-4725-b6d3-0c5dc77ce618', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.909823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61e9c13c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '983be157d49c15fb35bdfcd5f110dc3c342596a0952a4d0436617b204d9a6cd2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.909823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61e9d0c8-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '56876d15094b89e90e8edf99e822e88dfc381879797d7d0cef9843d04b4d77ee'}]}, 'timestamp': '2025-11-23 09:47:10.910704', '_unique_id': 'f850df4dfedc4c63b7c2c35e32cf69e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.911 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9977cf38-6fb9-460f-acfa-61db44026d48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.912764', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61ea33d8-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': '638a41587943ee3ade8da5092f58891d605326204b0a1e1207d8a9833b81320c'}]}, 'timestamp': '2025-11-23 09:47:10.913246', '_unique_id': '39fde533472a439d9b7a192d423454e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:47:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.914 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5598d5ab-f7d6-4320-b6d3-b8d98707ffa2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:47:10.915424', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '61ea9a58-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': 'cd8bd65d1f5eec5668234e95c6b41f298e36000c4d5eae1893d45b04e7c262ee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:47:10.915424', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61eaab2e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.993269448, 'message_signature': '65ff701e6a61a356fd421ab4102193856c9f28671b0a488efa689feb22ee4bd3'}]}, 'timestamp': '2025-11-23 09:47:10.916274', '_unique_id': '6e2e8753b79b4501ab9e34a8ceebc52c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.917 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96b37361-8188-4d96-b083-2460af3555ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.918363', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61eb0d76-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': 'c33359896eeee6a9d2604b6d6c0e4220031aa91f95c48be60580453d2bcc488f'}]}, 'timestamp': '2025-11-23 09:47:10.918849', '_unique_id': '85c1018ac76f4dffb3e48ab22f74a239'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.919 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.920 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f81efef4-12df-4fb0-877c-bc665376d0ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.920654', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61eb635c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': 'c85025e299b41aad205957ec6936dc9c360a7a49678a211924ce7d1f6d830cf4'}]}, 'timestamp': '2025-11-23 09:47:10.920957', '_unique_id': 'd97d4a8b44d94335801114ad6a994f1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.921 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.922 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77a36214-4971-4798-bd82-22ac83e3dd84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.922209', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61eba01a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': 'e0316ca16bf0b71f1673f53800d24348d37e906122223457656d4dcef6c20c52'}]}, 'timestamp': '2025-11-23 09:47:10.922486', '_unique_id': '666bf906010c44eb98a0d86bfa9350c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2eb93ee-b0b4-4243-85be-b6a7385daeed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:47:10.923794', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '61ebdfda-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11173.985624049, 'message_signature': 'bd0306c9dfe5ad580974be76c41725007d39b37c9d773b6c7db1207cbf0968db'}]}, 'timestamp': '2025-11-23 09:47:10.924122', '_unique_id': '24e6e3810e214f8b99bf0907d7fdf285'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:47:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:47:10.924 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:47:11 localhost podman[240668]:
time="2025-11-23T09:47:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:47:11 localhost podman[240668]: @ - - [23/Nov/2025:09:47:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149601 "" "Go-http-client/1.1" Nov 23 04:47:11 localhost podman[240668]: @ - - [23/Nov/2025:09:47:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17717 "" "Go-http-client/1.1" Nov 23 04:47:13 localhost nova_compute[281952]: 2025-11-23 09:47:13.858 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:47:15 localhost podman[287491]: 2025-11-23 09:47:15.026047936 +0000 UTC m=+0.082237373 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute) Nov 23 04:47:15 localhost podman[287491]: 2025-11-23 09:47:15.035231253 +0000 UTC m=+0.091420720 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 23 04:47:15 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:47:18 localhost nova_compute[281952]: 2025-11-23 09:47:18.861 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:18 localhost nova_compute[281952]: 2025-11-23 09:47:18.863 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:18 localhost nova_compute[281952]: 2025-11-23 09:47:18.863 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:47:18 localhost nova_compute[281952]: 2025-11-23 09:47:18.863 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:18 localhost nova_compute[281952]: 2025-11-23 09:47:18.908 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:47:18 localhost nova_compute[281952]: 2025-11-23 09:47:18.908 281956 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:47:20 localhost podman[287510]: 2025-11-23 09:47:20.026000666 +0000 UTC m=+0.078375142 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:47:20 localhost podman[287510]: 2025-11-23 09:47:20.033826551 +0000 UTC m=+0.086200997 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 
'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:47:20 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:47:23 localhost nova_compute[281952]: 2025-11-23 09:47:23.908 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:23 localhost nova_compute[281952]: 2025-11-23 09:47:23.910 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:23 localhost nova_compute[281952]: 2025-11-23 09:47:23.911 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:47:23 localhost nova_compute[281952]: 2025-11-23 09:47:23.911 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:23 localhost nova_compute[281952]: 2025-11-23 09:47:23.944 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:47:23 localhost nova_compute[281952]: 2025-11-23 09:47:23.944 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:47:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:47:26 localhost podman[287552]: 2025-11-23 09:47:26.711267824 +0000 UTC m=+0.074232903 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:47:26 localhost podman[287553]: 2025-11-23 09:47:26.771213158 +0000 UTC m=+0.125372151 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container) Nov 23 04:47:26 localhost podman[287553]: 2025-11-23 09:47:26.783196754 +0000 UTC m=+0.137355707 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 23 04:47:26 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:47:26 localhost podman[287552]: 2025-11-23 09:47:26.824805154 +0000 UTC m=+0.187770233 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:47:26 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:47:27 localhost systemd[1]: session-62.scope: Deactivated successfully. Nov 23 04:47:27 localhost systemd[1]: session-62.scope: Consumed 1.256s CPU time. Nov 23 04:47:27 localhost systemd-logind[761]: Session 62 logged out. Waiting for processes to exit. Nov 23 04:47:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:47:27 localhost systemd-logind[761]: Removed session 62. Nov 23 04:47:27 localhost podman[287597]: 2025-11-23 09:47:27.389776841 +0000 UTC m=+0.085811175 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:47:27 localhost podman[287597]: 2025-11-23 09:47:27.421121511 +0000 UTC m=+0.117155825 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 04:47:27 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:47:28 localhost nova_compute[281952]: 2025-11-23 09:47:28.946 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:28 localhost nova_compute[281952]: 2025-11-23 09:47:28.947 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:28 localhost nova_compute[281952]: 2025-11-23 09:47:28.948 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:47:28 localhost nova_compute[281952]: 2025-11-23 09:47:28.948 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:28 localhost nova_compute[281952]: 2025-11-23 09:47:28.988 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:47:28 localhost nova_compute[281952]: 2025-11-23 09:47:28.989 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:29 localhost openstack_network_exporter[242668]: ERROR 09:47:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db 
server
Nov 23 04:47:29 localhost openstack_network_exporter[242668]: ERROR 09:47:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:47:29 localhost openstack_network_exporter[242668]: ERROR 09:47:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:47:29 localhost openstack_network_exporter[242668]: ERROR 09:47:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 04:47:29 localhost openstack_network_exporter[242668]:
Nov 23 04:47:29 localhost openstack_network_exporter[242668]: ERROR 09:47:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 04:47:29 localhost openstack_network_exporter[242668]:
Nov 23 04:47:31 localhost sshd[287614]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 04:47:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 04:47:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 04:47:32 localhost systemd[1]: tmp-crun.L1jFdl.mount: Deactivated successfully.
Nov 23 04:47:32 localhost podman[287616]: 2025-11-23 09:47:32.306972271 +0000 UTC m=+0.089187550 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd) Nov 23 04:47:32 localhost podman[287616]: 2025-11-23 09:47:32.317762089 +0000 UTC m=+0.099977388 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible) Nov 23 04:47:32 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:47:32 localhost systemd[1]: tmp-crun.V7sbez.mount: Deactivated successfully. 
Nov 23 04:47:32 localhost podman[287617]: 2025-11-23 09:47:32.374846094 +0000 UTC m=+0.152920373 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:47:32 localhost podman[287617]: 2025-11-23 09:47:32.382856504 +0000 UTC m=+0.160930793 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:47:32 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:47:33 localhost nova_compute[281952]: 2025-11-23 09:47:33.989 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:33 localhost nova_compute[281952]: 2025-11-23 09:47:33.991 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:33 localhost nova_compute[281952]: 2025-11-23 09:47:33.992 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:47:33 localhost nova_compute[281952]: 2025-11-23 09:47:33.992 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:34 localhost nova_compute[281952]: 2025-11-23 09:47:34.019 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:47:34 localhost nova_compute[281952]: 2025-11-23 09:47:34.020 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:37 localhost systemd[1]: Stopping User Manager for UID 1003... Nov 23 04:47:37 localhost systemd[286223]: Activating special unit Exit the Session... Nov 23 04:47:37 localhost systemd[286223]: Stopped target Main User Target. Nov 23 04:47:37 localhost systemd[286223]: Stopped target Basic System. Nov 23 04:47:37 localhost systemd[286223]: Stopped target Paths. Nov 23 04:47:37 localhost systemd[286223]: Stopped target Sockets. Nov 23 04:47:37 localhost systemd[286223]: Stopped target Timers. Nov 23 04:47:37 localhost systemd[286223]: Stopped Mark boot as successful after the user session has run 2 minutes. 
Nov 23 04:47:37 localhost systemd[286223]: Stopped Daily Cleanup of User's Temporary Directories. Nov 23 04:47:37 localhost systemd[286223]: Closed D-Bus User Message Bus Socket. Nov 23 04:47:37 localhost systemd[286223]: Stopped Create User's Volatile Files and Directories. Nov 23 04:47:37 localhost systemd[286223]: Removed slice User Application Slice. Nov 23 04:47:37 localhost systemd[286223]: Reached target Shutdown. Nov 23 04:47:37 localhost systemd[286223]: Finished Exit the Session. Nov 23 04:47:37 localhost systemd[286223]: Reached target Exit the Session. Nov 23 04:47:37 localhost systemd[1]: user@1003.service: Deactivated successfully. Nov 23 04:47:37 localhost systemd[1]: Stopped User Manager for UID 1003. Nov 23 04:47:37 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Nov 23 04:47:37 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Nov 23 04:47:37 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Nov 23 04:47:37 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Nov 23 04:47:37 localhost systemd[1]: Removed slice User Slice of UID 1003. Nov 23 04:47:37 localhost systemd[1]: user-1003.slice: Consumed 1.665s CPU time. 
Nov 23 04:47:39 localhost nova_compute[281952]: 2025-11-23 09:47:39.020 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:39 localhost nova_compute[281952]: 2025-11-23 09:47:39.023 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:39 localhost nova_compute[281952]: 2025-11-23 09:47:39.024 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:47:39 localhost nova_compute[281952]: 2025-11-23 09:47:39.024 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:39 localhost nova_compute[281952]: 2025-11-23 09:47:39.051 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:47:39 localhost nova_compute[281952]: 2025-11-23 09:47:39.052 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:41 localhost podman[240668]: time="2025-11-23T09:47:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:47:41 localhost podman[240668]: @ - - [23/Nov/2025:09:47:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149601 "" "Go-http-client/1.1" Nov 23 04:47:41 localhost podman[240668]: @ - - [23/Nov/2025:09:47:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17716 "" "Go-http-client/1.1" Nov 23 04:47:44 localhost nova_compute[281952]: 2025-11-23 09:47:44.053 281956 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:47:46 localhost podman[287781]: 2025-11-23 09:47:46.027634338 +0000 UTC m=+0.078395822 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible) Nov 23 04:47:46 localhost podman[287781]: 2025-11-23 09:47:46.041691808 +0000 UTC m=+0.092453322 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:47:46 localhost 
systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:47:49 localhost nova_compute[281952]: 2025-11-23 09:47:49.055 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:49 localhost nova_compute[281952]: 2025-11-23 09:47:49.057 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:49 localhost nova_compute[281952]: 2025-11-23 09:47:49.058 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:47:49 localhost nova_compute[281952]: 2025-11-23 09:47:49.059 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:49 localhost nova_compute[281952]: 2025-11-23 09:47:49.087 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:47:49 localhost nova_compute[281952]: 2025-11-23 09:47:49.087 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 04:47:51 localhost podman[287799]: 2025-11-23 09:47:51.031600821 +0000 UTC m=+0.081381285 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:47:51 localhost podman[287799]: 2025-11-23 09:47:51.069466846 +0000 UTC m=+0.119247280 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:47:51 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:47:54 localhost nova_compute[281952]: 2025-11-23 09:47:54.088 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:54 localhost nova_compute[281952]: 2025-11-23 09:47:54.091 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:54 localhost nova_compute[281952]: 2025-11-23 09:47:54.091 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:47:54 localhost nova_compute[281952]: 2025-11-23 09:47:54.091 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:54 localhost nova_compute[281952]: 2025-11-23 09:47:54.118 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:47:54 localhost nova_compute[281952]: 2025-11-23 09:47:54.119 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:47:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 04:47:57 localhost systemd[1]: tmp-crun.3HYSfZ.mount: Deactivated successfully. Nov 23 04:47:57 localhost podman[287822]: 2025-11-23 09:47:57.034863805 +0000 UTC m=+0.087674503 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible) Nov 23 04:47:57 localhost podman[287823]: 2025-11-23 09:47:57.106290558 +0000 UTC m=+0.155170023 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., vcs-type=git, 
com.redhat.component=ubi9-minimal-container, config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 23 04:47:57 localhost podman[287822]: 2025-11-23 09:47:57.110293223 +0000 UTC m=+0.163103991 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller) Nov 23 04:47:57 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:47:57 localhost podman[287823]: 2025-11-23 09:47:57.122581508 +0000 UTC m=+0.171460973 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git) Nov 23 04:47:57 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:47:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:47:58 localhost podman[287867]: 2025-11-23 09:47:58.025402729 +0000 UTC m=+0.079803337 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 23 04:47:58 localhost podman[287867]: 2025-11-23 09:47:58.034461382 +0000 UTC m=+0.088861990 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:47:58 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:47:59 localhost nova_compute[281952]: 2025-11-23 09:47:59.120 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:59 localhost nova_compute[281952]: 2025-11-23 09:47:59.122 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:47:59 localhost nova_compute[281952]: 2025-11-23 09:47:59.122 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:47:59 localhost nova_compute[281952]: 2025-11-23 09:47:59.122 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:59 localhost nova_compute[281952]: 2025-11-23 09:47:59.149 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:47:59 localhost nova_compute[281952]: 2025-11-23 09:47:59.150 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:47:59 localhost nova_compute[281952]: 2025-11-23 09:47:59.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:47:59 localhost nova_compute[281952]: 2025-11-23 09:47:59.213 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 23 04:47:59 localhost nova_compute[281952]: 2025-11-23 09:47:59.231 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 23 04:47:59 localhost nova_compute[281952]: 2025-11-23 09:47:59.232 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:47:59 localhost nova_compute[281952]: 2025-11-23 09:47:59.232 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 23 04:47:59 localhost nova_compute[281952]: 2025-11-23 09:47:59.246 281956 DEBUG oslo_service.periodic_task [None 
req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:47:59 localhost openstack_network_exporter[242668]: ERROR 09:47:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:47:59 localhost openstack_network_exporter[242668]: ERROR 09:47:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:47:59 localhost openstack_network_exporter[242668]: ERROR 09:47:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:47:59 localhost openstack_network_exporter[242668]: ERROR 09:47:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:47:59 localhost openstack_network_exporter[242668]: Nov 23 04:47:59 localhost openstack_network_exporter[242668]: ERROR 09:47:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:47:59 localhost openstack_network_exporter[242668]: Nov 23 04:48:00 localhost nova_compute[281952]: 2025-11-23 09:48:00.257 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:48:01 localhost nova_compute[281952]: 2025-11-23 09:48:01.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:48:01 localhost nova_compute[281952]: 2025-11-23 09:48:01.236 281956 DEBUG oslo_concurrency.lockutils [None 
req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:48:01 localhost nova_compute[281952]: 2025-11-23 09:48:01.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:48:01 localhost nova_compute[281952]: 2025-11-23 09:48:01.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:48:01 localhost nova_compute[281952]: 2025-11-23 09:48:01.237 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:48:01 localhost nova_compute[281952]: 2025-11-23 09:48:01.238 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:48:01 localhost nova_compute[281952]: 2025-11-23 09:48:01.701 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 
in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:48:01 localhost nova_compute[281952]: 2025-11-23 09:48:01.785 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:48:01 localhost nova_compute[281952]: 2025-11-23 09:48:01.786 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:48:02 localhost nova_compute[281952]: 2025-11-23 09:48:02.022 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:48:02 localhost nova_compute[281952]: 2025-11-23 09:48:02.024 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=12247MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:48:02 localhost nova_compute[281952]: 2025-11-23 09:48:02.024 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:48:02 localhost nova_compute[281952]: 2025-11-23 09:48:02.025 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:48:02 localhost nova_compute[281952]: 2025-11-23 09:48:02.138 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:48:02 localhost nova_compute[281952]: 2025-11-23 09:48:02.139 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:48:02 localhost nova_compute[281952]: 2025-11-23 09:48:02.139 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:48:02 localhost nova_compute[281952]: 2025-11-23 09:48:02.297 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:48:02 localhost nova_compute[281952]: 2025-11-23 09:48:02.749 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:48:02 localhost nova_compute[281952]: 2025-11-23 09:48:02.758 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:48:02 localhost nova_compute[281952]: 
2025-11-23 09:48:02.776 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:48:02 localhost nova_compute[281952]: 2025-11-23 09:48:02.778 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:48:02 localhost nova_compute[281952]: 2025-11-23 09:48:02.779 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:48:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:48:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:48:03 localhost podman[287929]: 2025-11-23 09:48:03.041883486 +0000 UTC m=+0.094622560 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:48:03 localhost podman[287929]: 2025-11-23 09:48:03.055632386 +0000 UTC m=+0.108371470 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 23 04:48:03 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:48:03 localhost systemd[1]: tmp-crun.kqYJx6.mount: Deactivated successfully. 
Nov 23 04:48:03 localhost podman[287930]: 2025-11-23 09:48:03.148799749 +0000 UTC m=+0.195752562 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:48:03 localhost podman[287930]: 2025-11-23 09:48:03.156408916 +0000 UTC m=+0.203361739 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 04:48:03 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:48:04 localhost nova_compute[281952]: 2025-11-23 09:48:04.151 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:04 localhost nova_compute[281952]: 2025-11-23 09:48:04.153 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:04 localhost nova_compute[281952]: 2025-11-23 09:48:04.154 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:48:04 localhost nova_compute[281952]: 2025-11-23 09:48:04.154 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:04 localhost nova_compute[281952]: 2025-11-23 09:48:04.177 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:48:04 localhost nova_compute[281952]: 2025-11-23 09:48:04.178 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:04 localhost nova_compute[281952]: 2025-11-23 09:48:04.780 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:48:04 localhost nova_compute[281952]: 2025-11-23 09:48:04.781 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 
04:48:04 localhost nova_compute[281952]: 2025-11-23 09:48:04.781 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:48:04 localhost nova_compute[281952]: 2025-11-23 09:48:04.782 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:48:04 localhost nova_compute[281952]: 2025-11-23 09:48:04.985 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:48:04 localhost nova_compute[281952]: 2025-11-23 09:48:04.986 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:48:04 localhost nova_compute[281952]: 2025-11-23 09:48:04.986 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:48:04 localhost nova_compute[281952]: 2025-11-23 09:48:04.987 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:48:05 localhost nova_compute[281952]: 2025-11-23 09:48:05.503 
281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:48:05 localhost nova_compute[281952]: 2025-11-23 09:48:05.520 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:48:05 localhost nova_compute[281952]: 2025-11-23 09:48:05.520 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:48:05 localhost nova_compute[281952]: 2025-11-23 09:48:05.521 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:48:05 localhost nova_compute[281952]: 2025-11-23 09:48:05.522 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:48:05 localhost nova_compute[281952]: 2025-11-23 09:48:05.522 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:48:05 localhost nova_compute[281952]: 2025-11-23 09:48:05.523 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:48:05 localhost nova_compute[281952]: 2025-11-23 09:48:05.523 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:48:06 localhost nova_compute[281952]: 2025-11-23 09:48:06.217 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:48:09 localhost nova_compute[281952]: 2025-11-23 09:48:09.178 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:09 localhost nova_compute[281952]: 2025-11-23 09:48:09.180 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:09 localhost nova_compute[281952]: 2025-11-23 09:48:09.180 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:48:09 localhost nova_compute[281952]: 2025-11-23 09:48:09.180 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:09 localhost nova_compute[281952]: 2025-11-23 09:48:09.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:48:09 localhost nova_compute[281952]: 2025-11-23 09:48:09.210 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:48:09.284 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" 
inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:48:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:48:09.285 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:48:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:48:09.285 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:48:11 localhost podman[240668]: time="2025-11-23T09:48:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:48:11 localhost podman[240668]: @ - - [23/Nov/2025:09:48:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149601 "" "Go-http-client/1.1" Nov 23 04:48:11 localhost podman[240668]: @ - - [23/Nov/2025:09:48:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17721 "" "Go-http-client/1.1" Nov 23 04:48:13 localhost podman[288102]: Nov 23 04:48:13 localhost podman[288102]: 2025-11-23 09:48:13.788044279 +0000 UTC m=+0.050857823 container create 14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_rhodes, name=rhceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, 
com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, ceph=True, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., release=553) Nov 23 04:48:13 localhost systemd[1]: Started libpod-conmon-14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3.scope. Nov 23 04:48:13 localhost systemd[1]: Started libcrun container. Nov 23 04:48:13 localhost podman[288102]: 2025-11-23 09:48:13.859156532 +0000 UTC m=+0.121970006 container init 14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_rhodes, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., name=rhceph, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_CLEAN=True, release=553, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main) Nov 23 04:48:13 localhost systemd[1]: tmp-crun.PevKIl.mount: Deactivated successfully. Nov 23 04:48:13 localhost podman[288102]: 2025-11-23 09:48:13.769317633 +0000 UTC m=+0.032131087 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:48:13 localhost podman[288102]: 2025-11-23 09:48:13.87092462 +0000 UTC m=+0.133738094 container start 14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_rhodes, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-09-24T08:57:55, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, architecture=x86_64, ceph=True) Nov 23 04:48:13 localhost podman[288102]: 2025-11-23 09:48:13.871188608 +0000 UTC m=+0.134002092 container attach 14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_rhodes, distribution-scope=public, ceph=True, 
version=7, name=rhceph, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:48:13 localhost relaxed_rhodes[288117]: 167 167 Nov 23 04:48:13 localhost systemd[1]: libpod-14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3.scope: Deactivated successfully. 
Nov 23 04:48:13 localhost podman[288102]: 2025-11-23 09:48:13.874051277 +0000 UTC m=+0.136864781 container died 14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_rhodes, GIT_CLEAN=True, release=553, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, version=7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public) Nov 23 04:48:13 localhost podman[288122]: 2025-11-23 09:48:13.956058362 +0000 UTC m=+0.071512007 container remove 14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_rhodes, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, name=rhceph, maintainer=Guillaume Abrioux , version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., release=553, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 04:48:13 localhost systemd[1]: libpod-conmon-14e1cfe353a7f86a5d0f527381242d1b51c916a5e2c7f67ae04760998e513cd3.scope: Deactivated successfully. Nov 23 04:48:14 localhost systemd[1]: Reloading. Nov 23 04:48:14 localhost systemd-rc-local-generator[288160]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:48:14 localhost systemd-sysv-generator[288166]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost nova_compute[281952]: 2025-11-23 09:48:14.211 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:14 localhost nova_compute[281952]: 2025-11-23 09:48:14.214 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:14 localhost nova_compute[281952]: 2025-11-23 09:48:14.214 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:48:14 localhost nova_compute[281952]: 2025-11-23 09:48:14.214 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:48:14 localhost nova_compute[281952]: 2025-11-23 09:48:14.226 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:48:14 localhost nova_compute[281952]: 2025-11-23 09:48:14.226 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: var-lib-containers-storage-overlay-accaf8fa8c3faa14bcdf046d7f01e8737ee63d93c9f6e2663ab0e78253c9978c-merged.mount: Deactivated successfully. Nov 23 04:48:14 localhost systemd[1]: Reloading. Nov 23 04:48:14 localhost systemd-rc-local-generator[288205]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:48:14 localhost systemd-sysv-generator[288209]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:48:14 localhost systemd[1]: Starting Ceph mgr.np0005532585.gzafiw for 46550e70-79cb-5f55-bf6d-1204b97e083b... 
Nov 23 04:48:15 localhost podman[288268]: Nov 23 04:48:15 localhost podman[288268]: 2025-11-23 09:48:15.226659103 +0000 UTC m=+0.081293953 container create ccb4b02da0d94a98b1ac6309347a6968fddd4d8454487ea7301093a34c10d98a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw, RELEASE=main, io.openshift.expose-services=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, distribution-scope=public, description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.33.12, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, name=rhceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True) Nov 23 04:48:15 localhost systemd[1]: tmp-crun.DoMo0M.mount: Deactivated successfully. 
Nov 23 04:48:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a92915ed4cea71666809e6a486bdee7aae1fb66120f79c15d3a53190fd2ea8dc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 23 04:48:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a92915ed4cea71666809e6a486bdee7aae1fb66120f79c15d3a53190fd2ea8dc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 04:48:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a92915ed4cea71666809e6a486bdee7aae1fb66120f79c15d3a53190fd2ea8dc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 23 04:48:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a92915ed4cea71666809e6a486bdee7aae1fb66120f79c15d3a53190fd2ea8dc/merged/var/lib/ceph/mgr/ceph-np0005532585.gzafiw supports timestamps until 2038 (0x7fffffff) Nov 23 04:48:15 localhost podman[288268]: 2025-11-23 09:48:15.194483648 +0000 UTC m=+0.049118528 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:48:15 localhost podman[288268]: 2025-11-23 09:48:15.300660618 +0000 UTC m=+0.155295428 container init ccb4b02da0d94a98b1ac6309347a6968fddd4d8454487ea7301093a34c10d98a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , vcs-type=git) Nov 23 04:48:15 localhost podman[288268]: 2025-11-23 09:48:15.311003921 +0000 UTC m=+0.165638731 container start ccb4b02da0d94a98b1ac6309347a6968fddd4d8454487ea7301093a34c10d98a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.openshift.expose-services=, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, architecture=x86_64) Nov 23 04:48:15 localhost bash[288268]: ccb4b02da0d94a98b1ac6309347a6968fddd4d8454487ea7301093a34c10d98a Nov 23 04:48:15 localhost systemd[1]: Started Ceph mgr.np0005532585.gzafiw for 46550e70-79cb-5f55-bf6d-1204b97e083b. 
Nov 23 04:48:15 localhost ceph-mgr[288287]: set uid:gid to 167:167 (ceph:ceph) Nov 23 04:48:15 localhost ceph-mgr[288287]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2 Nov 23 04:48:15 localhost ceph-mgr[288287]: pidfile_write: ignore empty --pid-file Nov 23 04:48:15 localhost ceph-mgr[288287]: mgr[py] Loading python module 'alerts' Nov 23 04:48:15 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:15.506+0000 7f5f26c11140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member Nov 23 04:48:15 localhost ceph-mgr[288287]: mgr[py] Module alerts has missing NOTIFY_TYPES member Nov 23 04:48:15 localhost ceph-mgr[288287]: mgr[py] Loading python module 'balancer' Nov 23 04:48:15 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:15.577+0000 7f5f26c11140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member Nov 23 04:48:15 localhost ceph-mgr[288287]: mgr[py] Module balancer has missing NOTIFY_TYPES member Nov 23 04:48:15 localhost ceph-mgr[288287]: mgr[py] Loading python module 'cephadm' Nov 23 04:48:16 localhost ceph-mgr[288287]: mgr[py] Loading python module 'crash' Nov 23 04:48:16 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:16.220+0000 7f5f26c11140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member Nov 23 04:48:16 localhost ceph-mgr[288287]: mgr[py] Module crash has missing NOTIFY_TYPES member Nov 23 04:48:16 localhost ceph-mgr[288287]: mgr[py] Loading python module 'dashboard' Nov 23 04:48:16 localhost ceph-mgr[288287]: mgr[py] Loading python module 'devicehealth' Nov 23 04:48:16 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:16.769+0000 7f5f26c11140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member Nov 23 04:48:16 localhost ceph-mgr[288287]: mgr[py] Module 
devicehealth has missing NOTIFY_TYPES member Nov 23 04:48:16 localhost ceph-mgr[288287]: mgr[py] Loading python module 'diskprediction_local' Nov 23 04:48:16 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. Nov 23 04:48:16 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. Nov 23 04:48:16 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: from numpy import show_config as show_numpy_config Nov 23 04:48:16 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:16.911+0000 7f5f26c11140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Nov 23 04:48:16 localhost ceph-mgr[288287]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Nov 23 04:48:16 localhost ceph-mgr[288287]: mgr[py] Loading python module 'influx' Nov 23 04:48:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 04:48:16 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:16.974+0000 7f5f26c11140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member Nov 23 04:48:16 localhost ceph-mgr[288287]: mgr[py] Module influx has missing NOTIFY_TYPES member Nov 23 04:48:16 localhost ceph-mgr[288287]: mgr[py] Loading python module 'insights' Nov 23 04:48:17 localhost podman[288317]: 2025-11-23 09:48:17.034678061 +0000 UTC m=+0.090078719 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:48:17 localhost ceph-mgr[288287]: mgr[py] Loading python module 'iostat' Nov 23 04:48:17 localhost podman[288317]: 2025-11-23 09:48:17.071390609 +0000 UTC m=+0.126791297 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm) Nov 23 04:48:17 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:48:17 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:17.094+0000 7f5f26c11140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member Nov 23 04:48:17 localhost ceph-mgr[288287]: mgr[py] Module iostat has missing NOTIFY_TYPES member Nov 23 04:48:17 localhost ceph-mgr[288287]: mgr[py] Loading python module 'k8sevents' Nov 23 04:48:17 localhost ceph-mgr[288287]: mgr[py] Loading python module 'localpool' Nov 23 04:48:17 localhost ceph-mgr[288287]: mgr[py] Loading python module 'mds_autoscaler' Nov 23 04:48:17 localhost ceph-mgr[288287]: mgr[py] Loading python module 'mirroring' Nov 23 04:48:17 localhost ceph-mgr[288287]: mgr[py] Loading python module 'nfs' Nov 23 04:48:17 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:17.864+0000 7f5f26c11140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member Nov 23 04:48:17 localhost ceph-mgr[288287]: mgr[py] Module nfs has missing NOTIFY_TYPES member Nov 23 04:48:17 localhost ceph-mgr[288287]: mgr[py] Loading python module 'orchestrator' Nov 23 04:48:18 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.008+0000 7f5f26c11140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Loading python module 'osd_perf_query' Nov 23 04:48:18 localhost 
ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.077+0000 7f5f26c11140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Loading python module 'osd_support' Nov 23 04:48:18 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.144+0000 7f5f26c11140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Module osd_support has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Loading python module 'pg_autoscaler' Nov 23 04:48:18 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.220+0000 7f5f26c11140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Loading python module 'progress' Nov 23 04:48:18 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.279+0000 7f5f26c11140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Module progress has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Loading python module 'prometheus' Nov 23 04:48:18 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.581+0000 7f5f26c11140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Module prometheus has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Loading python module 'rbd_support' Nov 23 04:48:18 localhost 
ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.661+0000 7f5f26c11140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Loading python module 'restful' Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Loading python module 'rgw' Nov 23 04:48:18 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:18.987+0000 7f5f26c11140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Module rgw has missing NOTIFY_TYPES member Nov 23 04:48:18 localhost ceph-mgr[288287]: mgr[py] Loading python module 'rook' Nov 23 04:48:19 localhost nova_compute[281952]: 2025-11-23 09:48:19.228 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:19 localhost nova_compute[281952]: 2025-11-23 09:48:19.259 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:19 localhost nova_compute[281952]: 2025-11-23 09:48:19.260 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:48:19 localhost nova_compute[281952]: 2025-11-23 09:48:19.260 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:19 localhost nova_compute[281952]: 2025-11-23 09:48:19.268 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:48:19 localhost nova_compute[281952]: 2025-11-23 09:48:19.269 281956 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:19 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:19.475+0000 7f5f26c11140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member Nov 23 04:48:19 localhost ceph-mgr[288287]: mgr[py] Module rook has missing NOTIFY_TYPES member Nov 23 04:48:19 localhost ceph-mgr[288287]: mgr[py] Loading python module 'selftest' Nov 23 04:48:19 localhost systemd[1]: tmp-crun.iBZHk6.mount: Deactivated successfully. Nov 23 04:48:19 localhost podman[288458]: 2025-11-23 09:48:19.524738314 +0000 UTC m=+0.108331578 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, ceph=True, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 04:48:19 localhost 
ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:19.544+0000 7f5f26c11140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member Nov 23 04:48:19 localhost ceph-mgr[288287]: mgr[py] Module selftest has missing NOTIFY_TYPES member Nov 23 04:48:19 localhost ceph-mgr[288287]: mgr[py] Loading python module 'snap_schedule' Nov 23 04:48:19 localhost ceph-mgr[288287]: mgr[py] Loading python module 'stats' Nov 23 04:48:19 localhost podman[288458]: 2025-11-23 09:48:19.637927934 +0000 UTC m=+0.221521138 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, version=7, release=553, GIT_BRANCH=main, name=rhceph, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12) Nov 23 04:48:19 localhost ceph-mgr[288287]: mgr[py] Loading python module 'status' Nov 23 04:48:19 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:19.755+0000 7f5f26c11140 -1 
mgr[py] Module status has missing NOTIFY_TYPES member Nov 23 04:48:19 localhost ceph-mgr[288287]: mgr[py] Module status has missing NOTIFY_TYPES member Nov 23 04:48:19 localhost ceph-mgr[288287]: mgr[py] Loading python module 'telegraf' Nov 23 04:48:19 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:19.818+0000 7f5f26c11140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member Nov 23 04:48:19 localhost ceph-mgr[288287]: mgr[py] Module telegraf has missing NOTIFY_TYPES member Nov 23 04:48:19 localhost ceph-mgr[288287]: mgr[py] Loading python module 'telemetry' Nov 23 04:48:19 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:19.955+0000 7f5f26c11140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member Nov 23 04:48:19 localhost ceph-mgr[288287]: mgr[py] Module telemetry has missing NOTIFY_TYPES member Nov 23 04:48:19 localhost ceph-mgr[288287]: mgr[py] Loading python module 'test_orchestrator' Nov 23 04:48:20 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:20.113+0000 7f5f26c11140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Nov 23 04:48:20 localhost ceph-mgr[288287]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Nov 23 04:48:20 localhost ceph-mgr[288287]: mgr[py] Loading python module 'volumes' Nov 23 04:48:20 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:20.327+0000 7f5f26c11140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member Nov 23 04:48:20 localhost ceph-mgr[288287]: mgr[py] Module volumes has missing NOTIFY_TYPES member Nov 23 04:48:20 localhost ceph-mgr[288287]: mgr[py] Loading python module 'zabbix' Nov 23 04:48:20 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:48:20.387+0000 7f5f26c11140 -1 mgr[py] Module zabbix has missing 
NOTIFY_TYPES member Nov 23 04:48:20 localhost ceph-mgr[288287]: mgr[py] Module zabbix has missing NOTIFY_TYPES member Nov 23 04:48:20 localhost ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb1591e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0 Nov 23 04:48:20 localhost ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.103:6800/1698075890 Nov 23 04:48:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:48:21 localhost podman[288596]: 2025-11-23 09:48:21.265316982 +0000 UTC m=+0.089705676 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:48:21 localhost podman[288596]: 2025-11-23 09:48:21.304414325 +0000 UTC m=+0.128802979 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , 
managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:48:21 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:48:21 localhost ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.103:6800/1698075890 Nov 23 04:48:24 localhost nova_compute[281952]: 2025-11-23 09:48:24.269 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:24 localhost nova_compute[281952]: 2025-11-23 09:48:24.273 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:24 localhost nova_compute[281952]: 2025-11-23 09:48:24.273 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:48:24 localhost nova_compute[281952]: 2025-11-23 09:48:24.273 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:24 localhost nova_compute[281952]: 2025-11-23 09:48:24.296 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:48:24 localhost nova_compute[281952]: 2025-11-23 09:48:24.297 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:48:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:48:27 localhost podman[288832]: 2025-11-23 09:48:27.256114415 +0000 UTC m=+0.090494190 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 23 04:48:27 localhost podman[288833]: 2025-11-23 09:48:27.35059309 +0000 UTC m=+0.178591896 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9) Nov 23 04:48:27 localhost podman[288833]: 2025-11-23 09:48:27.360591303 +0000 UTC m=+0.188590149 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Nov 23 04:48:27 localhost podman[288832]: 2025-11-23 09:48:27.368797848 +0000 UTC m=+0.203177663 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 04:48:27 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:48:27 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:48:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:48:28 localhost podman[289090]: 2025-11-23 09:48:28.201195048 +0000 UTC m=+0.079932991 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:48:28 localhost podman[289090]: 2025-11-23 09:48:28.207465354 +0000 UTC m=+0.086203307 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 23 04:48:28 localhost 
systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:48:29 localhost nova_compute[281952]: 2025-11-23 09:48:29.298 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:29 localhost nova_compute[281952]: 2025-11-23 09:48:29.299 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:29 localhost nova_compute[281952]: 2025-11-23 09:48:29.299 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:48:29 localhost nova_compute[281952]: 2025-11-23 09:48:29.299 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:29 localhost nova_compute[281952]: 2025-11-23 09:48:29.336 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:48:29 localhost nova_compute[281952]: 2025-11-23 09:48:29.337 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:29 localhost openstack_network_exporter[242668]: ERROR 09:48:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:48:29 localhost openstack_network_exporter[242668]: Nov 23 04:48:29 localhost openstack_network_exporter[242668]: ERROR 09:48:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:48:29 localhost openstack_network_exporter[242668]: ERROR 09:48:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket 
files found for ovn-northd Nov 23 04:48:29 localhost openstack_network_exporter[242668]: ERROR 09:48:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:48:29 localhost openstack_network_exporter[242668]: ERROR 09:48:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:48:29 localhost openstack_network_exporter[242668]: Nov 23 04:48:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:48:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:48:34 localhost podman[289375]: 2025-11-23 09:48:34.047563846 +0000 UTC m=+0.097089758 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 23 04:48:34 localhost podman[289375]: 2025-11-23 09:48:34.061727328 +0000 UTC m=+0.111253220 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:48:34 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:48:34 localhost podman[289376]: 2025-11-23 09:48:34.111662749 +0000 UTC m=+0.154093069 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:48:34 localhost podman[289376]: 2025-11-23 09:48:34.150671789 +0000 UTC m=+0.193102109 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:48:34 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:48:34 localhost nova_compute[281952]: 2025-11-23 09:48:34.337 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:34 localhost nova_compute[281952]: 2025-11-23 09:48:34.340 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:34 localhost nova_compute[281952]: 2025-11-23 09:48:34.340 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:48:34 localhost nova_compute[281952]: 2025-11-23 09:48:34.340 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:34 localhost nova_compute[281952]: 2025-11-23 09:48:34.376 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:48:34 localhost nova_compute[281952]: 2025-11-23 09:48:34.377 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:35 localhost ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb1591e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0 Nov 23 04:48:35 localhost sshd[289468]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:48:36 localhost podman[289495]: Nov 23 04:48:36 localhost podman[289495]: 2025-11-23 09:48:36.095662429 +0000 UTC m=+0.086935019 container create db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_goldberg, name=rhceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., release=553, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-type=git, ceph=True, GIT_CLEAN=True, io.buildah.version=1.33.12, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 23 04:48:36 localhost systemd[1]: Started libpod-conmon-db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b.scope. Nov 23 04:48:36 localhost podman[289495]: 2025-11-23 09:48:36.059618462 +0000 UTC m=+0.050891042 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:48:36 localhost systemd[1]: Started libcrun container. 
Nov 23 04:48:36 localhost podman[289495]: 2025-11-23 09:48:36.176317761 +0000 UTC m=+0.167590301 container init db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_goldberg, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume Abrioux , name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, ceph=True, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public)
Nov 23 04:48:36 localhost podman[289495]: 2025-11-23 09:48:36.187756239 +0000 UTC m=+0.179028779 container start db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_goldberg, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, ceph=True, distribution-scope=public, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.buildah.version=1.33.12, architecture=x86_64, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git)
Nov 23 04:48:36 localhost podman[289495]: 2025-11-23 09:48:36.188667428 +0000 UTC m=+0.179940008 container attach db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_goldberg, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, release=553, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True)
Nov 23 04:48:36 localhost eager_goldberg[289510]: 167 167
Nov 23 04:48:36 localhost systemd[1]: libpod-db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b.scope: Deactivated successfully.
Nov 23 04:48:36 localhost podman[289495]: 2025-11-23 09:48:36.193296592 +0000 UTC m=+0.184569152 container died db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_goldberg, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux , name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64)
Nov 23 04:48:36 localhost podman[289515]: 2025-11-23 09:48:36.291492283 +0000 UTC m=+0.086145055 container remove db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_goldberg, ceph=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_CLEAN=True, io.buildah.version=1.33.12, distribution-scope=public, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, release=553, com.redhat.component=rhceph-container, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 04:48:36 localhost systemd[1]: libpod-conmon-db13691dcceec69b13e758bf26ee3b8347dc3d08586b3df0c4e3f2bc1e53463b.scope: Deactivated successfully.
Nov 23 04:48:36 localhost podman[289533]:
Nov 23 04:48:36 localhost podman[289533]: 2025-11-23 09:48:36.427256428 +0000 UTC m=+0.088092475 container create c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_franklin, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_BRANCH=main, distribution-scope=public, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 23 04:48:36 localhost systemd[1]: Started libpod-conmon-c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc.scope.
Nov 23 04:48:36 localhost systemd[1]: Started libcrun container.
Nov 23 04:48:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5ea3d17c44dec98808d2b7cf20c7e10666d999ea3d9a9c74c372bba98fd756e/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Nov 23 04:48:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5ea3d17c44dec98808d2b7cf20c7e10666d999ea3d9a9c74c372bba98fd756e/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 04:48:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5ea3d17c44dec98808d2b7cf20c7e10666d999ea3d9a9c74c372bba98fd756e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 04:48:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5ea3d17c44dec98808d2b7cf20c7e10666d999ea3d9a9c74c372bba98fd756e/merged/var/lib/ceph/mon/ceph-np0005532585 supports timestamps until 2038 (0x7fffffff)
Nov 23 04:48:36 localhost podman[289533]: 2025-11-23 09:48:36.389070724 +0000 UTC m=+0.049906811 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 04:48:36 localhost podman[289533]: 2025-11-23 09:48:36.494401078 +0000 UTC m=+0.155237125 container init c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_franklin, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7)
Nov 23 04:48:36 localhost podman[289533]: 2025-11-23 09:48:36.507085324 +0000 UTC m=+0.167921371 container start c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_franklin, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, io.buildah.version=1.33.12)
Nov 23 04:48:36 localhost podman[289533]: 2025-11-23 09:48:36.507422015 +0000 UTC m=+0.168258082 container attach c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_franklin, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, description=Red Hat Ceph Storage 7, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , architecture=x86_64)
Nov 23 04:48:36 localhost systemd[1]: libpod-c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc.scope: Deactivated successfully.
Nov 23 04:48:36 localhost podman[289533]: 2025-11-23 09:48:36.602454477 +0000 UTC m=+0.263290504 container died c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_franklin, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.openshift.expose-services=, release=553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, vcs-type=git, name=rhceph, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 04:48:36 localhost podman[289574]: 2025-11-23 09:48:36.692489832 +0000 UTC m=+0.077994770 container remove c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_franklin, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , architecture=x86_64, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, release=553, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 04:48:36 localhost systemd[1]: libpod-conmon-c67aaf8112ed3dec546aa13bd91576506074cfc343c7ae51bd5bf9aecf25a6cc.scope: Deactivated successfully.
Nov 23 04:48:36 localhost systemd[1]: Reloading.
Nov 23 04:48:36 localhost systemd-rc-local-generator[289611]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 04:48:36 localhost systemd-sysv-generator[289614]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 04:48:36 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:36 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:36 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:36 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:37 localhost systemd[1]: var-lib-containers-storage-overlay-389a2e189f6cc00e3a32be8e29ed54e28047e97a87c7dc64b9273f56e5f484f1-merged.mount: Deactivated successfully.
Nov 23 04:48:37 localhost systemd[1]: Reloading.
Nov 23 04:48:37 localhost systemd-rc-local-generator[289653]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 04:48:37 localhost systemd-sysv-generator[289659]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 23 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 23 04:48:37 localhost systemd[1]: Starting Ceph mon.np0005532585 for 46550e70-79cb-5f55-bf6d-1204b97e083b...
Nov 23 04:48:37 localhost podman[289717]:
Nov 23 04:48:38 localhost podman[289717]: 2025-11-23 09:48:38.000836374 +0000 UTC m=+0.087882239 container create 3181a32eddec18c5a28b6225f78da2b1d77c1a7c16c3ef6ab437e2c19f4ee803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , version=7, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph)
Nov 23 04:48:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7e5200175f7b50f5204c53bc0525154e5a28b304a579162ed27351a6796afc1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 04:48:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7e5200175f7b50f5204c53bc0525154e5a28b304a579162ed27351a6796afc1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 04:48:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7e5200175f7b50f5204c53bc0525154e5a28b304a579162ed27351a6796afc1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 04:48:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7e5200175f7b50f5204c53bc0525154e5a28b304a579162ed27351a6796afc1/merged/var/lib/ceph/mon/ceph-np0005532585 supports timestamps until 2038 (0x7fffffff)
Nov 23 04:48:38 localhost podman[289717]: 2025-11-23 09:48:37.966005034 +0000 UTC m=+0.053050879 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 04:48:38 localhost podman[289717]: 2025-11-23 09:48:38.068004025 +0000 UTC m=+0.155049870 container init 3181a32eddec18c5a28b6225f78da2b1d77c1a7c16c3ef6ab437e2c19f4ee803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, release=553, maintainer=Guillaume Abrioux , version=7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_BRANCH=main, RELEASE=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 04:48:38 localhost podman[289717]: 2025-11-23 09:48:38.084236391 +0000 UTC m=+0.171282236 container start 3181a32eddec18c5a28b6225f78da2b1d77c1a7c16c3ef6ab437e2c19f4ee803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, name=rhceph, version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, maintainer=Guillaume Abrioux )
Nov 23 04:48:38 localhost bash[289717]: 3181a32eddec18c5a28b6225f78da2b1d77c1a7c16c3ef6ab437e2c19f4ee803
Nov 23 04:48:38 localhost systemd[1]: Started Ceph mon.np0005532585 for 46550e70-79cb-5f55-bf6d-1204b97e083b.
Nov 23 04:48:38 localhost ceph-mon[289735]: set uid:gid to 167:167 (ceph:ceph)
Nov 23 04:48:38 localhost ceph-mon[289735]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Nov 23 04:48:38 localhost ceph-mon[289735]: pidfile_write: ignore empty --pid-file
Nov 23 04:48:38 localhost ceph-mon[289735]: load: jerasure load: lrc
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: RocksDB version: 7.9.2
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Git sha 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: DB SUMMARY
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: DB Session ID: 8ON8PRI8V1RJ4RVNWHFL
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: CURRENT file: CURRENT
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: IDENTITY file: IDENTITY
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005532585/store.db dir, Total Num: 0, files:
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005532585/store.db: 000004.log size: 761 ;
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.error_if_exists: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.create_if_missing: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.paranoid_checks: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.flush_verify_memtable_count: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.env: 0x559413d149e0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.fs: PosixFileSystem
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.info_log: 0x5594163ccd20
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_file_opening_threads: 16
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.statistics: (nil)
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.use_fsync: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_log_file_size: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_manifest_file_size: 1073741824
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.log_file_time_to_roll: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.keep_log_file_num: 1000
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.recycle_log_file_num: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.allow_fallocate: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.allow_mmap_reads: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.allow_mmap_writes: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.use_direct_reads: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.create_missing_column_families: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.db_log_dir:
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.wal_dir:
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.table_cache_numshardbits: 6
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.WAL_ttl_seconds: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.WAL_size_limit_MB: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.manifest_preallocation_size: 4194304
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.is_fd_close_on_exec: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.advise_random_on_open: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.db_write_buffer_size: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.write_buffer_manager: 0x5594163dd540
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.access_hint_on_compaction_start: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.random_access_max_buffer_size: 1048576
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.use_adaptive_mutex: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.rate_limiter: (nil)
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.wal_recovery_mode: 2
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.enable_thread_tracking: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.enable_pipelined_write: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.unordered_write: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.allow_concurrent_memtable_write: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.write_thread_max_yield_usec: 100
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.write_thread_slow_yield_usec: 3
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.row_cache: None
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.wal_filter: None
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.avoid_flush_during_recovery: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.allow_ingest_behind: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.two_write_queues: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.manual_wal_flush: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.wal_compression: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.atomic_flush: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.persist_stats_to_disk: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.write_dbid_to_manifest: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.log_readahead_size: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.file_checksum_gen_factory: Unknown
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.best_efforts_recovery: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_bgerror_resume_count: 2147483647
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.allow_data_in_errors: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.db_host_id: __hostname__
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.enforce_single_del_contracts: true
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_background_jobs: 2
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_background_compactions: -1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_subcompactions: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.avoid_flush_during_shutdown: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.writable_file_max_buffer_size: 1048576
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.delayed_write_rate : 16777216
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_total_wal_size: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.stats_dump_period_sec: 600
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.stats_persist_period_sec: 600
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.stats_history_buffer_size: 1048576
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_open_files: -1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.bytes_per_sync: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.wal_bytes_per_sync: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.strict_bytes_per_sync: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compaction_readahead_size: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_background_flushes: -1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Compression algorithms supported:
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: #011kZSTD supported: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: #011kXpressCompression supported: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: #011kBZip2Compression supported: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: #011kLZ4Compression supported: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: #011kZlibCompression supported: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: #011kLZ4HCCompression supported: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: #011kSnappyCompression supported: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005532585/store.db/MANIFEST-000005
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.merge_operator:
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compaction_filter: None
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compaction_filter_factory: None
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.sst_partitioner_factory: None
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.table_factory: BlockBasedTable
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5594163cc980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5594163c9350#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.write_buffer_size: 33554432
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_write_buffer_number: 2
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compression: NoCompression
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.bottommost_compression: Disabled
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.prefix_extractor: nullptr
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.num_levels: 7
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.min_write_buffer_number_to_merge: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compression_opts.window_bits: -14
Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compression_opts.level: 32767
Nov 23 04:48:38 localhost ceph-mon[289735]:
rocksdb: Options.compression_opts.strategy: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compression_opts.enabled: false Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.level0_file_num_compaction_trigger: 4 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_base: 268435456 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.arena_block_size: 1048576 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 04:48:38 
localhost ceph-mon[289735]: rocksdb: Options.table_properties_collectors: Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.inplace_update_support: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.bloom_locality: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.max_successive_merges: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.force_consistency_checks: 1 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.ttl: 2592000 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.enable_blob_files: false Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.min_blob_size: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.blob_file_size: 268435456 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 04:48:38 localhost 
ceph-mon[289735]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005532585/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: f90877de-8e0c-4aa9-bd89-60d6d2f6e09f Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891318139479, "job": 1, "event": "recovery_started", "wal_files": [4]} Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891318142069, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 773, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 651, "raw_average_value_size": 130, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 
0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891318, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891318142203, "job": 1, "event": "recovery_finished"} Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5594163f0e00 Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: DB pointer 0x5594164e6000 Nov 23 04:48:38 localhost ceph-mon[289735]: mon.np0005532585 does not exist in monmap, will attempt to join an existing cluster Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 04:48:38 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 
MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.84 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Sum 1/0 1.84 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 
0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5594163c9350#2 capacity: 512.00 MB usage: 1.17 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(2,0.95 KB,0.000181794%)#012#012** File Read Latency Histogram By Level [default] ** Nov 23 04:48:38 localhost systemd[1]: tmp-crun.npBrnb.mount: Deactivated successfully. Nov 23 04:48:38 localhost ceph-mon[289735]: using public_addr v2:172.18.0.107:0/0 -> [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] Nov 23 04:48:38 localhost ceph-mon[289735]: starting mon.np0005532585 rank -1 at public addrs [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] at bind addrs [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005532585 fsid 46550e70-79cb-5f55-bf6d-1204b97e083b Nov 23 04:48:38 localhost ceph-mon[289735]: mon.np0005532585@-1(???) 
e0 preinit fsid 46550e70-79cb-5f55-bf6d-1204b97e083b Nov 23 04:48:38 localhost ceph-mon[289735]: mon.np0005532585@-1(synchronizing) e4 sync_obtain_latest_monmap Nov 23 04:48:38 localhost ceph-mon[289735]: mon.np0005532585@-1(synchronizing) e4 sync_obtain_latest_monmap obtained monmap e4 Nov 23 04:48:38 localhost ceph-mon[289735]: mon.np0005532585@-1(synchronizing).mds e16 new map Nov 23 04:48:38 localhost ceph-mon[289735]: mon.np0005532585@-1(synchronizing).mds e16 print_map#012e16#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-23T08:00:26.486221+0000#012modified#0112025-11-23T09:47:19.846415+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01179#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=26392}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[6]#012metadata_pool#0117#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 26392 members: 26392#012[mds.mds.np0005532586.mfohsb{0:26392} state up:active seq 12 addr [v2:172.18.0.108:6808/2718449296,v1:172.18.0.108:6809/2718449296] compat 
{c=[1],r=[1],i=[17ff]}]#012 #012 #012Standby daemons:#012 #012[mds.mds.np0005532585.jcltnl{-1:17133} state up:standby seq 1 addr [v2:172.18.0.107:6808/563301557,v1:172.18.0.107:6809/563301557] compat {c=[1],r=[1],i=[17ff]}]#012[mds.mds.np0005532584.aoxjmw{-1:17139} state up:standby seq 1 addr [v2:172.18.0.106:6808/2261302276,v1:172.18.0.106:6809/2261302276] compat {c=[1],r=[1],i=[17ff]}] Nov 23 04:48:38 localhost ceph-mon[289735]: mon.np0005532585@-1(synchronizing).osd e81 crush map has features 3314933000852226048, adjusting msgr requires Nov 23 04:48:38 localhost ceph-mon[289735]: mon.np0005532585@-1(synchronizing).osd e81 crush map has features 288514051259236352, adjusting msgr requires Nov 23 04:48:38 localhost ceph-mon[289735]: mon.np0005532585@-1(synchronizing).osd e81 crush map has features 288514051259236352, adjusting msgr requires Nov 23 04:48:38 localhost ceph-mon[289735]: mon.np0005532585@-1(synchronizing).osd e81 crush map has features 288514051259236352, adjusting msgr requires Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth rm", "entity": "mds.mds.np0005532583.nwcrcp"} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd='[{"prefix": "auth rm", "entity": "mds.mds.np0005532583.nwcrcp"}]': finished Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Removing key for mds.mds.np0005532583.nwcrcp Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 
localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Added label mgr to host np0005532584.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Added label mgr to host np0005532585.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 
04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Added label mgr to host np0005532586.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Saving service mgr spec with placement label:mgr Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Nov 23 04:48:38 localhost ceph-mon[289735]: Deploying daemon mgr.np0005532584.naxwxy on np0005532584.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 
172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Nov 23 04:48:38 localhost ceph-mon[289735]: Deploying daemon mgr.np0005532585.gzafiw on np0005532585.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Added label mon to host np0005532581.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd='[{"prefix": "auth get-or-create", "entity": 
"mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Nov 23 04:48:38 localhost ceph-mon[289735]: Added label _admin to host np0005532581.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: Deploying daemon mgr.np0005532586.thmvqb on np0005532586.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Added label mon to host np0005532582.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Added label _admin to host np0005532582.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Added label mon to host np0005532583.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 
172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Added label _admin to host np0005532583.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Added label mon to host np0005532584.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: Added 
label _admin to host np0005532584.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf Nov 23 04:48:38 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Added label mon to host np0005532585.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:48:38 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Added label _admin to host np0005532585.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Updating 
np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:48:38 localhost ceph-mon[289735]: Added label mon to host np0005532586.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Added label _admin to host np0005532586.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf Nov 23 04:48:38 localhost ceph-mon[289735]: Saving service mon spec with placement label:mon Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:48:38 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:48:38 localhost ceph-mon[289735]: Updating 
np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:38 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:48:38 localhost ceph-mon[289735]: Deploying daemon mon.np0005532586 on np0005532586.localdomain Nov 23 04:48:38 localhost ceph-mon[289735]: mon.np0005532585@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3 Nov 23 04:48:39 localhost nova_compute[281952]: 2025-11-23 09:48:39.377 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:39 localhost nova_compute[281952]: 2025-11-23 09:48:39.379 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:39 localhost nova_compute[281952]: 2025-11-23 09:48:39.380 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:48:39 localhost nova_compute[281952]: 2025-11-23 09:48:39.380 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:39 localhost nova_compute[281952]: 2025-11-23 09:48:39.415 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 
04:48:39 localhost nova_compute[281952]: 2025-11-23 09:48:39.416 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:41 localhost podman[240668]: time="2025-11-23T09:48:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:48:41 localhost podman[240668]: @ - - [23/Nov/2025:09:48:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 04:48:41 localhost podman[240668]: @ - - [23/Nov/2025:09:48:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18670 "" "Go-http-client/1.1" Nov 23 04:48:42 localhost ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb158f20 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0 Nov 23 04:48:43 localhost ceph-mon[289735]: mon.np0005532585@-1(probing) e4 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints Nov 23 04:48:43 localhost ceph-mon[289735]: mon.np0005532585@-1(probing) e4 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints Nov 23 04:48:44 localhost sshd[289774]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:48:44 localhost ceph-mon[289735]: mon.np0005532585@-1(probing) e5 my rank is now 4 (was -1) Nov 23 04:48:44 localhost ceph-mon[289735]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election Nov 23 04:48:44 localhost ceph-mon[289735]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1 Nov 23 04:48:44 localhost ceph-mon[289735]: mon.np0005532585@4(electing) e5 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:48:44 localhost nova_compute[281952]: 2025-11-23 09:48:44.417 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:44 localhost nova_compute[281952]: 2025-11-23 09:48:44.419 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:44 localhost nova_compute[281952]: 2025-11-23 09:48:44.419 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:48:44 localhost nova_compute[281952]: 2025-11-23 09:48:44.419 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:44 localhost nova_compute[281952]: 2025-11-23 09:48:44.458 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:48:44 localhost nova_compute[281952]: 2025-11-23 09:48:44.459 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:45 localhost ceph-mon[289735]: mon.np0005532585@4(electing) e5 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532585@4(electing) e5 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532585@4(peon) e5 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code} Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532585@4(peon) e5 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk 
layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout} Nov 23 04:48:47 localhost ceph-mon[289735]: Deploying daemon mon.np0005532585 on np0005532585.localdomain Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532581 calling monitor election Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532583 calling monitor election Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532582 calling monitor election Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532586 calling monitor election Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532581 is new leader, mons np0005532581,np0005532583,np0005532582,np0005532586 in quorum (ranks 0,1,2,3) Nov 23 04:48:47 localhost ceph-mon[289735]: overall HEALTH_OK Nov 23 04:48:47 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:47 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:47 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:47 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:48:47 localhost ceph-mon[289735]: Deploying daemon mon.np0005532584 on np0005532584.localdomain Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532585@4(peon) e5 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:48:47 localhost ceph-mon[289735]: mgrc update_daemon_metadata mon.np0005532585 metadata {addrs=[v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, 
lz4,container_hostname=np0005532585.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.6 (Plow),distro_version=9.6,hostname=np0005532585.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux} Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532581 calling monitor election Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532582 calling monitor election Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532583 calling monitor election Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532586 calling monitor election Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532585 calling monitor election Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532581 is new leader, mons np0005532581,np0005532583,np0005532582,np0005532586,np0005532585 in quorum (ranks 0,1,2,3,4) Nov 23 04:48:47 localhost ceph-mon[289735]: overall HEALTH_OK Nov 23 04:48:47 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:47 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532585@4(peon) e5 adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints Nov 23 04:48:47 localhost ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb1591e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0 Nov 23 04:48:47 localhost ceph-mon[289735]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election Nov 23 04:48:47 localhost ceph-mon[289735]: paxos.4).electionLogic(22) init, last seen epoch 22 Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532585@4(electing) e6 
collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:48:47 localhost ceph-mon[289735]: mon.np0005532585@4(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:48:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:48:48 localhost systemd[1]: tmp-crun.oXlu8e.mount: Deactivated successfully. Nov 23 04:48:48 localhost podman[289776]: 2025-11-23 09:48:48.045229921 +0000 UTC m=+0.104359544 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:48:48 localhost podman[289776]: 2025-11-23 09:48:48.084372755 +0000 UTC m=+0.143502328 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 04:48:48 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:48:49 localhost nova_compute[281952]: 2025-11-23 09:48:49.460 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:49 localhost nova_compute[281952]: 2025-11-23 09:48:49.462 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:49 localhost nova_compute[281952]: 2025-11-23 09:48:49.463 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:48:49 localhost nova_compute[281952]: 2025-11-23 09:48:49.463 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:49 localhost nova_compute[281952]: 2025-11-23 09:48:49.495 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:48:49 localhost nova_compute[281952]: 2025-11-23 09:48:49.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 04:48:52 localhost podman[289795]: 2025-11-23 09:48:52.032826933 +0000 UTC m=+0.088008694 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:48:52 localhost podman[289795]: 2025-11-23 09:48:52.039447139 +0000 UTC m=+0.094649701 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:48:52 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:48:52 localhost ceph-mon[289735]: mon.np0005532585@4(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:48:52 localhost ceph-mon[289735]: mon.np0005532585@4(peon) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:48:52 localhost ceph-mon[289735]: mon.np0005532582 calling monitor election Nov 23 04:48:52 localhost ceph-mon[289735]: mon.np0005532581 calling monitor election Nov 23 04:48:52 localhost ceph-mon[289735]: mon.np0005532583 calling monitor election Nov 23 04:48:52 localhost ceph-mon[289735]: mon.np0005532586 calling monitor election Nov 23 04:48:52 localhost ceph-mon[289735]: mon.np0005532585 calling monitor election Nov 23 04:48:52 localhost ceph-mon[289735]: mon.np0005532584 calling monitor election Nov 23 04:48:52 localhost ceph-mon[289735]: mon.np0005532581 is new leader, mons np0005532581,np0005532583,np0005532582,np0005532586,np0005532585,np0005532584 in quorum (ranks 0,1,2,3,4,5) Nov 23 04:48:52 localhost ceph-mon[289735]: overall HEALTH_OK Nov 23 04:48:52 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:53 localhost ceph-mon[289735]: mon.np0005532585@4(peon) e6 handle_auth_request failed to assign global_id Nov 23 04:48:54 localhost podman[289943]: 2025-11-23 09:48:54.163023743 +0000 UTC m=+0.101743912 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, release=553, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, version=7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=) Nov 23 04:48:54 localhost podman[289943]: 2025-11-23 09:48:54.2876276 +0000 UTC m=+0.226347759 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=7, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_BRANCH=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.openshift.expose-services=, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:48:54 localhost nova_compute[281952]: 2025-11-23 09:48:54.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:54 localhost nova_compute[281952]: 2025-11-23 09:48:54.499 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:54 localhost nova_compute[281952]: 2025-11-23 09:48:54.500 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:48:54 localhost nova_compute[281952]: 2025-11-23 09:48:54.500 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:54 localhost nova_compute[281952]: 2025-11-23 09:48:54.539 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:48:54 localhost nova_compute[281952]: 2025-11-23 09:48:54.539 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:55 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:55 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:55 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 
04:48:55 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:55 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:55 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:55 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:48:56 localhost ceph-mon[289735]: Updating np0005532581.localdomain:/etc/ceph/ceph.conf Nov 23 04:48:56 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.conf Nov 23 04:48:56 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf Nov 23 04:48:56 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf Nov 23 04:48:56 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf Nov 23 04:48:56 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf Nov 23 04:48:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:48:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 04:48:57 localhost podman[290470]: 2025-11-23 09:48:57.878764957 +0000 UTC m=+0.098351146 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 04:48:57 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:48:57 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:48:57 localhost ceph-mon[289735]: Updating np0005532581.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:48:57 localhost ceph-mon[289735]: Updating 
np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:48:57 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:48:57 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:48:57 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:57 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:57 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:57 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:57 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:57 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:57 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:57 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:57 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:57 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:57 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:57 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:57 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:57 localhost ceph-mon[289735]: 
from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:57 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:48:57 localhost podman[290471]: 2025-11-23 09:48:57.965071586 +0000 UTC m=+0.183195959 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 23 04:48:57 localhost podman[290470]: 2025-11-23 09:48:57.972501559 +0000 UTC m=+0.192087708 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118) Nov 23 04:48:57 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:48:58 localhost podman[290471]: 2025-11-23 09:48:58.009564067 +0000 UTC m=+0.227688440 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, distribution-scope=public, io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git) Nov 23 04:48:58 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:48:58 localhost ceph-mon[289735]: Reconfiguring mon.np0005532581 (monmap changed)... Nov 23 04:48:58 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532581 on np0005532581.localdomain Nov 23 04:48:58 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:58 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:58 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532581.sxlgsx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:48:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:48:59 localhost podman[290518]: 2025-11-23 09:48:59.038245924 +0000 UTC m=+0.091579984 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 04:48:59 localhost podman[290518]: 2025-11-23 09:48:59.043412046 +0000 UTC 
m=+0.096745836 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Nov 23 04:48:59 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:48:59 localhost nova_compute[281952]: 2025-11-23 09:48:59.540 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:48:59 localhost nova_compute[281952]: 2025-11-23 09:48:59.542 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:48:59 localhost nova_compute[281952]: 2025-11-23 09:48:59.542 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:48:59 localhost nova_compute[281952]: 2025-11-23 09:48:59.543 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:59 localhost nova_compute[281952]: 2025-11-23 09:48:59.543 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:48:59 localhost nova_compute[281952]: 2025-11-23 09:48:59.546 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:48:59 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532581.sxlgsx (monmap changed)... 
Nov 23 04:48:59 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532581.sxlgsx on np0005532581.localdomain Nov 23 04:48:59 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:59 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:48:59 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532581.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:48:59 localhost openstack_network_exporter[242668]: ERROR 09:48:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:48:59 localhost openstack_network_exporter[242668]: ERROR 09:48:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:48:59 localhost openstack_network_exporter[242668]: ERROR 09:48:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:48:59 localhost openstack_network_exporter[242668]: ERROR 09:48:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:48:59 localhost openstack_network_exporter[242668]: Nov 23 04:48:59 localhost openstack_network_exporter[242668]: ERROR 09:48:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:48:59 localhost openstack_network_exporter[242668]: Nov 23 04:49:00 localhost ceph-mon[289735]: Reconfiguring crash.np0005532581 (monmap changed)... 
Nov 23 04:49:00 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532581 on np0005532581.localdomain Nov 23 04:49:00 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:00 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:00 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532582.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:49:01 localhost nova_compute[281952]: 2025-11-23 09:49:01.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:49:01 localhost nova_compute[281952]: 2025-11-23 09:49:01.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:49:01 localhost nova_compute[281952]: 2025-11-23 09:49:01.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:49:01 localhost nova_compute[281952]: 2025-11-23 09:49:01.235 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 
0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:49:01 localhost nova_compute[281952]: 2025-11-23 09:49:01.235 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:49:01 localhost nova_compute[281952]: 2025-11-23 09:49:01.235 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:49:01 localhost nova_compute[281952]: 2025-11-23 09:49:01.754 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:49:01 localhost nova_compute[281952]: 2025-11-23 09:49:01.827 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:49:01 localhost nova_compute[281952]: 2025-11-23 09:49:01.828 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:49:01 localhost ceph-mon[289735]: Reconfiguring crash.np0005532582 (monmap changed)... 
Nov 23 04:49:01 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532582 on np0005532582.localdomain Nov 23 04:49:01 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:01 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:01 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.022 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.023 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11720MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.023 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.023 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.125 281956 DEBUG 
nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.125 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.126 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.145 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.172 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.173 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.193 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.233 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: 
COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_SSSE3,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.289 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.765 281956 DEBUG oslo_concurrency.processutils [None 
req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.771 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.787 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.790 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:49:02 localhost nova_compute[281952]: 2025-11-23 09:49:02.791 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.767s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:49:02 localhost ceph-mon[289735]: Reconfiguring mon.np0005532582 (monmap changed)... Nov 23 04:49:02 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532582 on np0005532582.localdomain Nov 23 04:49:02 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:02 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:02 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532582.gilwrz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:49:03 localhost nova_compute[281952]: 2025-11-23 09:49:03.792 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:49:03 localhost nova_compute[281952]: 2025-11-23 09:49:03.792 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:49:03 localhost nova_compute[281952]: 2025-11-23 09:49:03.793 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:49:04 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532582.gilwrz (monmap changed)... 
Nov 23 04:49:04 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532582.gilwrz on np0005532582.localdomain Nov 23 04:49:04 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:04 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:04 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:49:04 localhost nova_compute[281952]: 2025-11-23 09:49:04.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:49:04 localhost nova_compute[281952]: 2025-11-23 09:49:04.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:49:04 localhost nova_compute[281952]: 2025-11-23 09:49:04.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:49:04 localhost nova_compute[281952]: 2025-11-23 09:49:04.548 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:49:04 localhost nova_compute[281952]: 2025-11-23 09:49:04.550 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:49:04 localhost nova_compute[281952]: 2025-11-23 09:49:04.550 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:49:04 localhost nova_compute[281952]: 2025-11-23 09:49:04.550 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:49:04 localhost nova_compute[281952]: 2025-11-23 09:49:04.582 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:49:04 localhost nova_compute[281952]: 2025-11-23 09:49:04.583 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:49:04 localhost ceph-mon[289735]: mon.np0005532585@4(peon) e6 handle_command mon_command({"prefix": "status", "format": "json"} v 0) Nov 23 04:49:04 localhost ceph-mon[289735]: log_channel(audit) log [DBG] : from='client.? 172.18.0.103:0/1798289153' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch Nov 23 04:49:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:49:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:49:05 localhost systemd[1]: tmp-crun.15uqAf.mount: Deactivated successfully. 
Nov 23 04:49:05 localhost podman[290581]: 2025-11-23 09:49:05.029755901 +0000 UTC m=+0.087265969 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd) Nov 23 04:49:05 localhost ceph-mon[289735]: Reconfiguring mon.np0005532583 (monmap changed)... 
Nov 23 04:49:05 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532583 on np0005532583.localdomain Nov 23 04:49:05 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:05 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:05 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532583.orhywt (monmap changed)... Nov 23 04:49:05 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532583.orhywt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:49:05 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532583.orhywt on np0005532583.localdomain Nov 23 04:49:05 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:05 localhost nova_compute[281952]: 2025-11-23 09:49:05.038 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:49:05 localhost nova_compute[281952]: 2025-11-23 09:49:05.038 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:49:05 localhost nova_compute[281952]: 2025-11-23 09:49:05.038 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:49:05 localhost 
nova_compute[281952]: 2025-11-23 09:49:05.038 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:49:05 localhost podman[290581]: 2025-11-23 09:49:05.045433092 +0000 UTC m=+0.102943170 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 04:49:05 localhost podman[290582]: 2025-11-23 09:49:05.080617852 +0000 UTC m=+0.133261718 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:49:05 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:49:05 localhost podman[290582]: 2025-11-23 09:49:05.116181554 +0000 UTC m=+0.168825510 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 04:49:05 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 04:49:06 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:06 localhost ceph-mon[289735]: Reconfiguring crash.np0005532583 (monmap changed)... 
Nov 23 04:49:06 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532583.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:49:06 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain Nov 23 04:49:06 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:06 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:06 localhost ceph-mon[289735]: Reconfiguring crash.np0005532584 (monmap changed)... Nov 23 04:49:06 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:49:06 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain Nov 23 04:49:06 localhost nova_compute[281952]: 2025-11-23 09:49:06.136 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": 
"1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:49:06 localhost nova_compute[281952]: 2025-11-23 09:49:06.154 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:49:06 localhost nova_compute[281952]: 2025-11-23 09:49:06.155 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:49:06 localhost nova_compute[281952]: 2025-11-23 09:49:06.156 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:49:06 localhost nova_compute[281952]: 2025-11-23 09:49:06.156 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:49:06 localhost nova_compute[281952]: 2025-11-23 09:49:06.156 281956 DEBUG 
nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:49:07 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:07 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' Nov 23 04:49:07 localhost ceph-mon[289735]: Reconfiguring osd.2 (monmap changed)... Nov 23 04:49:07 localhost ceph-mon[289735]: from='mgr.14120 172.18.0.103:0/1791364452' entity='mgr.np0005532581.sxlgsx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 23 04:49:07 localhost ceph-mon[289735]: Reconfiguring daemon osd.2 on np0005532584.localdomain Nov 23 04:49:07 localhost nova_compute[281952]: 2025-11-23 09:49:07.153 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:49:07 localhost nova_compute[281952]: 2025-11-23 09:49:07.154 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:49:07 localhost nova_compute[281952]: 2025-11-23 09:49:07.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:49:07 localhost ceph-mon[289735]: mon.np0005532585@4(peon) e6 handle_command mon_command({"prefix": "mgr fail"} v 0) Nov 23 04:49:07 localhost ceph-mon[289735]: 
log_channel(audit) log [INF] : from='client.? 172.18.0.103:0/443540260' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 23 04:49:07 localhost ceph-mon[289735]: mon.np0005532585@4(peon).osd e81 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 Nov 23 04:49:07 localhost ceph-mon[289735]: mon.np0005532585@4(peon).osd e81 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 Nov 23 04:49:07 localhost ceph-mon[289735]: mon.np0005532585@4(peon).osd e82 e82: 6 total, 6 up, 6 in Nov 23 04:49:07 localhost systemd[1]: session-19.scope: Deactivated successfully. Nov 23 04:49:07 localhost systemd[1]: session-26.scope: Deactivated successfully. Nov 23 04:49:07 localhost systemd[1]: session-26.scope: Consumed 3min 24.930s CPU time. Nov 23 04:49:07 localhost systemd[1]: session-23.scope: Deactivated successfully. Nov 23 04:49:07 localhost systemd-logind[761]: Session 26 logged out. Waiting for processes to exit. Nov 23 04:49:07 localhost systemd[1]: session-22.scope: Deactivated successfully. Nov 23 04:49:07 localhost systemd[1]: session-24.scope: Deactivated successfully. Nov 23 04:49:07 localhost systemd-logind[761]: Session 19 logged out. Waiting for processes to exit. Nov 23 04:49:07 localhost systemd-logind[761]: Session 23 logged out. Waiting for processes to exit. Nov 23 04:49:07 localhost systemd-logind[761]: Session 24 logged out. Waiting for processes to exit. Nov 23 04:49:07 localhost systemd-logind[761]: Session 22 logged out. Waiting for processes to exit. Nov 23 04:49:07 localhost systemd[1]: session-21.scope: Deactivated successfully. Nov 23 04:49:07 localhost systemd[1]: session-16.scope: Deactivated successfully. Nov 23 04:49:07 localhost systemd[1]: session-14.scope: Deactivated successfully. Nov 23 04:49:07 localhost systemd[1]: session-25.scope: Deactivated successfully. Nov 23 04:49:07 localhost systemd[1]: session-17.scope: Deactivated successfully. 
Nov 23 04:49:07 localhost systemd[1]: session-18.scope: Deactivated successfully. Nov 23 04:49:07 localhost systemd[1]: session-20.scope: Deactivated successfully. Nov 23 04:49:07 localhost systemd-logind[761]: Removed session 19. Nov 23 04:49:07 localhost systemd-logind[761]: Session 17 logged out. Waiting for processes to exit. Nov 23 04:49:07 localhost systemd-logind[761]: Session 25 logged out. Waiting for processes to exit. Nov 23 04:49:07 localhost systemd-logind[761]: Session 14 logged out. Waiting for processes to exit. Nov 23 04:49:07 localhost systemd-logind[761]: Session 20 logged out. Waiting for processes to exit. Nov 23 04:49:07 localhost systemd-logind[761]: Session 18 logged out. Waiting for processes to exit. Nov 23 04:49:07 localhost systemd-logind[761]: Session 21 logged out. Waiting for processes to exit. Nov 23 04:49:07 localhost systemd-logind[761]: Session 16 logged out. Waiting for processes to exit. Nov 23 04:49:07 localhost systemd-logind[761]: Removed session 26. Nov 23 04:49:07 localhost systemd-logind[761]: Removed session 23. Nov 23 04:49:07 localhost systemd-logind[761]: Removed session 22. Nov 23 04:49:07 localhost systemd-logind[761]: Removed session 24. Nov 23 04:49:07 localhost systemd-logind[761]: Removed session 21. Nov 23 04:49:07 localhost systemd-logind[761]: Removed session 16. Nov 23 04:49:07 localhost systemd-logind[761]: Removed session 14. Nov 23 04:49:07 localhost systemd-logind[761]: Removed session 25. Nov 23 04:49:07 localhost systemd-logind[761]: Removed session 17. Nov 23 04:49:07 localhost systemd-logind[761]: Removed session 18. Nov 23 04:49:07 localhost systemd-logind[761]: Removed session 20. Nov 23 04:49:08 localhost sshd[290624]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:49:08 localhost ceph-mon[289735]: from='client.? 
' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 23 04:49:08 localhost ceph-mon[289735]: Activating manager daemon np0005532583.orhywt Nov 23 04:49:08 localhost ceph-mon[289735]: from='client.? 172.18.0.103:0/443540260' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 23 04:49:08 localhost ceph-mon[289735]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 23 04:49:08 localhost ceph-mon[289735]: Manager daemon np0005532583.orhywt is now available Nov 23 04:49:08 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532583.orhywt/mirror_snapshot_schedule"} : dispatch Nov 23 04:49:08 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532583.orhywt/mirror_snapshot_schedule"} : dispatch Nov 23 04:49:08 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532583.orhywt/trash_purge_schedule"} : dispatch Nov 23 04:49:08 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532583.orhywt/trash_purge_schedule"} : dispatch Nov 23 04:49:08 localhost systemd-logind[761]: New session 64 of user ceph-admin. Nov 23 04:49:08 localhost ceph-mon[289735]: mon.np0005532585@4(peon).osd e82 _set_new_cache_sizes cache_size:1019743782 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:49:08 localhost systemd[1]: Started Session 64 of User ceph-admin. 
Nov 23 04:49:09 localhost podman[290737]: 2025-11-23 09:49:09.282559027 +0000 UTC m=+0.105217091 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, ceph=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=553, architecture=x86_64, version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=) Nov 23 04:49:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:49:09.285 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:49:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:49:09.286 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:49:09 
localhost ovn_metadata_agent[160434]: 2025-11-23 09:49:09.288 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:49:09 localhost podman[290737]: 2025-11-23 09:49:09.364415846 +0000 UTC m=+0.187073890 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=7, name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=553, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph) Nov 23 04:49:09 localhost nova_compute[281952]: 2025-11-23 09:49:09.583 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:49:09 localhost nova_compute[281952]: 2025-11-23 09:49:09.585 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:49:09 localhost nova_compute[281952]: 2025-11-23 09:49:09.586 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:49:09 localhost nova_compute[281952]: 2025-11-23 09:49:09.586 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:49:09 localhost nova_compute[281952]: 2025-11-23 09:49:09.641 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:49:09 localhost nova_compute[281952]: 2025-11-23 09:49:09.642 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:49:09 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:09 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.806 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling 
/usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.807 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 04:49:10 localhost ceph-mon[289735]: [23/Nov/2025:09:49:09] ENGINE Bus STARTING
Nov 23 04:49:10 localhost ceph-mon[289735]: [23/Nov/2025:09:49:09] ENGINE Serving on http://172.18.0.105:8765
Nov 23 04:49:10 localhost ceph-mon[289735]: [23/Nov/2025:09:49:09] ENGINE Serving on https://172.18.0.105:7150
Nov 23 04:49:10 localhost ceph-mon[289735]: [23/Nov/2025:09:49:09] ENGINE Bus STARTED
Nov 23 04:49:10 localhost ceph-mon[289735]: [23/Nov/2025:09:49:09] ENGINE Client ('172.18.0.105', 46326) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 04:49:10 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:10 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:10 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:10 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:10 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:10 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:10 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:10 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:10 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:10 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.841 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.842 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7077b846-9a10-4bd6-a7b8-739ece95bb37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.807807', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a965e2a2-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '943d2b85e492b579799ca682ba37ba97005542272c2424b5b5d9de4fb7687340'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.807807', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a965f85a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '73c1aca9b1835254b14a46633df8961b966de7a3246e66e91e472b2f203ca3ad'}]}, 'timestamp': '2025-11-23 09:49:10.842598', '_unique_id': '94733f6fd87d41508da374bc7b000004'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.844 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.845 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.849 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec27a55c-a2c9-44fb-a2a8-f4d991e454ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.845736', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a9672a54-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'c885fd54b708dd5d63fd14e05c5d08893d07e16d76d308d43aa90fab318e094e'}]}, 'timestamp': '2025-11-23 09:49:10.850479', '_unique_id': 'cd99aed6501c44bcbbeed944dff12581'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.851 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.853 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.865 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eaa61345-223a-4bd8-970b-0d13de27e4af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:49:10.853350', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a9697cdc-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.04258916, 'message_signature': '86622d310ad3a44dffdc2700c063d254a4f8109b8398d2c663c33d0e488bb05a'}]}, 'timestamp': '2025-11-23 09:49:10.865709', '_unique_id': 'b77d4a60de7b440c96c7f336edbbf159'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.867 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.868 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 04:49:10
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fa408f9-1084-4620-ade1-aabeb2c2e6fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.868469', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a969fcfc-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 
'message_signature': 'ce936cc1ad7e7a9a4efe34e6e96b94862440746c2fa0a392fcc5a1992ad6884f'}]}, 'timestamp': '2025-11-23 09:49:10.868993', '_unique_id': 'be3265b57fe54851a7826da31adb1608'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.870 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.871 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.881 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '84a8a26a-9b78-429d-9d78-9c341b08844c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.871446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a96be6b6-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.049116724, 'message_signature': '6744464af0e5959125e9544a4c60682bbc09c0ea9000a962c9fec49cf02631fe'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.871446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a96c0196-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.049116724, 'message_signature': '762142512675724d4de1fc2ecfbc3ff30dcb3b41beeb06f8d3e1ae611658a1d5'}]}, 'timestamp': '2025-11-23 09:49:10.882170', '_unique_id': '052614046e0b4cf09959915894e8abaa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:49:10.883 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:49:10.883 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.883 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.885 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.885 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.886 12 
DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1e146ba-f155-4f44-90e6-d61b3dad4384', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.885436', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a96c96ec-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '640a2f9a70b8ba11ee5158b39d6f149e2e3f4a7ccd7d7b0c0ac9514ab1982d3c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.885436', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a96caf2e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '39dc2fcd56befd33f7ddfe85ed868f08fa60dd8b5c2422fd4a23eb32a483d334'}]}, 'timestamp': '2025-11-23 09:49:10.886675', '_unique_id': '5652d85864d8444a9d1bf8a06f488804'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:49:10.888 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.888 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.889 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify 
/usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.890 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.890 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5c6f3b2-280b-418c-90c4-ea227bcdcf11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.890342', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a96d56a4-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'e88353e3a4d68516d01b389b301b673ab49ed0dae8ff9ce8f61cd921e2c3549c'}]}, 'timestamp': '2025-11-23 09:49:10.891011', '_unique_id': 'e01bd84fbd3143d08df5c0d15b0d7643'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 
04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging 
self.sock.connect(sa) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging return 
self._send(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging return 
self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.892 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.893 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.893 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.894 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '036ae7b8-66c3-431f-879c-d8606310c590', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.893749', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a96dd980-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': 'fcbfdbba2e27ad20fa86f0cfe05793b262fc5fb356c396c8799baea1d43314a9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.893749', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a96de9de-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '29a5b7b1110b625adabcac4a41cc654feac692a262d8624d303f4024b94dcd8b'}]}, 'timestamp': '2025-11-23 09:49:10.894644', '_unique_id': 'cd033693def74e1bae493273778eedeb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.895 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.896 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.896 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4edd0b66-e900-43ed-9ae0-4df0a6009f44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.897025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a96e5766-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.049116724, 'message_signature': '90a94507b1c79ebca7a15c8fe3c406876f13a8bd6f1e570aed3f929e26ba1c81'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 
'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.897025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a96e6832-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.049116724, 'message_signature': 'a0286463f6818a1a1525c46477dbed14cf4947529a17bcb86773cba77bebcdfa'}]}, 'timestamp': '2025-11-23 09:49:10.897870', '_unique_id': '45f0f5e0492344c88e1a799fba68d8d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.898 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.899 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.900 12 DEBUG ceilometer.compute.pollsters [-] 
355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 12890000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '597c1d3b-9182-41de-ad34-e7b92c705141', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12890000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:49:10.900048', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a96ecd90-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.04258916, 'message_signature': '582aa938219eed9827e8ad2b5cf155d4c9618d195fab01afefaf3086919539e3'}]}, 'timestamp': '2025-11-23 09:49:10.900481', '_unique_id': '7e7c7bdade164b3fbe9269d743988532'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 
04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c951ad9d-be67-4163-8e59-4261cc7ec705', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.902733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a96f3852-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '3bde47314ab4f47a251f575c31b4a1d9cd6c117df55f46371b82b061bf247d4e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.902733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a96f4aa4-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': 'b4e23aaee5a7c1200c6613a498b6d9c67c0863f5182439568ae1cec54a0b1b6d'}]}, 'timestamp': '2025-11-23 09:49:10.903687', '_unique_id': '2668aba8c7c44938bd2932224f79fdeb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.906 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3e926dc0-90ea-41ec-934f-4f32f18cc9c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.906177', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a96fbffc-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': '93a99b8676d1895ddadcd24894e78e3ce2b66b6df82a7f839b734ae31b32c256'}]}, 'timestamp': '2025-11-23 09:49:10.906993', '_unique_id': '848390cd3e1c411584437c3a80ef16e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.908 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.910 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07dc6da0-3017-4574-a67a-7cdd4cee7962', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.909985', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a970532c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'bf7d1bc5410df5f75d7aefc0f84f82a88d614b3dadc35535d9be9e3a50833828'}]}, 'timestamp': '2025-11-23 09:49:10.910490', '_unique_id': '369fa97b08e2484dbc35a597f6b43568'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 
12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:49:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.911 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
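The two-part tracebacks in this log come from Python's explicit exception chaining: kombu's `raise ConnectionError(str(exc)) from exc` (visible in the frame above) sets `__cause__` on the new exception, which is what makes the interpreter print both tracebacks joined by "The above exception was the direct cause of the following exception:". A minimal sketch of that mechanism, with illustrative names (this is not kombu's code):

```python
# Demonstrate explicit exception chaining ("raise ... from exc"),
# the mechanism behind the two-part tracebacks in the log above.
class LibraryConnectionError(Exception):
    """Stand-in for kombu.exceptions.OperationalError (illustrative)."""


def connect_or_wrap():
    try:
        # OSError subclasses accept (errno, strerror); 111 is ECONNREFUSED.
        raise ConnectionRefusedError(111, "Connection refused")
    except ConnectionRefusedError as exc:
        # "from exc" stores the original exception on __cause__, so Python
        # prints both tracebacks, linked by the "direct cause" banner.
        raise LibraryConnectionError(str(exc)) from exc


try:
    connect_or_wrap()
except LibraryConnectionError as err:
    assert isinstance(err.__cause__, ConnectionRefusedError)
    assert err.__cause__.errno == 111
```

This is why the log shows the low-level `ConnectionRefusedError` first and the wrapping `kombu.exceptions.OperationalError` second for every failed notification.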
Payload={'message_id': '6d7362bf-583e-4cf2-92c2-b01035515761', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.912803', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a970c19a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'd3ce5bf718bed2d7a843978aaea4a5577443b6d4be27c22d88d81d37c8c332ee'}]}, 'timestamp': '2025-11-23 09:49:10.913305', '_unique_id': 'b0afe3211d3448f2a42bba2568b0b9ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.914 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd7da782-0998-4655-a42f-c7e817459fd3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.915126', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9711848-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '44bfecdf48dbd17be2dda1695081272b6ed2393de4f4469d4c0a63fb43a8066e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.915126', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a97122e8-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': 'f3d1b0f1d21cd12ae4a02a5bfb9831d7ffaa1327dfa68d9a1e4fc652430f5cd3'}]}, 'timestamp': '2025-11-23 09:49:10.915762', '_unique_id': '289b96b915c742afaebc09f9798c5f6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.916 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.917 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.917 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '548669e5-74d8-44f3-a488-6d4d4578e903', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.917225', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9716a00-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': 'cdb8d239cc02b221848d172fb7521a5576f1da005d877fe3c2783015b62d9c6f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:49:10.917225', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a97175b8-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11293.985438484, 'message_signature': '17ce090e2c63ce5d97ee1dbb7ea021b7fff06a652a363444747234e91460b4a2'}]}, 'timestamp': '2025-11-23 09:49:10.917811', '_unique_id': '389bf9cb8f9543c29191e316237841c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.918 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.919 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f164214f-3829-49dd-a523-cea231417284', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:49:10.919284', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a971ba3c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.049116724, 'message_signature': '700ddbc9c24ef25b36841d39e42e5c622f0babfaac9533296196b547c812edf3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': 
'2025-11-23T09:49:10.919284', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a971c4c8-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.049116724, 'message_signature': 'a4b4c92c98c55d18daf739fe85e226ffbc71f994b8aa07a6bee39a81e4fdcdff'}]}, 'timestamp': '2025-11-23 09:49:10.919833', '_unique_id': '341f28fc21b74b9abf55b6410aa7a9e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.920 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '01c96ddc-9cee-4e83-9679-ba9a3c1614ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.921476', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a9721072-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'c6cab1972da46204453aa8bf869ae6ad457635f6b3194f92d80ae693c58067dc'}]}, 'timestamp': '2025-11-23 09:49:10.921796', '_unique_id': 'be060c2f5ecb4d78958882d23b2941bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.922 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'beae3ec8-7bad-4249-97c5-69d223645acd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.923460', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a9725dc0-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'a3ac68f29405c464564f4a45f946d0ed0e266c2cae644622e34ccc464c9eec03'}]}, 'timestamp': '2025-11-23 09:49:10.923782', '_unique_id': '2b54399165754d17bc2aab614b4cf812'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.924 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e0a0514-54fd-41a5-ba36-aab44cff159b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.925251', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a972a3c0-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': 'd47d662b7f7f29343ba043cbe5470bb6f3b14ed2e1f2f1717dc6ac71e93b96ad'}]}, 'timestamp': '2025-11-23 09:49:10.925616', '_unique_id': '4e101cb44cd54cdeb646aaaf29f1934c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c6ee195-eb72-4d6f-bc4f-0a671149939d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:49:10.927047', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'a972e9ca-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11294.02339573, 'message_signature': '6cdd137a14507bd17c4d2085331dbf7455f9d6ba6db2ce71e4f7cc448e822a9e'}]}, 'timestamp': '2025-11-23 09:49:10.927359', '_unique_id': '927f64cde45c455a986b72f3c1eadea0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]:
2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:49:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 
04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:49:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:49:10.927 12 ERROR oslo_messaging.notify.messaging Nov 23 04:49:11 localhost podman[240668]: time="2025-11-23T09:49:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:49:11 localhost podman[240668]: @ - - [23/Nov/2025:09:49:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 04:49:11 localhost podman[240668]: @ - - [23/Nov/2025:09:49:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18687 "" "Go-http-client/1.1" Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd/host:np0005532583", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd/host:np0005532583", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:12 localhost ceph-mon[289735]: 
from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd/host:np0005532581", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd/host:np0005532581", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: Adjusting osd_memory_target on np0005532584.localdomain to 836.6M Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} 
: dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:12 localhost ceph-mon[289735]: Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:12 localhost ceph-mon[289735]: Adjusting osd_memory_target on np0005532585.localdomain to 836.6M Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:12 localhost ceph-mon[289735]: Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' 
entity='mgr.np0005532583.orhywt' Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd/host:np0005532582", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: Adjusting osd_memory_target on np0005532586.localdomain to 836.6M Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "config rm", "who": "osd/host:np0005532582", "name": "osd_memory_target"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Nov 23 04:49:12 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:49:12 localhost ceph-mon[289735]: Updating np0005532581.localdomain:/etc/ceph/ceph.conf Nov 23 04:49:12 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.conf Nov 23 04:49:12 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf Nov 23 04:49:12 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf Nov 23 04:49:12 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf Nov 23 04:49:12 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf Nov 23 04:49:13 localhost ceph-mon[289735]: mon.np0005532585@4(peon).osd e82 _set_new_cache_sizes cache_size:1020048401 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:49:13 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:49:13 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 
04:49:13 localhost ceph-mon[289735]: Updating np0005532581.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:49:13 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:49:13 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:49:13 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:49:13 localhost ceph-mon[289735]: Updating np0005532581.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:49:13 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:49:13 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:49:14 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:49:14 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:49:14 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:49:14 localhost ceph-mon[289735]: Updating np0005532581.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:49:14 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:49:14 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:49:14 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:49:14 localhost ceph-mon[289735]: Updating 
np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:49:14 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:49:14 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:14 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:14 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:14 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:14 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:14 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:14 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:14 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:14 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:14 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:14 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:14 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:14 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:14 localhost nova_compute[281952]: 2025-11-23 09:49:14.642 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:49:15 localhost ceph-mon[289735]: Reconfiguring osd.2 (monmap changed)... 
Nov 23 04:49:15 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 23 04:49:15 localhost ceph-mon[289735]: Reconfiguring daemon osd.2 on np0005532584.localdomain Nov 23 04:49:16 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:16 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:16 localhost ceph-mon[289735]: Reconfiguring osd.5 (monmap changed)... Nov 23 04:49:16 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 23 04:49:17 localhost ceph-mon[289735]: Reconfiguring daemon osd.5 on np0005532584.localdomain Nov 23 04:49:17 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:17 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:17 localhost ceph-mon[289735]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)... 
Nov 23 04:49:17 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:49:17 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:49:17 localhost ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain Nov 23 04:49:18 localhost ceph-mon[289735]: mon.np0005532585@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054597 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:49:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:49:18 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:18 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:18 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:18 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)... 
Nov 23 04:49:18 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:49:18 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:49:18 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain Nov 23 04:49:18 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:18 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:18 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:49:19 localhost podman[291635]: 2025-11-23 09:49:19.03354329 +0000 UTC m=+0.084723790 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm) Nov 23 04:49:19 localhost podman[291635]: 2025-11-23 09:49:19.072443896 +0000 UTC m=+0.123624386 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Nov 23 04:49:19 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:49:19 localhost nova_compute[281952]: 2025-11-23 09:49:19.644 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:49:20 localhost podman[291707]: Nov 23 04:49:20 localhost podman[291707]: 2025-11-23 09:49:20.461160711 +0000 UTC m=+0.079803646 container create 2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_feynman, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, version=7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , distribution-scope=public, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=, 
io.openshift.expose-services=, name=rhceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, release=553, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 04:49:20 localhost systemd[1]: Started libpod-conmon-2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9.scope. Nov 23 04:49:20 localhost systemd[1]: Started libcrun container. Nov 23 04:49:20 localhost podman[291707]: 2025-11-23 09:49:20.429039017 +0000 UTC m=+0.047682012 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:49:20 localhost podman[291707]: 2025-11-23 09:49:20.538873781 +0000 UTC m=+0.157516716 container init 2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_feynman, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, release=553, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, GIT_BRANCH=main, architecture=x86_64, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , vcs-type=git, name=rhceph, version=7, 
com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 04:49:20 localhost podman[291707]: 2025-11-23 09:49:20.550002949 +0000 UTC m=+0.168645884 container start 2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_feynman, release=553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , RELEASE=main, description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:49:20 localhost podman[291707]: 2025-11-23 09:49:20.550479244 +0000 UTC m=+0.169122229 container attach 2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_feynman, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, 
io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12) Nov 23 04:49:20 localhost loving_feynman[291721]: 167 167 Nov 23 04:49:20 localhost systemd[1]: libpod-2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9.scope: Deactivated successfully. Nov 23 04:49:20 localhost podman[291707]: 2025-11-23 09:49:20.554299614 +0000 UTC m=+0.172942569 container died 2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_feynman, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=553, ceph=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, version=7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, name=rhceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, CEPH_POINT_RELEASE=) Nov 23 04:49:20 localhost podman[291726]: 2025-11-23 09:49:20.655166998 +0000 UTC m=+0.088737446 container remove 2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_feynman, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-type=git, ceph=True, name=rhceph, GIT_CLEAN=True, release=553, version=7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.component=rhceph-container) Nov 23 04:49:20 localhost systemd[1]: libpod-conmon-2f5b7feee7beab5b2af7dede6b41835995d4a6054695c665dbf5fc77b65a1cf9.scope: Deactivated successfully. Nov 23 04:49:20 localhost ceph-mon[289735]: Reconfiguring mon.np0005532584 (monmap changed)... 
Nov 23 04:49:20 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain Nov 23 04:49:20 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:20 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:20 localhost ceph-mon[289735]: Reconfiguring crash.np0005532585 (monmap changed)... Nov 23 04:49:20 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:49:20 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:49:20 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain Nov 23 04:49:20 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:20 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:20 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 23 04:49:21 localhost podman[291797]: Nov 23 04:49:21 localhost podman[291797]: 2025-11-23 09:49:21.387878249 +0000 UTC m=+0.081394726 container create cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_taussig, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat 
Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, architecture=x86_64, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, ceph=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 23 04:49:21 localhost systemd[1]: Started libpod-conmon-cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea.scope. Nov 23 04:49:21 localhost systemd[1]: Started libcrun container. Nov 23 04:49:21 localhost podman[291797]: 2025-11-23 09:49:21.451234331 +0000 UTC m=+0.144750798 container init cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_taussig, maintainer=Guillaume Abrioux , architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, version=7, ceph=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, 
CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, name=rhceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 04:49:21 localhost podman[291797]: 2025-11-23 09:49:21.352168263 +0000 UTC m=+0.045684790 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:49:21 localhost podman[291797]: 2025-11-23 09:49:21.459139238 +0000 UTC m=+0.152655685 container start cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_taussig, version=7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.33.12, vendor=Red Hat, Inc., ceph=True, name=rhceph, description=Red Hat Ceph Storage 7, release=553, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55) Nov 23 04:49:21 localhost podman[291797]: 2025-11-23 09:49:21.459471738 +0000 UTC m=+0.152988205 container attach cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_taussig, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on 
RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, name=rhceph, GIT_CLEAN=True, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., release=553, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:49:21 localhost angry_taussig[291812]: 167 167 Nov 23 04:49:21 localhost systemd[1]: libpod-cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea.scope: Deactivated successfully. 
Nov 23 04:49:21 localhost podman[291797]: 2025-11-23 09:49:21.464351861 +0000 UTC m=+0.157868358 container died cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_taussig, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, distribution-scope=public, RELEASE=main, io.openshift.expose-services=, version=7, GIT_BRANCH=main, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux ) Nov 23 04:49:21 localhost systemd[1]: var-lib-containers-storage-overlay-a0998737b2ad229fca58596bf13e0c669a0bf094ca288df092c5077416bf7dc9-merged.mount: Deactivated successfully. Nov 23 04:49:21 localhost systemd[1]: var-lib-containers-storage-overlay-1fc7c22e46194956e486f3caf689085d24fc60b8b4ff37693a607a99b020a1bb-merged.mount: Deactivated successfully. 
Nov 23 04:49:21 localhost podman[291817]: 2025-11-23 09:49:21.568978972 +0000 UTC m=+0.091320936 container remove cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_taussig, vcs-type=git, ceph=True, release=553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55) Nov 23 04:49:21 localhost systemd[1]: libpod-conmon-cd04457046c3441ccab6b13f8cbca6d5b5b21c8660b52813b5260889e93d10ea.scope: Deactivated successfully. Nov 23 04:49:21 localhost ceph-mon[289735]: Reconfiguring osd.0 (monmap changed)... 
Nov 23 04:49:21 localhost ceph-mon[289735]: Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:49:21 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:21 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' Nov 23 04:49:21 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 23 04:49:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:49:22 localhost podman[291892]: 2025-11-23 09:49:22.381456029 +0000 UTC m=+0.091220454 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:49:22 localhost podman[291892]: 2025-11-23 09:49:22.397292814 +0000 UTC m=+0.107057259 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, 
name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:49:22 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:49:22 localhost podman[291900]: Nov 23 04:49:22 localhost podman[291900]: 2025-11-23 09:49:22.445802481 +0000 UTC m=+0.135246090 container create 921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_curran, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-type=git, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume 
Abrioux , version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 04:49:22 localhost systemd[1]: Started libpod-conmon-921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350.scope. Nov 23 04:49:22 localhost systemd[1]: Started libcrun container. Nov 23 04:49:22 localhost podman[291900]: 2025-11-23 09:49:22.414035898 +0000 UTC m=+0.103479547 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:49:22 localhost podman[291900]: 2025-11-23 09:49:22.5222244 +0000 UTC m=+0.211668009 container init 921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_curran, maintainer=Guillaume Abrioux , GIT_CLEAN=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, name=rhceph, vcs-type=git, release=553, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, RELEASE=main) Nov 23 04:49:22 localhost podman[291900]: 2025-11-23 09:49:22.534007138 +0000 UTC m=+0.223450747 container start 921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_curran, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, RELEASE=main, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, name=rhceph, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:49:22 localhost tender_curran[291933]: 167 167 Nov 23 04:49:22 localhost podman[291900]: 2025-11-23 09:49:22.537583921 +0000 UTC m=+0.227027570 container attach 921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_curran, maintainer=Guillaume Abrioux , release=553, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, io.buildah.version=1.33.12, distribution-scope=public, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=rhceph-container, name=rhceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 23 04:49:22 localhost systemd[1]: libpod-921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350.scope: Deactivated successfully. Nov 23 04:49:22 localhost podman[291900]: 2025-11-23 09:49:22.540824672 +0000 UTC m=+0.230268331 container died 921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_curran, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_CLEAN=True, name=rhceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 04:49:22 localhost podman[291938]: 2025-11-23 09:49:22.647390784 +0000 UTC m=+0.095667962 container remove 
921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_curran, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=553, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 04:49:22 localhost systemd[1]: libpod-conmon-921d994affb4c114363c26660f9c680584eeef3b90c1fb22440abd45bf6ad350.scope: Deactivated successfully. Nov 23 04:49:22 localhost ceph-mon[289735]: Reconfiguring osd.3 (monmap changed)... Nov 23 04:49:22 localhost ceph-mon[289735]: Reconfiguring daemon osd.3 on np0005532585.localdomain Nov 23 04:49:23 localhost ceph-mon[289735]: mon.np0005532585@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054729 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:49:23 localhost systemd[1]: var-lib-containers-storage-overlay-efe11e68d075679b9233858ce9f9507c795f6ea1656950c6827243b2718d7f22-merged.mount: Deactivated successfully. 
Nov 23 04:49:23 localhost podman[292014]: Nov 23 04:49:23 localhost podman[292014]: 2025-11-23 09:49:23.518289107 +0000 UTC m=+0.080140737 container create 2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_greider, architecture=x86_64, name=rhceph, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, release=553, RELEASE=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 23 04:49:23 localhost systemd[1]: Started libpod-conmon-2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0.scope. Nov 23 04:49:23 localhost systemd[1]: Started libcrun container. 
Nov 23 04:49:23 localhost podman[292014]: 2025-11-23 09:49:23.582664091 +0000 UTC m=+0.144515721 container init 2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_greider, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_CLEAN=True, ceph=True, RELEASE=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7)
Nov 23 04:49:23 localhost podman[292014]: 2025-11-23 09:49:23.486296096 +0000 UTC m=+0.048147776 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 04:49:23 localhost podman[292014]: 2025-11-23 09:49:23.592619231 +0000 UTC m=+0.154470841 container start 2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_greider, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, RELEASE=main, vcs-type=git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Nov 23 04:49:23 localhost podman[292014]: 2025-11-23 09:49:23.593039375 +0000 UTC m=+0.154891005 container attach 2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_greider, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, GIT_BRANCH=main, architecture=x86_64, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc.)
Nov 23 04:49:23 localhost vigilant_greider[292029]: 167 167
Nov 23 04:49:23 localhost systemd[1]: libpod-2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0.scope: Deactivated successfully.
Nov 23 04:49:23 localhost podman[292014]: 2025-11-23 09:49:23.598694922 +0000 UTC m=+0.160546532 container died 2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_greider, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, com.redhat.component=rhceph-container, version=7, io.k8s.description=Red Hat Ceph Storage 7, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.buildah.version=1.33.12, ceph=True, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 04:49:23 localhost podman[292034]: 2025-11-23 09:49:23.695384545 +0000 UTC m=+0.085948179 container remove 2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_greider, name=rhceph, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_BRANCH=main, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, release=553)
Nov 23 04:49:23 localhost systemd[1]: libpod-conmon-2c660c42f44bda73d4a62bbc80285c51954e12982652955dbf3474770a23f9a0.scope: Deactivated successfully.
Nov 23 04:49:23 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:23 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:23 localhost ceph-mon[289735]: Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 04:49:23 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 04:49:23 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 04:49:23 localhost ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 04:49:23 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:23 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:23 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 04:49:23 localhost ceph-mon[289735]: from='mgr.14190 ' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 04:49:24 localhost podman[292105]:
Nov 23 04:49:24 localhost podman[292105]: 2025-11-23 09:49:24.32873304 +0000 UTC m=+0.047218698 container create b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_driscoll, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.buildah.version=1.33.12, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, name=rhceph, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, version=7, RELEASE=main)
Nov 23 04:49:24 localhost ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb1591e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0
Nov 23 04:49:24 localhost ceph-mon[289735]: mon.np0005532585@4(peon) e7 my rank is now 3 (was 4)
Nov 23 04:49:24 localhost ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 23 04:49:24 localhost ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 23 04:49:24 localhost ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb159600 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Nov 23 04:49:24 localhost ceph-mon[289735]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 04:49:24 localhost ceph-mon[289735]: paxos.3).electionLogic(26) init, last seen epoch 26
Nov 23 04:49:24 localhost ceph-mon[289735]: mon.np0005532585@3(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:49:24 localhost ceph-mon[289735]: mon.np0005532585@3(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:49:24 localhost ceph-mon[289735]: mon.np0005532585@3(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:49:24 localhost systemd[1]: Started libpod-conmon-b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf.scope.
Nov 23 04:49:24 localhost systemd[1]: Started libcrun container.
Nov 23 04:49:24 localhost podman[292105]: 2025-11-23 09:49:24.408304089 +0000 UTC m=+0.126789737 container init b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_driscoll, io.buildah.version=1.33.12, release=553, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, maintainer=Guillaume Abrioux , RELEASE=main, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7)
Nov 23 04:49:24 localhost podman[292105]: 2025-11-23 09:49:24.3086004 +0000 UTC m=+0.027086019 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 04:49:24 localhost podman[292105]: 2025-11-23 09:49:24.414328717 +0000 UTC m=+0.132814345 container start b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_driscoll, ceph=True, release=553, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, version=7, RELEASE=main)
Nov 23 04:49:24 localhost podman[292105]: 2025-11-23 09:49:24.414531463 +0000 UTC m=+0.133017121 container attach b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_driscoll, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux , io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, release=553)
Nov 23 04:49:24 localhost sharp_driscoll[292120]: 167 167
Nov 23 04:49:24 localhost systemd[1]: libpod-b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf.scope: Deactivated successfully.
Nov 23 04:49:24 localhost podman[292105]: 2025-11-23 09:49:24.416507614 +0000 UTC m=+0.134993242 container died b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_driscoll, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.component=rhceph-container, release=553, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, ceph=True, vcs-type=git, maintainer=Guillaume Abrioux , version=7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., RELEASE=main)
Nov 23 04:49:24 localhost systemd[1]: tmp-crun.MNQdaR.mount: Deactivated successfully.
Nov 23 04:49:24 localhost systemd[1]: var-lib-containers-storage-overlay-4d9a124529c96da9428c378bcfe99ca14134ff53d5ff5bfc51382c1ed6944592-merged.mount: Deactivated successfully.
Nov 23 04:49:24 localhost systemd[1]: var-lib-containers-storage-overlay-19f0ad4f30e6e6a8d15ebb142551279c5c74c2cad6c5118f7e7ac59710c75095-merged.mount: Deactivated successfully.
Nov 23 04:49:24 localhost podman[292125]: 2025-11-23 09:49:24.496384352 +0000 UTC m=+0.068318347 container remove b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_driscoll, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=553, CEPH_POINT_RELEASE=, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, ceph=True, name=rhceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, description=Red Hat Ceph Storage 7)
Nov 23 04:49:24 localhost systemd[1]: libpod-conmon-b3e205525d6d43eb6b01e01a7fb6166a183b4a96103be376500c42e44b5e2bcf.scope: Deactivated successfully.
Nov 23 04:49:24 localhost ceph-mon[289735]: mon.np0005532585@3(peon) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:49:24 localhost nova_compute[281952]: 2025-11-23 09:49:24.649 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:49:24 localhost nova_compute[281952]: 2025-11-23 09:49:24.652 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:49:24 localhost nova_compute[281952]: 2025-11-23 09:49:24.652 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 23 04:49:24 localhost nova_compute[281952]: 2025-11-23 09:49:24.652 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:49:24 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)...
Nov 23 04:49:24 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain
Nov 23 04:49:24 localhost ceph-mon[289735]: Remove daemons mon.np0005532581
Nov 23 04:49:24 localhost ceph-mon[289735]: Safe to remove mon.np0005532581: new quorum should be ['np0005532583', 'np0005532582', 'np0005532586', 'np0005532585', 'np0005532584'] (from ['np0005532583', 'np0005532582', 'np0005532586', 'np0005532585', 'np0005532584'])
Nov 23 04:49:24 localhost ceph-mon[289735]: Removing monitor np0005532581 from monmap...
Nov 23 04:49:24 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "mon rm", "name": "np0005532581"} : dispatch
Nov 23 04:49:24 localhost ceph-mon[289735]: Removing daemon mon.np0005532581 from np0005532581.localdomain -- ports []
Nov 23 04:49:24 localhost ceph-mon[289735]: mon.np0005532586 calling monitor election
Nov 23 04:49:24 localhost ceph-mon[289735]: mon.np0005532582 calling monitor election
Nov 23 04:49:24 localhost nova_compute[281952]: 2025-11-23 09:49:24.675 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:49:24 localhost ceph-mon[289735]: mon.np0005532583 calling monitor election
Nov 23 04:49:24 localhost ceph-mon[289735]: mon.np0005532585 calling monitor election
Nov 23 04:49:24 localhost nova_compute[281952]: 2025-11-23 09:49:24.677 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:49:24 localhost ceph-mon[289735]: mon.np0005532584 calling monitor election
Nov 23 04:49:24 localhost ceph-mon[289735]: mon.np0005532583 is new leader, mons np0005532583,np0005532582,np0005532586,np0005532585,np0005532584 in quorum (ranks 0,1,2,3,4)
Nov 23 04:49:24 localhost ceph-mon[289735]: overall HEALTH_OK
Nov 23 04:49:24 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:24 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:24 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 04:49:25 localhost podman[292195]:
Nov 23 04:49:25 localhost podman[292195]: 2025-11-23 09:49:25.246746837 +0000 UTC m=+0.082045147 container create 99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_tesla, CEPH_POINT_RELEASE=, RELEASE=main, name=rhceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , version=7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git)
Nov 23 04:49:25 localhost systemd[1]: Started libpod-conmon-99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f.scope.
Nov 23 04:49:25 localhost systemd[1]: Started libcrun container.
Nov 23 04:49:25 localhost podman[292195]: 2025-11-23 09:49:25.308672733 +0000 UTC m=+0.143971073 container init 99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_tesla, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., name=rhceph, vcs-type=git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, version=7, io.openshift.expose-services=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public)
Nov 23 04:49:25 localhost podman[292195]: 2025-11-23 09:49:25.215116808 +0000 UTC m=+0.050415208 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 04:49:25 localhost podman[292195]: 2025-11-23 09:49:25.316033613 +0000 UTC m=+0.151331933 container start 99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_tesla, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, RELEASE=main, version=7, vendor=Red Hat, Inc., ceph=True, GIT_BRANCH=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 04:49:25 localhost podman[292195]: 2025-11-23 09:49:25.31623456 +0000 UTC m=+0.151532900 container attach 99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_tesla, name=rhceph, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_CLEAN=True, release=553, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 04:49:25 localhost magical_tesla[292210]: 167 167
Nov 23 04:49:25 localhost systemd[1]: libpod-99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f.scope: Deactivated successfully.
Nov 23 04:49:25 localhost podman[292195]: 2025-11-23 09:49:25.319610605 +0000 UTC m=+0.154909025 container died 99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_tesla, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., version=7, RELEASE=main, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 04:49:25 localhost podman[292215]: 2025-11-23 09:49:25.411347064 +0000 UTC m=+0.080945492 container remove 99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_tesla, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, maintainer=Guillaume Abrioux , RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, version=7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, ceph=True)
Nov 23 04:49:25 localhost systemd[1]: libpod-conmon-99a435c996b4cf9b84fb555b9346f42e0d70447c7a09a9b74a76b22df3be928f.scope: Deactivated successfully.
Nov 23 04:49:25 localhost systemd[1]: var-lib-containers-storage-overlay-f3a51811fb891e82ae0c5efa0fd91aa30419890bdf8bca7e1fbf4d62bbdedda6-merged.mount: Deactivated successfully.
Nov 23 04:49:25 localhost ceph-mon[289735]: Reconfiguring mon.np0005532585 (monmap changed)...
Nov 23 04:49:25 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532585 on np0005532585.localdomain
Nov 23 04:49:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 04:49:26 localhost ceph-mon[289735]: Reconfiguring crash.np0005532586 (monmap changed)...
Nov 23 04:49:26 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 04:49:26 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:26 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:26 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 04:49:27 localhost ceph-mon[289735]: Reconfiguring osd.1 (monmap changed)...
Nov 23 04:49:27 localhost ceph-mon[289735]: Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 04:49:27 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:27 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:27 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 23 04:49:27 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:28 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 04:49:28 localhost ceph-mon[289735]: Reconfiguring osd.4 (monmap changed)...
Nov 23 04:49:28 localhost ceph-mon[289735]: Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 04:49:28 localhost ceph-mon[289735]: Removed label mon from host np0005532581.localdomain
Nov 23 04:49:28 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:28 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:28 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 04:49:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 04:49:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 04:49:29 localhost podman[292232]: 2025-11-23 09:49:29.036294088 +0000 UTC m=+0.088993834 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 23 04:49:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 04:49:29 localhost podman[292233]: 2025-11-23 09:49:29.12560856 +0000 UTC m=+0.178407259 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal)
Nov 23 04:49:29 localhost podman[292232]: 2025-11-23 09:49:29.13837403 +0000 UTC m=+0.191073776 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 23 04:49:29 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 04:49:29 localhost podman[292233]: 2025-11-23 09:49:29.169513574 +0000 UTC m=+0.222312223 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, version=9.6, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products.
This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container) Nov 23 04:49:29 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:49:29 localhost podman[292275]: 2025-11-23 09:49:29.223225863 +0000 UTC m=+0.086912809 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 23 04:49:29 localhost podman[292275]: 2025-11-23 09:49:29.230090188 +0000 UTC 
m=+0.093777154 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 23 04:49:29 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:49:29 localhost nova_compute[281952]: 2025-11-23 09:49:29.675 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:49:29 localhost ceph-mon[289735]: Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)... Nov 23 04:49:29 localhost ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain Nov 23 04:49:29 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:29 localhost ceph-mon[289735]: Removed label mgr from host np0005532581.localdomain Nov 23 04:49:29 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:29 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:29 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:49:29 localhost openstack_network_exporter[242668]: ERROR 09:49:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:49:29 localhost openstack_network_exporter[242668]: ERROR 09:49:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:49:29 localhost openstack_network_exporter[242668]: ERROR 09:49:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:49:29 localhost openstack_network_exporter[242668]: ERROR 09:49:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:49:29 localhost openstack_network_exporter[242668]: Nov 23 04:49:29 localhost openstack_network_exporter[242668]: ERROR 09:49:29 
appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:49:29 localhost openstack_network_exporter[242668]: Nov 23 04:49:30 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532586.thmvqb (monmap changed)... Nov 23 04:49:30 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain Nov 23 04:49:30 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:30 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:30 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:49:30 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:31 localhost ceph-mon[289735]: Reconfiguring mon.np0005532586 (monmap changed)... Nov 23 04:49:31 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532586 on np0005532586.localdomain Nov 23 04:49:31 localhost ceph-mon[289735]: Removed label _admin from host np0005532581.localdomain Nov 23 04:49:31 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:31 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:32 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:32 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:32 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:49:32 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 
04:49:32 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:33 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:49:33 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0. Nov 23 04:49:33 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:33.937044) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 04:49:33 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13 Nov 23 04:49:33 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891373937113, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 11943, "num_deletes": 513, "total_data_size": 17596087, "memory_usage": 18490960, "flush_reason": "Manual Compaction"} Nov 23 04:49:33 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started Nov 23 04:49:33 localhost ceph-mon[289735]: Removing np0005532581.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:49:33 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.conf Nov 23 04:49:33 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf Nov 23 04:49:33 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf Nov 23 04:49:33 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf Nov 23 04:49:33 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf Nov 23 04:49:33 localhost 
ceph-mon[289735]: Removing np0005532581.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:49:33 localhost ceph-mon[289735]: Removing np0005532581.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:49:33 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:33 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:33 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:33 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:33 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:33 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:33 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:33 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:33 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:33 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:33 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891373995263, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 12135327, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 11948, "table_properties": {"data_size": 12078944, "index_size": 30101, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25221, "raw_key_size": 263943, 
"raw_average_key_size": 26, "raw_value_size": 11907585, "raw_average_value_size": 1181, "num_data_blocks": 1146, "num_entries": 10082, "num_filter_entries": 10082, "num_deletions": 512, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891318, "oldest_key_time": 1763891318, "file_creation_time": 1763891373, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}} Nov 23 04:49:33 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 58275 microseconds, and 27867 cpu microseconds. 
Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:33.995320) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 12135327 bytes OK Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:33.995341) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.001070) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.001089) EVENT_LOG_v1 {"time_micros": 1763891374001084, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0} Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.001104) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50 Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 17517697, prev total WAL file size 17542408, number of live WAL files 2. Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.003097) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. 
'7061786F73003130353433' seq:0, type:0; will stop at (end) Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00 Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(11MB) 8(1887B)] Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891374003194, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 12137214, "oldest_snapshot_seqno": -1} Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 9573 keys, 12127502 bytes, temperature: kUnknown Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891374068068, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 12127502, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12072447, "index_size": 30058, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23941, "raw_key_size": 255527, "raw_average_key_size": 26, "raw_value_size": 11907643, "raw_average_value_size": 1243, "num_data_blocks": 1144, "num_entries": 9573, "num_filter_entries": 9573, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; 
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891318, "oldest_key_time": 0, "file_creation_time": 1763891374, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}} Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.068326) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 12127502 bytes Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.069801) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 186.8 rd, 186.7 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(11.6, 0.0 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10087, records dropped: 514 output_compression: NoCompression Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.069822) EVENT_LOG_v1 {"time_micros": 1763891374069812, "job": 4, "event": "compaction_finished", "compaction_time_micros": 64966, "compaction_time_cpu_micros": 38566, "output_level": 6, "num_output_files": 1, "total_output_size": 12127502, "num_input_records": 10087, "num_output_records": 9573, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000014.sst immediately, 
rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891374071045, "job": 4, "event": "table_file_deletion", "file_number": 14} Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891374071091, "job": 4, "event": "table_file_deletion", "file_number": 8} Nov 23 04:49:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:34.002991) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:49:34 localhost sshd[292617]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:49:34 localhost nova_compute[281952]: 2025-11-23 09:49:34.679 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:49:34 localhost nova_compute[281952]: 2025-11-23 09:49:34.681 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:49:34 localhost nova_compute[281952]: 2025-11-23 09:49:34.681 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:49:34 localhost nova_compute[281952]: 2025-11-23 09:49:34.681 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:49:34 localhost nova_compute[281952]: 2025-11-23 09:49:34.706 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 
04:49:34 localhost nova_compute[281952]: 2025-11-23 09:49:34.707 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:49:34 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:49:34 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:49:34 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:49:34 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:49:34 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:49:34 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:34 localhost ceph-mon[289735]: Removing daemon mgr.np0005532581.sxlgsx from np0005532581.localdomain -- ports [9283, 8765] Nov 23 04:49:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:49:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:49:36 localhost systemd[1]: tmp-crun.A4uNnE.mount: Deactivated successfully. 
Nov 23 04:49:36 localhost podman[292620]: 2025-11-23 09:49:36.040261574 +0000 UTC m=+0.092641408 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 04:49:36 localhost podman[292620]: 2025-11-23 09:49:36.077262541 +0000 UTC m=+0.129642325 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:49:36 localhost podman[292619]: 2025-11-23 09:49:36.085065815 +0000 UTC m=+0.138384729 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 04:49:36 localhost podman[292619]: 2025-11-23 09:49:36.118348266 +0000 UTC m=+0.171667160 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 23 04:49:36 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:49:36 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:49:37 localhost ceph-mon[289735]: Removing key for mgr.np0005532581.sxlgsx Nov 23 04:49:37 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth rm", "entity": "mgr.np0005532581.sxlgsx"} : dispatch Nov 23 04:49:37 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005532581.sxlgsx"}]': finished Nov 23 04:49:37 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:37 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:38 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:38 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:38 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:38 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:49:38 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:38 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:49:39 localhost ceph-mon[289735]: Reconfiguring crash.np0005532581 (monmap changed)... 
Nov 23 04:49:39 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532581.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:49:39 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532581 on np0005532581.localdomain Nov 23 04:49:39 localhost nova_compute[281952]: 2025-11-23 09:49:39.707 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:49:39 localhost nova_compute[281952]: 2025-11-23 09:49:39.709 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:49:39 localhost nova_compute[281952]: 2025-11-23 09:49:39.710 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:49:39 localhost nova_compute[281952]: 2025-11-23 09:49:39.710 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:49:39 localhost nova_compute[281952]: 2025-11-23 09:49:39.751 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:49:39 localhost nova_compute[281952]: 2025-11-23 09:49:39.752 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0. 
Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.793841) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16 Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891379793879, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 462, "num_deletes": 256, "total_data_size": 303488, "memory_usage": 313512, "flush_reason": "Manual Compaction"} Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891379799087, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 194587, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11953, "largest_seqno": 12410, "table_properties": {"data_size": 191999, "index_size": 571, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7023, "raw_average_key_size": 19, "raw_value_size": 186389, "raw_average_value_size": 514, "num_data_blocks": 26, "num_entries": 362, "num_filter_entries": 362, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; 
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891373, "oldest_key_time": 1763891373, "file_creation_time": 1763891379, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}} Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 5266 microseconds, and 1025 cpu microseconds. Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.799110) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 194587 bytes OK Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.799128) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.800577) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.800591) EVENT_LOG_v1 {"time_micros": 1763891379800586, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.800607) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 300514, prev total WAL file size 300514, number of 
live WAL files 2. Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.801252) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353231' seq:72057594037927935, type:22 .. '6C6F676D0033373734' seq:0, type:0; will stop at (end) Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(190KB)], [15(11MB)] Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891379801324, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12322089, "oldest_snapshot_seqno": -1} Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 9401 keys, 12213836 bytes, temperature: kUnknown Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891379886734, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 12213836, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12159096, "index_size": 30127, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23557, "raw_key_size": 253102, "raw_average_key_size": 26, "raw_value_size": 11996589, "raw_average_value_size": 1276, 
"num_data_blocks": 1145, "num_entries": 9401, "num_filter_entries": 9401, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891318, "oldest_key_time": 0, "file_creation_time": 1763891379, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}} Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.887103) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 12213836 bytes Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.888855) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 144.1 rd, 142.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 11.6 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(126.1) write-amplify(62.8) OK, records in: 9935, records dropped: 534 output_compression: NoCompression Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.888884) EVENT_LOG_v1 {"time_micros": 1763891379888871, "job": 6, "event": "compaction_finished", "compaction_time_micros": 85522, "compaction_time_cpu_micros": 36448, "output_level": 6, "num_output_files": 1, "total_output_size": 12213836, "num_input_records": 9935, "num_output_records": 9401, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891379889068, "job": 6, "event": "table_file_deletion", "file_number": 17} Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891379890613, "job": 
6, "event": "table_file_deletion", "file_number": 15} Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.801113) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.890698) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.890707) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.890711) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.890714) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:49:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:49:39.890718) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:49:40 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:40 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:40 localhost ceph-mon[289735]: Reconfiguring crash.np0005532582 (monmap changed)... 
Nov 23 04:49:40 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532582.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:49:40 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532582 on np0005532582.localdomain Nov 23 04:49:40 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:40 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:41 localhost ceph-mon[289735]: Reconfiguring mon.np0005532582 (monmap changed)... Nov 23 04:49:41 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:49:41 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532582 on np0005532582.localdomain Nov 23 04:49:41 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:41 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:41 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532582.gilwrz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:49:41 localhost podman[240668]: time="2025-11-23T09:49:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:49:41 localhost podman[240668]: @ - - [23/Nov/2025:09:49:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 04:49:41 localhost podman[240668]: @ - - [23/Nov/2025:09:49:41 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18689 "" "Go-http-client/1.1" Nov 23 04:49:42 localhost sshd[292697]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:49:42 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532582.gilwrz (monmap changed)... Nov 23 04:49:42 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532582.gilwrz on np0005532582.localdomain Nov 23 04:49:42 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:42 localhost ceph-mon[289735]: Added label _no_schedule to host np0005532581.localdomain Nov 23 04:49:42 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:42 localhost ceph-mon[289735]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005532581.localdomain Nov 23 04:49:42 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:42 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:42 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:49:43 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:49:43 localhost ceph-mon[289735]: Reconfiguring mon.np0005532583 (monmap changed)... 
Nov 23 04:49:43 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532583 on np0005532583.localdomain Nov 23 04:49:43 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:43 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:43 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:43 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532583.orhywt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:49:44 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532583.orhywt (monmap changed)... Nov 23 04:49:44 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532583.orhywt on np0005532583.localdomain Nov 23 04:49:44 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:44 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:44 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532583.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:49:44 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:44 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain"} : dispatch Nov 23 04:49:44 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd='[{"prefix":"config-key 
del","key":"mgr/cephadm/host.np0005532581.localdomain"}]': finished Nov 23 04:49:44 localhost nova_compute[281952]: 2025-11-23 09:49:44.752 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:49:45 localhost ceph-mon[289735]: Reconfiguring crash.np0005532583 (monmap changed)... Nov 23 04:49:45 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain Nov 23 04:49:45 localhost ceph-mon[289735]: Removed host np0005532581.localdomain Nov 23 04:49:45 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:45 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:45 localhost ceph-mon[289735]: Reconfiguring crash.np0005532584 (monmap changed)... Nov 23 04:49:45 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:49:45 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain Nov 23 04:49:45 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:45 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:45 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 23 04:49:46 localhost ceph-mon[289735]: Reconfiguring osd.2 (monmap changed)... 
Nov 23 04:49:46 localhost ceph-mon[289735]: Reconfiguring daemon osd.2 on np0005532584.localdomain Nov 23 04:49:46 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:46 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:46 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 23 04:49:47 localhost ceph-mon[289735]: Reconfiguring osd.5 (monmap changed)... Nov 23 04:49:47 localhost ceph-mon[289735]: Reconfiguring daemon osd.5 on np0005532584.localdomain Nov 23 04:49:47 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:47 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:47 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:49:48 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:49:48 localhost ceph-mon[289735]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)... 
Nov 23 04:49:48 localhost ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain Nov 23 04:49:48 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:48 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:48 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:49:49 localhost nova_compute[281952]: 2025-11-23 09:49:49.754 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:49:49 localhost nova_compute[281952]: 2025-11-23 09:49:49.756 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:49:49 localhost nova_compute[281952]: 2025-11-23 09:49:49.756 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:49:49 localhost nova_compute[281952]: 2025-11-23 09:49:49.757 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:49:49 localhost nova_compute[281952]: 2025-11-23 09:49:49.794 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:49:49 localhost nova_compute[281952]: 2025-11-23 09:49:49.795 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:49:49 localhost 
ceph-mon[289735]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)... Nov 23 04:49:49 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain Nov 23 04:49:49 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:49 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:49 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:49:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:49:50 localhost podman[292699]: 2025-11-23 09:49:50.028062884 +0000 UTC m=+0.086102584 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:49:50 localhost podman[292699]: 2025-11-23 09:49:50.067355943 +0000 UTC m=+0.125395593 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3) Nov 23 04:49:50 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:49:50 localhost ceph-mon[289735]: Reconfiguring mon.np0005532584 (monmap changed)... Nov 23 04:49:50 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain Nov 23 04:49:50 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:50 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:50 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:49:51 localhost podman[292772]: Nov 23 04:49:51 localhost podman[292772]: 2025-11-23 09:49:51.053586052 +0000 UTC m=+0.080574471 container create 7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_babbage, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, io.k8s.display-name=Red Hat 
Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, GIT_BRANCH=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True, com.redhat.component=rhceph-container, version=7, vcs-type=git, name=rhceph, release=553, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:49:51 localhost systemd[1]: Started libpod-conmon-7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee.scope. Nov 23 04:49:51 localhost systemd[1]: Started libcrun container. Nov 23 04:49:51 localhost podman[292772]: 2025-11-23 09:49:51.02187244 +0000 UTC m=+0.048860849 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:49:51 localhost podman[292772]: 2025-11-23 09:49:51.132628203 +0000 UTC m=+0.159616622 container init 7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_babbage, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , architecture=x86_64, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:49:51 localhost podman[292772]: 2025-11-23 09:49:51.143216814 +0000 UTC m=+0.170205223 container start 7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_babbage, vcs-type=git, name=rhceph, distribution-scope=public, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=) Nov 23 04:49:51 localhost podman[292772]: 2025-11-23 09:49:51.143483783 +0000 UTC m=+0.170472192 container attach 7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_babbage, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
name=rhceph, com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-type=git, release=553, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.buildah.version=1.33.12, ceph=True, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55) Nov 23 04:49:51 localhost frosty_babbage[292787]: 167 167 Nov 23 04:49:51 localhost systemd[1]: libpod-7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee.scope: Deactivated successfully. 
Nov 23 04:49:51 localhost podman[292772]: 2025-11-23 09:49:51.148981505 +0000 UTC m=+0.175969984 container died 7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_babbage, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main) Nov 23 04:49:51 localhost podman[292792]: 2025-11-23 09:49:51.249465887 +0000 UTC m=+0.088362285 container remove 7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_babbage, GIT_BRANCH=main, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 23 04:49:51 localhost systemd[1]: libpod-conmon-7b040062c13b0936f9b0df854a532f1a563b7f53afdc3c4f1bbb798eb1b099ee.scope: Deactivated successfully. Nov 23 04:49:51 localhost ceph-mon[289735]: Reconfiguring crash.np0005532585 (monmap changed)... Nov 23 04:49:51 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain Nov 23 04:49:51 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:51 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:51 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 23 04:49:51 localhost podman[292862]: Nov 23 04:49:51 localhost podman[292862]: 2025-11-23 09:49:51.985268076 +0000 UTC m=+0.084092811 container create 4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_chatterjee, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, io.k8s.description=Red Hat Ceph 
Storage 7, io.buildah.version=1.33.12, release=553, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, name=rhceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=) Nov 23 04:49:52 localhost systemd[1]: Started libpod-conmon-4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8.scope. Nov 23 04:49:52 localhost systemd[1]: Started libcrun container. Nov 23 04:49:52 localhost podman[292862]: 2025-11-23 09:49:52.048252915 +0000 UTC m=+0.147077660 container init 4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_chatterjee, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, release=553, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , distribution-scope=public, version=7, vendor=Red Hat, Inc., vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, 
com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=) Nov 23 04:49:52 localhost podman[292862]: 2025-11-23 09:49:51.954426751 +0000 UTC m=+0.053251546 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:49:52 localhost podman[292862]: 2025-11-23 09:49:52.059354062 +0000 UTC m=+0.158178787 container start 4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_chatterjee, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, distribution-scope=public, version=7, RELEASE=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, architecture=x86_64) Nov 23 04:49:52 localhost podman[292862]: 2025-11-23 09:49:52.059986082 +0000 UTC m=+0.158810847 container attach 4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_chatterjee, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, build-date=2025-09-24T08:57:55) Nov 23 04:49:52 localhost trusting_chatterjee[292878]: 167 167 Nov 23 04:49:52 localhost podman[292862]: 2025-11-23 09:49:52.063291585 +0000 UTC m=+0.162116320 container died 4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_chatterjee, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vcs-type=git, release=553, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, RELEASE=main, name=rhceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, ceph=True, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_BRANCH=main, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:49:52 localhost systemd[1]: var-lib-containers-storage-overlay-11f724b00811be9a98ca4dfeb1b38b223dc8f6265d4b293a3cb4945f5db9990e-merged.mount: Deactivated successfully. Nov 23 04:49:52 localhost systemd[1]: libpod-4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8.scope: Deactivated successfully. Nov 23 04:49:52 localhost systemd[1]: var-lib-containers-storage-overlay-1f64b933e49d02a720f8facc951ee760d2a42e677556b3718415408ab9b3ce83-merged.mount: Deactivated successfully. Nov 23 04:49:52 localhost podman[292883]: 2025-11-23 09:49:52.153671262 +0000 UTC m=+0.082380418 container remove 4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_chatterjee, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, version=7, distribution-scope=public, release=553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., io.buildah.version=1.33.12) Nov 23 04:49:52 localhost systemd[1]: libpod-conmon-4f37382690a8e468fa0afe9cb87c8f6066be7d55b511aa0bf5b41a1bf50e60f8.scope: Deactivated successfully. Nov 23 04:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:49:52 localhost podman[292925]: 2025-11-23 09:49:52.594120034 +0000 UTC m=+0.088176869 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:49:52 localhost podman[292925]: 2025-11-23 09:49:52.609296608 +0000 UTC m=+0.103353443 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 
'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:49:52 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:49:52 localhost ceph-mon[289735]: Reconfiguring osd.0 (monmap changed)... Nov 23 04:49:52 localhost ceph-mon[289735]: Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:49:52 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:52 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:52 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 23 04:49:52 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:49:52 localhost podman[292982]: Nov 23 04:49:52 localhost podman[292982]: 2025-11-23 09:49:52.974772187 +0000 UTC m=+0.081836920 container create d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_vaughan, name=rhceph, architecture=x86_64, version=7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, 
com.redhat.component=rhceph-container, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main) Nov 23 04:49:53 localhost systemd[1]: Started libpod-conmon-d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e.scope. Nov 23 04:49:53 localhost systemd[1]: Started libcrun container. Nov 23 04:49:53 localhost podman[292982]: 2025-11-23 09:49:53.036776936 +0000 UTC m=+0.143841669 container init d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_vaughan, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_CLEAN=True, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, release=553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, architecture=x86_64, distribution-scope=public, vcs-type=git, version=7, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 04:49:53 localhost podman[292982]: 2025-11-23 09:49:52.940049431 +0000 UTC m=+0.047114224 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:49:53 localhost stupefied_vaughan[292997]: 167 167 Nov 23 04:49:53 localhost podman[292982]: 2025-11-23 09:49:53.04744629 +0000 UTC m=+0.154511033 container start d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_vaughan, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=) Nov 23 04:49:53 localhost podman[292982]: 2025-11-23 09:49:53.047873873 +0000 UTC m=+0.154938606 container attach d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_vaughan, distribution-scope=public, 
io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.buildah.version=1.33.12, name=rhceph, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.expose-services=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 04:49:53 localhost systemd[1]: libpod-d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e.scope: Deactivated successfully. 
Nov 23 04:49:53 localhost podman[292982]: 2025-11-23 09:49:53.05320574 +0000 UTC m=+0.160270503 container died d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_vaughan, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, architecture=x86_64, version=7, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, name=rhceph) Nov 23 04:49:53 localhost systemd[1]: var-lib-containers-storage-overlay-d02748f416b725da51fb02cec4b710af258b4841a547b29089a2777032038f6c-merged.mount: Deactivated successfully. 
Nov 23 04:49:53 localhost podman[293002]: 2025-11-23 09:49:53.137289749 +0000 UTC m=+0.075888484 container remove d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_vaughan, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, name=rhceph, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , ceph=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main) Nov 23 04:49:53 localhost systemd[1]: libpod-conmon-d6acf7fb14eddbeb9f322ac2acd0d6052fd064c41ea6e370f0f69bdce452200e.scope: Deactivated successfully. 
Nov 23 04:49:53 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:49:53 localhost podman[293080]: Nov 23 04:49:53 localhost podman[293080]: 2025-11-23 09:49:53.931815073 +0000 UTC m=+0.077951337 container create eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jennings, GIT_CLEAN=True, release=553, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-type=git, io.openshift.tags=rhceph ceph) Nov 23 04:49:53 localhost ceph-mon[289735]: Reconfiguring osd.3 (monmap changed)... 
Nov 23 04:49:53 localhost ceph-mon[289735]: Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 04:49:53 localhost ceph-mon[289735]: Saving service mon spec with placement label:mon
Nov 23 04:49:53 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:53 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:53 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 04:49:53 localhost systemd[1]: Started libpod-conmon-eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c.scope.
Nov 23 04:49:53 localhost systemd[1]: Started libcrun container.
Nov 23 04:49:53 localhost podman[293080]: 2025-11-23 09:49:53.989609711 +0000 UTC m=+0.135745985 container init eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jennings, distribution-scope=public, GIT_CLEAN=True, architecture=x86_64, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, vcs-type=git, com.redhat.component=rhceph-container, release=553, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 23 04:49:53 localhost podman[293080]: 2025-11-23 09:49:53.995515666 +0000 UTC m=+0.141651960 container start eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jennings, architecture=x86_64, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux )
Nov 23 04:49:53 localhost podman[293080]: 2025-11-23 09:49:53.995704311 +0000 UTC m=+0.141840615 container attach eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jennings, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, maintainer=Guillaume Abrioux , GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, version=7, vcs-type=git)
Nov 23 04:49:53 localhost jovial_jennings[293095]: 167 167
Nov 23 04:49:53 localhost systemd[1]: libpod-eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c.scope: Deactivated successfully.
Nov 23 04:49:53 localhost podman[293080]: 2025-11-23 09:49:53.998685735 +0000 UTC m=+0.144822029 container died eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jennings, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, architecture=x86_64, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 23 04:49:54 localhost podman[293080]: 2025-11-23 09:49:53.903764956 +0000 UTC m=+0.049901270 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 04:49:54 localhost systemd[1]: var-lib-containers-storage-overlay-f72e3ee1f1fa62996a99db9cdf51b9dd3291820925581936cff3c2d1b050f676-merged.mount: Deactivated successfully.
Nov 23 04:49:54 localhost podman[293100]: 2025-11-23 09:49:54.072047248 +0000 UTC m=+0.067203453 container remove eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jennings, GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 23 04:49:54 localhost systemd[1]: libpod-conmon-eb4c614399a5201b2c376d71326cc8cf35364fe27990c6ad7d97e6a8077a341c.scope: Deactivated successfully.
Nov 23 04:49:54 localhost podman[293169]:
Nov 23 04:49:54 localhost podman[293169]: 2025-11-23 09:49:54.695414981 +0000 UTC m=+0.056063934 container create 08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_lamarr, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, distribution-scope=public)
Nov 23 04:49:54 localhost systemd[1]: Started libpod-conmon-08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796.scope.
Nov 23 04:49:54 localhost systemd[1]: Started libcrun container.
Nov 23 04:49:54 localhost podman[293169]: 2025-11-23 09:49:54.766565496 +0000 UTC m=+0.127214499 container init 08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_lamarr, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , version=7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 04:49:54 localhost podman[293169]: 2025-11-23 09:49:54.669604355 +0000 UTC m=+0.030253318 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 04:49:54 localhost podman[293169]: 2025-11-23 09:49:54.773088141 +0000 UTC m=+0.133737094 container start 08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_lamarr, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, ceph=True, vcs-type=git, io.openshift.tags=rhceph ceph, version=7, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Guillaume Abrioux )
Nov 23 04:49:54 localhost podman[293169]: 2025-11-23 09:49:54.77340455 +0000 UTC m=+0.134053533 container attach 08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_lamarr, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, version=7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux )
Nov 23 04:49:54 localhost exciting_lamarr[293185]: 167 167
Nov 23 04:49:54 localhost systemd[1]: libpod-08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796.scope: Deactivated successfully.
Nov 23 04:49:54 localhost podman[293169]: 2025-11-23 09:49:54.776780816 +0000 UTC m=+0.137429769 container died 08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_lamarr, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, vcs-type=git, version=7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph, ceph=True, description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7)
Nov 23 04:49:54 localhost nova_compute[281952]: 2025-11-23 09:49:54.795 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:49:54 localhost nova_compute[281952]: 2025-11-23 09:49:54.797 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:49:54 localhost nova_compute[281952]: 2025-11-23 09:49:54.798 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 23 04:49:54 localhost nova_compute[281952]: 2025-11-23 09:49:54.798 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:49:54 localhost nova_compute[281952]: 2025-11-23 09:49:54.829 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:49:54 localhost nova_compute[281952]: 2025-11-23 09:49:54.831 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:49:54 localhost podman[293190]: 2025-11-23 09:49:54.866513662 +0000 UTC m=+0.078484566 container remove 08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_lamarr, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, ceph=True, com.redhat.component=rhceph-container, version=7, architecture=x86_64, name=rhceph, vcs-type=git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, CEPH_POINT_RELEASE=, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, RELEASE=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 04:49:54 localhost systemd[1]: libpod-conmon-08a48a242c7d7dd52c9ba936ab9c08cdc46a937cce7e04659f9cf23a1430e796.scope: Deactivated successfully.
Nov 23 04:49:54 localhost ceph-mon[289735]: Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)...
Nov 23 04:49:54 localhost ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain
Nov 23 04:49:54 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:54 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:54 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 04:49:54 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:49:55 localhost systemd[1]: var-lib-containers-storage-overlay-7ef813d07dc3ea023aa859f645b88f97377b354847b9b9f0c4ccee6c4745cc67-merged.mount: Deactivated successfully.
Nov 23 04:49:55 localhost ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb158f20 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Nov 23 04:49:55 localhost ceph-mon[289735]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 04:49:55 localhost ceph-mon[289735]: paxos.3).electionLogic(28) init, last seen epoch 28
Nov 23 04:49:55 localhost ceph-mon[289735]: mon.np0005532585@3(electing) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:49:55 localhost ceph-mon[289735]: mon.np0005532585@3(electing) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:49:59 localhost nova_compute[281952]: 2025-11-23 09:49:59.831 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:49:59 localhost nova_compute[281952]: 2025-11-23 09:49:59.833 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:49:59 localhost nova_compute[281952]: 2025-11-23 09:49:59.834 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 23 04:49:59 localhost nova_compute[281952]: 2025-11-23 09:49:59.834 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:49:59 localhost nova_compute[281952]: 2025-11-23 09:49:59.835 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:49:59 localhost nova_compute[281952]: 2025-11-23 09:49:59.838 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:49:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 04:49:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 04:49:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 04:49:59 localhost openstack_network_exporter[242668]: ERROR 09:49:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:49:59 localhost openstack_network_exporter[242668]: ERROR 09:49:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 04:49:59 localhost openstack_network_exporter[242668]: ERROR 09:49:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:49:59 localhost openstack_network_exporter[242668]: ERROR 09:49:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 04:49:59 localhost openstack_network_exporter[242668]:
Nov 23 04:49:59 localhost openstack_network_exporter[242668]: ERROR 09:49:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 04:49:59 localhost openstack_network_exporter[242668]:
Nov 23 04:50:00 localhost podman[293224]: 2025-11-23 09:50:00.049906167 +0000 UTC m=+0.098614674 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118)
Nov 23 04:50:00 localhost podman[293224]: 2025-11-23 09:50:00.084894152 +0000 UTC m=+0.133602659 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 23 04:50:00 localhost podman[293226]: 2025-11-23 09:50:00.097097943 +0000 UTC m=+0.144129987 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Nov 23 04:50:00 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully.
Nov 23 04:50:00 localhost podman[293226]: 2025-11-23 09:50:00.107305623 +0000 UTC m=+0.154337647 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm)
Nov 23 04:50:00 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 04:50:00 localhost ceph-mon[289735]: mon.np0005532585@3(electing) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:50:00 localhost ceph-mon[289735]: mon.np0005532585@3(electing) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:50:00 localhost podman[293225]: 2025-11-23 09:50:00.155410497 +0000 UTC m=+0.204360561 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 04:50:00 localhost ceph-mon[289735]: mon.np0005532585@3(peon) e8 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:50:00 localhost podman[293225]: 2025-11-23 09:50:00.186685765 +0000 UTC m=+0.235635899 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 23 04:50:00 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 04:50:00 localhost ceph-mon[289735]: Remove daemons mon.np0005532584
Nov 23 04:50:00 localhost ceph-mon[289735]: Safe to remove mon.np0005532584: new quorum should be ['np0005532583', 'np0005532582', 'np0005532586', 'np0005532585'] (from ['np0005532583', 'np0005532582', 'np0005532586', 'np0005532585'])
Nov 23 04:50:00 localhost ceph-mon[289735]: Removing monitor np0005532584 from monmap...
Nov 23 04:50:00 localhost ceph-mon[289735]: Removing daemon mon.np0005532584 from np0005532584.localdomain -- ports []
Nov 23 04:50:00 localhost ceph-mon[289735]: mon.np0005532583 calling monitor election
Nov 23 04:50:00 localhost ceph-mon[289735]: mon.np0005532586 calling monitor election
Nov 23 04:50:00 localhost ceph-mon[289735]: mon.np0005532582 calling monitor election
Nov 23 04:50:00 localhost ceph-mon[289735]: mon.np0005532585 calling monitor election
Nov 23 04:50:00 localhost ceph-mon[289735]: Reconfiguring crash.np0005532582 (monmap changed)...
Nov 23 04:50:00 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532582.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 04:50:00 localhost ceph-mon[289735]: mon.np0005532583 is new leader, mons np0005532583,np0005532586,np0005532585 in quorum (ranks 0,2,3)
Nov 23 04:50:00 localhost ceph-mon[289735]: overall HEALTH_OK
Nov 23 04:50:00 localhost ceph-mon[289735]: mon.np0005532583 calling monitor election
Nov 23 04:50:00 localhost ceph-mon[289735]: mon.np0005532583 is new leader, mons np0005532583,np0005532582,np0005532586,np0005532585 in quorum (ranks 0,1,2,3)
Nov 23 04:50:00 localhost ceph-mon[289735]: overall HEALTH_OK
Nov 23 04:50:00 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:50:01 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532582 on np0005532582.localdomain
Nov 23 04:50:01 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:50:01 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:50:01 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532582.gilwrz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 04:50:02 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532582.gilwrz (monmap changed)...
Nov 23 04:50:02 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532582.gilwrz on np0005532582.localdomain
Nov 23 04:50:02 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:50:02 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt'
Nov 23 04:50:02 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532583.orhywt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 04:50:03 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 04:50:03 localhost nova_compute[281952]: 2025-11-23 09:50:03.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:50:03 localhost nova_compute[281952]: 2025-11-23 09:50:03.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:50:03 localhost nova_compute[281952]: 2025-11-23 09:50:03.236 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 04:50:03 localhost nova_compute[281952]: 2025-11-23 09:50:03.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 04:50:03 localhost nova_compute[281952]: 2025-11-23 09:50:03.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 04:50:03 localhost nova_compute[281952]: 2025-11-23 09:50:03.237 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 04:50:03 localhost nova_compute[281952]: 2025-11-23 09:50:03.238 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 04:50:03 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532583.orhywt (monmap changed)...
Nov 23 04:50:03 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532583.orhywt on np0005532583.localdomain Nov 23 04:50:03 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:03 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:03 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532583.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:50:03 localhost ceph-mon[289735]: mon.np0005532585@3(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:50:03 localhost ceph-mon[289735]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4224794661' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:50:03 localhost nova_compute[281952]: 2025-11-23 09:50:03.681 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:50:03 localhost nova_compute[281952]: 2025-11-23 09:50:03.758 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:50:03 localhost nova_compute[281952]: 2025-11-23 09:50:03.759 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:50:03 localhost nova_compute[281952]: 2025-11-23 09:50:03.953 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:50:03 localhost nova_compute[281952]: 2025-11-23 09:50:03.953 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11740MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:50:03 localhost nova_compute[281952]: 2025-11-23 09:50:03.954 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:50:03 localhost nova_compute[281952]: 2025-11-23 09:50:03.954 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.009 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.010 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.010 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.047 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:50:04 localhost ceph-mon[289735]: Reconfiguring crash.np0005532583 (monmap changed)... Nov 23 04:50:04 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain Nov 23 04:50:04 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:04 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:04 localhost ceph-mon[289735]: Reconfiguring crash.np0005532584 (monmap changed)... 
Nov 23 04:50:04 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:50:04 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain Nov 23 04:50:04 localhost ceph-mon[289735]: mon.np0005532585@3(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:50:04 localhost ceph-mon[289735]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3085114768' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.460 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.466 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.492 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 
'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.494 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.495 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.870 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.872 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.873 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.873 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.874 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:04 localhost nova_compute[281952]: 2025-11-23 09:50:04.878 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:50:05 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:05 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:05 localhost ceph-mon[289735]: Reconfiguring osd.2 (monmap changed)... Nov 23 04:50:05 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 23 04:50:05 localhost ceph-mon[289735]: Reconfiguring daemon osd.2 on np0005532584.localdomain Nov 23 04:50:05 localhost nova_compute[281952]: 2025-11-23 09:50:05.495 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:50:05 localhost nova_compute[281952]: 2025-11-23 09:50:05.495 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:50:05 localhost nova_compute[281952]: 2025-11-23 09:50:05.496 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:50:06 localhost nova_compute[281952]: 2025-11-23 09:50:06.038 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock 
"refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:50:06 localhost nova_compute[281952]: 2025-11-23 09:50:06.038 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:50:06 localhost nova_compute[281952]: 2025-11-23 09:50:06.039 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:50:06 localhost nova_compute[281952]: 2025-11-23 09:50:06.039 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:50:06 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:06 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:06 localhost ceph-mon[289735]: Reconfiguring osd.5 (monmap changed)... 
Nov 23 04:50:06 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 23 04:50:06 localhost ceph-mon[289735]: Reconfiguring daemon osd.5 on np0005532584.localdomain Nov 23 04:50:06 localhost nova_compute[281952]: 2025-11-23 09:50:06.543 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:50:06 localhost nova_compute[281952]: 2025-11-23 09:50:06.559 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:50:06 localhost nova_compute[281952]: 2025-11-23 09:50:06.559 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:50:06 localhost nova_compute[281952]: 2025-11-23 09:50:06.560 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:50:06 localhost nova_compute[281952]: 2025-11-23 09:50:06.560 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:50:06 localhost nova_compute[281952]: 2025-11-23 09:50:06.560 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:50:06 localhost nova_compute[281952]: 2025-11-23 09:50:06.561 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:50:06 localhost nova_compute[281952]: 2025-11-23 09:50:06.561 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:50:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:50:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:50:07 localhost podman[293332]: 2025-11-23 09:50:07.010370605 +0000 UTC m=+0.069055641 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base 
Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:50:07 localhost podman[293332]: 2025-11-23 09:50:07.04634608 +0000 UTC m=+0.105031116 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2) Nov 23 04:50:07 localhost podman[293333]: 2025-11-23 09:50:07.083243474 +0000 UTC m=+0.135015003 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 04:50:07 localhost podman[293333]: 2025-11-23 09:50:07.088609562 +0000 UTC m=+0.140381051 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus 
Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:50:07 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 04:50:07 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:50:07 localhost nova_compute[281952]: 2025-11-23 09:50:07.275 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:50:07 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:07 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:07 localhost ceph-mon[289735]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)... Nov 23 04:50:07 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:50:07 localhost ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain Nov 23 04:50:07 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:07 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:07 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:50:08 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:50:08 localhost nova_compute[281952]: 2025-11-23 09:50:08.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task 
ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:50:08 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)... Nov 23 04:50:08 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain Nov 23 04:50:08 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:08 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:50:08 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:08 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:08 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:50:09 localhost podman[293427]: Nov 23 04:50:09 localhost podman[293427]: 2025-11-23 09:50:09.267349211 +0000 UTC m=+0.077862496 container create a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kare, release=553, version=7, com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., ceph=True, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-type=git) Nov 23 04:50:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:50:09.286 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:50:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:50:09.288 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:50:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:50:09.288 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:50:09 localhost systemd[1]: Started libpod-conmon-a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b.scope. Nov 23 04:50:09 localhost systemd[1]: Started libcrun container. 
Nov 23 04:50:09 localhost podman[293427]: 2025-11-23 09:50:09.233246264 +0000 UTC m=+0.043759539 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:50:09 localhost podman[293427]: 2025-11-23 09:50:09.343532323 +0000 UTC m=+0.154045558 container init a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kare, GIT_BRANCH=main, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , release=553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, version=7, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 04:50:09 localhost podman[293427]: 2025-11-23 09:50:09.355138136 +0000 UTC m=+0.165651371 container start a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kare, com.redhat.component=rhceph-container, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base 
image., distribution-scope=public, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.buildah.version=1.33.12, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:50:09 localhost podman[293427]: 2025-11-23 09:50:09.355538619 +0000 UTC m=+0.166051894 container attach a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kare, RELEASE=main, io.buildah.version=1.33.12, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux , distribution-scope=public) Nov 23 04:50:09 localhost 
modest_kare[293442]: 167 167 Nov 23 04:50:09 localhost systemd[1]: libpod-a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b.scope: Deactivated successfully. Nov 23 04:50:09 localhost podman[293427]: 2025-11-23 09:50:09.359586016 +0000 UTC m=+0.170099301 container died a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kare, name=rhceph, version=7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main) Nov 23 04:50:09 localhost podman[293447]: 2025-11-23 09:50:09.451277462 +0000 UTC m=+0.081944493 container remove a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kare, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , RELEASE=main, architecture=x86_64, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.openshift.expose-services=) Nov 23 04:50:09 localhost systemd[1]: libpod-conmon-a3057d3c74247e6f582f8f55308c700d52600c802b10ffd77a58ed0388d9fb7b.scope: Deactivated successfully. Nov 23 04:50:09 localhost ceph-mon[289735]: Deploying daemon mon.np0005532584 on np0005532584.localdomain Nov 23 04:50:09 localhost ceph-mon[289735]: Reconfiguring crash.np0005532585 (monmap changed)... 
Nov 23 04:50:09 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain Nov 23 04:50:09 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:09 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:09 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 23 04:50:09 localhost nova_compute[281952]: 2025-11-23 09:50:09.878 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:09 localhost nova_compute[281952]: 2025-11-23 09:50:09.880 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:09 localhost nova_compute[281952]: 2025-11-23 09:50:09.881 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:50:09 localhost nova_compute[281952]: 2025-11-23 09:50:09.881 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:09 localhost nova_compute[281952]: 2025-11-23 09:50:09.914 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:50:09 localhost nova_compute[281952]: 2025-11-23 09:50:09.915 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:10 localhost podman[293515]: Nov 23 04:50:10 localhost podman[293515]: 2025-11-23 09:50:10.15922551 +0000 UTC m=+0.075909735 
container create 673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_volhard, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, distribution-scope=public, release=553, io.buildah.version=1.33.12, version=7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , vcs-type=git, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 23 04:50:10 localhost systemd[1]: Started libpod-conmon-673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01.scope. Nov 23 04:50:10 localhost systemd[1]: Started libcrun container. 
Nov 23 04:50:10 localhost podman[293515]: 2025-11-23 09:50:10.221124946 +0000 UTC m=+0.137809141 container init 673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_volhard, maintainer=Guillaume Abrioux , RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, release=553, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vendor=Red Hat, Inc., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, distribution-scope=public, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 04:50:10 localhost podman[293515]: 2025-11-23 09:50:10.126977341 +0000 UTC m=+0.043661586 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:50:10 localhost podman[293515]: 2025-11-23 09:50:10.229055294 +0000 UTC m=+0.145739489 container start 673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_volhard, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, 
release=553, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, version=7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux ) Nov 23 04:50:10 localhost podman[293515]: 2025-11-23 09:50:10.229358483 +0000 UTC m=+0.146042678 container attach 673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_volhard, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, distribution-scope=public, CEPH_POINT_RELEASE=, name=rhceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph) Nov 23 04:50:10 
localhost mystifying_volhard[293530]: 167 167 Nov 23 04:50:10 localhost systemd[1]: libpod-673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01.scope: Deactivated successfully. Nov 23 04:50:10 localhost podman[293515]: 2025-11-23 09:50:10.231556932 +0000 UTC m=+0.148241167 container died 673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_volhard, distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, ceph=True, release=553, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:50:10 localhost systemd[1]: var-lib-containers-storage-overlay-06e50dcc3c8545d2341a09a6b04afb1395f1433750eac236ba0367b00b232f9a-merged.mount: Deactivated successfully. Nov 23 04:50:10 localhost systemd[1]: var-lib-containers-storage-overlay-b0e94a1798b46e1cee30b4fdbae8b573e0cc76171b7c9530d7aeacc5cd17ece5-merged.mount: Deactivated successfully. 
Nov 23 04:50:10 localhost podman[293535]: 2025-11-23 09:50:10.319338737 +0000 UTC m=+0.079013852 container remove 673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_volhard, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, version=7, architecture=x86_64, io.openshift.expose-services=, maintainer=Guillaume Abrioux , release=553, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container) Nov 23 04:50:10 localhost systemd[1]: libpod-conmon-673634d4ab7c1d6d4b70926a2445165d5399cb2bdd00072e108ccfde2b0b0f01.scope: Deactivated successfully. Nov 23 04:50:10 localhost ceph-mon[289735]: Reconfiguring osd.0 (monmap changed)... 
Nov 23 04:50:10 localhost ceph-mon[289735]: Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:50:10 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:10 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:10 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 23 04:50:11 localhost podman[293609]: Nov 23 04:50:11 localhost podman[293609]: 2025-11-23 09:50:11.149732454 +0000 UTC m=+0.066877173 container create 5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main, io.buildah.version=1.33.12, release=553, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, architecture=x86_64, GIT_CLEAN=True) Nov 23 04:50:11 localhost systemd[1]: Started libpod-conmon-5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b.scope. 
Nov 23 04:50:11 localhost systemd[1]: Started libcrun container. Nov 23 04:50:11 localhost podman[293609]: 2025-11-23 09:50:11.208489781 +0000 UTC m=+0.125634520 container init 5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, RELEASE=main, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, ceph=True, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553) Nov 23 04:50:11 localhost podman[293609]: 2025-11-23 09:50:11.117854146 +0000 UTC m=+0.034998905 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:50:11 localhost podman[293609]: 2025-11-23 09:50:11.218299507 +0000 UTC m=+0.135444246 container start 5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-type=git, release=553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, 
Inc., ceph=True, build-date=2025-09-24T08:57:55, name=rhceph, io.buildah.version=1.33.12, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.expose-services=) Nov 23 04:50:11 localhost podman[293609]: 2025-11-23 09:50:11.218577836 +0000 UTC m=+0.135722595 container attach 5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, architecture=x86_64, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:50:11 localhost busy_jennings[293624]: 167 167 Nov 23 04:50:11 localhost systemd[1]: libpod-5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b.scope: Deactivated successfully. Nov 23 04:50:11 localhost podman[293609]: 2025-11-23 09:50:11.221119485 +0000 UTC m=+0.138264254 container died 5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, release=553, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux , architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, distribution-scope=public) Nov 23 04:50:11 localhost systemd[1]: tmp-crun.SzEMlR.mount: Deactivated successfully. Nov 23 04:50:11 localhost systemd[1]: var-lib-containers-storage-overlay-0ef2e47773231d596c7cbcce8ce45a7a6cac7a559ab80c6d1847d2aa2e66e9fb-merged.mount: Deactivated successfully. 
Nov 23 04:50:11 localhost podman[293629]: 2025-11-23 09:50:11.318592484 +0000 UTC m=+0.086007721 container remove 5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, maintainer=Guillaume Abrioux , ceph=True, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 04:50:11 localhost systemd[1]: libpod-conmon-5da7ecbf42906adc67aa3fa7ff913f209365ca504709ed98b9cd6fac7d1f176b.scope: Deactivated successfully. Nov 23 04:50:11 localhost ceph-mon[289735]: Reconfiguring osd.3 (monmap changed)... 
Nov 23 04:50:11 localhost ceph-mon[289735]: Reconfiguring daemon osd.3 on np0005532585.localdomain Nov 23 04:50:11 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:11 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:11 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:50:11 localhost podman[240668]: time="2025-11-23T09:50:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:50:11 localhost podman[240668]: @ - - [23/Nov/2025:09:50:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 04:50:11 localhost podman[240668]: @ - - [23/Nov/2025:09:50:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18707 "" "Go-http-client/1.1" Nov 23 04:50:12 localhost ceph-mon[289735]: mon.np0005532585@3(peon) e8 adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints Nov 23 04:50:12 localhost ceph-mon[289735]: mon.np0005532585@3(peon) e8 adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints Nov 23 04:50:12 localhost podman[293703]: Nov 23 04:50:12 localhost podman[293703]: 2025-11-23 09:50:12.122530193 +0000 UTC m=+0.064303032 container create 5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_murdock, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, 
io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_CLEAN=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7) Nov 23 04:50:12 localhost systemd[1]: Started libpod-conmon-5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1.scope. Nov 23 04:50:12 localhost systemd[1]: Started libcrun container. 
Nov 23 04:50:12 localhost podman[293703]: 2025-11-23 09:50:12.182521048 +0000 UTC m=+0.124293887 container init 5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_murdock, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-type=git, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, name=rhceph, io.openshift.expose-services=, build-date=2025-09-24T08:57:55) Nov 23 04:50:12 localhost podman[293703]: 2025-11-23 09:50:12.191434927 +0000 UTC m=+0.133207726 container start 5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_murdock, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, description=Red Hat Ceph Storage 7, ceph=True, 
GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux , distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, version=7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:50:12 localhost podman[293703]: 2025-11-23 09:50:12.191648405 +0000 UTC m=+0.133421284 container attach 5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_murdock, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, release=553, CEPH_POINT_RELEASE=) Nov 23 04:50:12 localhost dreamy_murdock[293718]: 167 167 Nov 23 04:50:12 localhost systemd[1]: libpod-5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1.scope: 
Deactivated successfully. Nov 23 04:50:12 localhost podman[293703]: 2025-11-23 09:50:12.194942567 +0000 UTC m=+0.136715416 container died 5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_murdock, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vcs-type=git, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, release=553, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7) Nov 23 04:50:12 localhost podman[293703]: 2025-11-23 09:50:12.105142809 +0000 UTC m=+0.046915638 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:50:12 localhost podman[293723]: 2025-11-23 09:50:12.264863994 +0000 UTC m=+0.064472017 container remove 5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_murdock, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, version=7, com.redhat.component=rhceph-container, RELEASE=main, 
build-date=2025-09-24T08:57:55, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 04:50:12 localhost systemd[1]: libpod-conmon-5c8da07e906b59e89bedf91bafcd98121b5544212cd5c7f6426be3169b1f23b1.scope: Deactivated successfully. Nov 23 04:50:12 localhost ceph-mon[289735]: mon.np0005532585@3(peon) e8 adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints Nov 23 04:50:12 localhost ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb1591e0 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0 Nov 23 04:50:12 localhost ceph-mon[289735]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election Nov 23 04:50:12 localhost ceph-mon[289735]: paxos.3).electionLogic(34) init, last seen epoch 34 Nov 23 04:50:12 localhost ceph-mon[289735]: mon.np0005532585@3(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:50:12 localhost podman[293792]: Nov 23 04:50:12 localhost podman[293792]: 2025-11-23 09:50:12.932595844 +0000 UTC m=+0.081854321 container create d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_noether, release=553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, RELEASE=main, vcs-type=git) Nov 23 04:50:12 localhost systemd[1]: Started libpod-conmon-d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65.scope. Nov 23 04:50:12 localhost systemd[1]: Started libcrun container. 
Nov 23 04:50:12 localhost podman[293792]: 2025-11-23 09:50:12.900507301 +0000 UTC m=+0.049765788 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:50:13 localhost podman[293792]: 2025-11-23 09:50:13.007884658 +0000 UTC m=+0.157143135 container init d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_noether, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, ceph=True, GIT_CLEAN=True, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553) Nov 23 04:50:13 localhost podman[293792]: 2025-11-23 09:50:13.018016284 +0000 UTC m=+0.167274761 container start d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_noether, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., 
io.openshift.expose-services=, GIT_CLEAN=True, distribution-scope=public, architecture=x86_64, GIT_BRANCH=main, ceph=True, maintainer=Guillaume Abrioux , name=rhceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:50:13 localhost podman[293792]: 2025-11-23 09:50:13.018283084 +0000 UTC m=+0.167541611 container attach d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_noether, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, release=553, vendor=Red Hat, Inc., distribution-scope=public, version=7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, RELEASE=main, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main) Nov 23 
04:50:13 localhost loving_noether[293807]: 167 167 Nov 23 04:50:13 localhost systemd[1]: libpod-d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65.scope: Deactivated successfully. Nov 23 04:50:13 localhost podman[293792]: 2025-11-23 09:50:13.020783271 +0000 UTC m=+0.170041808 container died d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_noether, architecture=x86_64, vcs-type=git, release=553, ceph=True, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, maintainer=Guillaume Abrioux , name=rhceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main) Nov 23 04:50:13 localhost podman[293812]: 2025-11-23 09:50:13.114609535 +0000 UTC m=+0.081441877 container remove d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_noether, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, build-date=2025-09-24T08:57:55, RELEASE=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., release=553, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, ceph=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=) Nov 23 04:50:13 localhost systemd[1]: libpod-conmon-d1c77cc49a37712834240caaed693d67e00eb3ea2ab8eb632c2ac734d8011a65.scope: Deactivated successfully. Nov 23 04:50:13 localhost systemd[1]: var-lib-containers-storage-overlay-3115e936c8354cf2c48908567ba22294542a7e35221659601213687c3198cf86-merged.mount: Deactivated successfully. 
Nov 23 04:50:14 localhost nova_compute[281952]: 2025-11-23 09:50:14.915 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:15 localhost sshd[293828]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:50:17 localhost ceph-mds[287052]: mds.beacon.mds.np0005532585.jcltnl missed beacon ack from the monitors Nov 23 04:50:17 localhost ceph-mon[289735]: paxos.3).electionLogic(35) init, last seen epoch 35, mid-election, bumping Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532585@3(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532585@3(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532585@3(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532585@3(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532585@3(peon) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:50:17 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)... 
Nov 23 04:50:17 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532583 calling monitor election Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532582 calling monitor election Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532586 calling monitor election Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532585 calling monitor election Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532584 calling monitor election Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532583 is new leader, mons np0005532583,np0005532582,np0005532586 in quorum (ranks 0,1,2) Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532586 calling monitor election Nov 23 04:50:17 localhost ceph-mon[289735]: overall HEALTH_OK Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532583 calling monitor election Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532582 calling monitor election Nov 23 04:50:17 localhost ceph-mon[289735]: mon.np0005532583 is new leader, mons np0005532583,np0005532582,np0005532586,np0005532585,np0005532584 in quorum (ranks 0,1,2,3,4) Nov 23 04:50:17 localhost ceph-mon[289735]: overall HEALTH_OK Nov 23 04:50:17 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:17 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:17 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:50:18 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:50:18 localhost ceph-mon[289735]: 
Reconfiguring crash.np0005532586 (monmap changed)... Nov 23 04:50:18 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain Nov 23 04:50:19 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:19 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:19 localhost ceph-mon[289735]: Reconfiguring osd.1 (monmap changed)... Nov 23 04:50:19 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 23 04:50:19 localhost ceph-mon[289735]: Reconfiguring daemon osd.1 on np0005532586.localdomain Nov 23 04:50:19 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:19 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:19 localhost nova_compute[281952]: 2025-11-23 09:50:19.920 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:19 localhost nova_compute[281952]: 2025-11-23 09:50:19.922 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:19 localhost nova_compute[281952]: 2025-11-23 09:50:19.922 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:50:19 localhost nova_compute[281952]: 2025-11-23 09:50:19.922 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:19 localhost nova_compute[281952]: 2025-11-23 09:50:19.951 281956 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:50:19 localhost nova_compute[281952]: 2025-11-23 09:50:19.952 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:20 localhost ceph-mon[289735]: Reconfiguring osd.4 (monmap changed)... Nov 23 04:50:20 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 23 04:50:20 localhost ceph-mon[289735]: Reconfiguring daemon osd.4 on np0005532586.localdomain Nov 23 04:50:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:50:21 localhost podman[293830]: 2025-11-23 09:50:21.027361977 +0000 UTC m=+0.081318073 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 23 04:50:21 localhost podman[293830]: 2025-11-23 09:50:21.037955818 +0000 UTC m=+0.091911924 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:50:21 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:50:21 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:21 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:21 localhost ceph-mon[289735]: Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)... 
Nov 23 04:50:21 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:50:21 localhost ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain Nov 23 04:50:21 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:21 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:21 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:50:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 04:50:22 localhost podman[293866]: 2025-11-23 09:50:22.781355455 +0000 UTC m=+0.080316623 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:50:22 localhost podman[293866]: 2025-11-23 09:50:22.791551373 +0000 UTC m=+0.090512511 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:50:22 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:50:22 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532586.thmvqb (monmap changed)... Nov 23 04:50:22 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain Nov 23 04:50:22 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:22 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:23 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:50:24 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:24 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:24 localhost nova_compute[281952]: 2025-11-23 09:50:24.952 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:24 localhost nova_compute[281952]: 2025-11-23 09:50:24.954 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:24 localhost nova_compute[281952]: 2025-11-23 09:50:24.954 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:50:24 localhost nova_compute[281952]: 2025-11-23 09:50:24.954 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering 
IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:24 localhost nova_compute[281952]: 2025-11-23 09:50:24.993 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:50:24 localhost nova_compute[281952]: 2025-11-23 09:50:24.993 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:25 localhost ceph-mon[289735]: Reconfig service osd.default_drive_group Nov 23 04:50:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:25 localhost ceph-mon[289735]: from='mgr.14190 
172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:25 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:50:26 localhost ceph-mon[289735]: mon.np0005532585@3(peon) e9 handle_command mon_command({"prefix": "mgr fail"} v 0) Nov 23 04:50:26 localhost ceph-mon[289735]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/3357125401' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 23 04:50:26 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e83 e83: 6 total, 6 up, 6 in Nov 23 04:50:26 localhost systemd[1]: session-64.scope: Deactivated successfully. Nov 23 04:50:26 localhost systemd[1]: session-64.scope: Consumed 21.368s CPU time. Nov 23 04:50:26 localhost systemd-logind[761]: Session 64 logged out. Waiting for processes to exit. Nov 23 04:50:26 localhost systemd-logind[761]: Removed session 64. Nov 23 04:50:26 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:26 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:26 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:26 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:26 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:26 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:50:26 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' 
entity='mgr.np0005532583.orhywt' Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:26 localhost ceph-mon[289735]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 23 04:50:26 localhost ceph-mon[289735]: from='client.? 172.18.0.200:0/3357125401' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 23 04:50:26 localhost ceph-mon[289735]: Activating manager daemon np0005532582.gilwrz Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:26 localhost ceph-mon[289735]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14190 172.18.0.105:0/1209981642' entity='mgr.np0005532583.orhywt' Nov 23 04:50:26 localhost ceph-mon[289735]: Manager daemon np0005532582.gilwrz is now available Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain.devices.0"} : dispatch Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain.devices.0"} : dispatch Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain.devices.0"}]': finished Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain.devices.0"} : dispatch Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain.devices.0"} : dispatch Nov 23 04:50:26 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532581.localdomain.devices.0"}]': finished Nov 23 04:50:27 localhost sshd[294258]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:50:27 localhost systemd-logind[761]: New session 65 of user ceph-admin. Nov 23 04:50:27 localhost systemd[1]: Started Session 65 of User ceph-admin. 
Nov 23 04:50:27 localhost ceph-mon[289735]: removing stray HostCache host record np0005532581.localdomain.devices.0 Nov 23 04:50:27 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532582.gilwrz/mirror_snapshot_schedule"} : dispatch Nov 23 04:50:27 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532582.gilwrz/mirror_snapshot_schedule"} : dispatch Nov 23 04:50:27 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532582.gilwrz/trash_purge_schedule"} : dispatch Nov 23 04:50:27 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532582.gilwrz/trash_purge_schedule"} : dispatch Nov 23 04:50:28 localhost podman[294373]: 2025-11-23 09:50:28.106686148 +0000 UTC m=+0.065670245 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, vendor=Red Hat, Inc., release=553, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vcs-type=git, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, build-date=2025-09-24T08:57:55, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 
in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64) Nov 23 04:50:28 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:50:28 localhost podman[294373]: 2025-11-23 09:50:28.206595342 +0000 UTC m=+0.165579449 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vendor=Red Hat, Inc., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_CLEAN=True) Nov 23 04:50:28 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:28 localhost ceph-mon[289735]: from='mgr.14196 ' 
entity='mgr.np0005532582.gilwrz' Nov 23 04:50:28 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:28 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:28 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:29 localhost openstack_network_exporter[242668]: ERROR 09:50:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:50:29 localhost openstack_network_exporter[242668]: ERROR 09:50:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:50:29 localhost openstack_network_exporter[242668]: ERROR 09:50:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:50:29 localhost openstack_network_exporter[242668]: ERROR 09:50:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:50:29 localhost openstack_network_exporter[242668]: Nov 23 04:50:29 localhost openstack_network_exporter[242668]: ERROR 09:50:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:50:29 localhost openstack_network_exporter[242668]: Nov 23 04:50:29 localhost ceph-mon[289735]: [23/Nov/2025:09:50:28] ENGINE Bus STARTING Nov 23 04:50:29 localhost ceph-mon[289735]: [23/Nov/2025:09:50:28] ENGINE Serving on https://172.18.0.104:7150 Nov 23 04:50:29 localhost ceph-mon[289735]: [23/Nov/2025:09:50:28] ENGINE Client ('172.18.0.104', 33588) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Nov 23 04:50:29 localhost ceph-mon[289735]: [23/Nov/2025:09:50:28] ENGINE Serving on http://172.18.0.104:8765 Nov 23 04:50:29 localhost ceph-mon[289735]: [23/Nov/2025:09:50:28] ENGINE Bus STARTED Nov 23 04:50:29 localhost ceph-mon[289735]: from='mgr.14196 ' 
entity='mgr.np0005532582.gilwrz' Nov 23 04:50:29 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:29 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:29 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:29 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:29 localhost nova_compute[281952]: 2025-11-23 09:50:29.994 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:50:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:50:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 04:50:30 localhost podman[294637]: 2025-11-23 09:50:30.902612077 +0000 UTC m=+0.100621638 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 23 04:50:30 localhost podman[294638]: 2025-11-23 09:50:30.958980599 +0000 UTC m=+0.156539726 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 23 04:50:30 localhost podman[294637]: 2025-11-23 09:50:30.992337263 +0000 UTC m=+0.190346834 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller) Nov 23 04:50:31 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:50:31 localhost systemd[1]: tmp-crun.iGezYM.mount: Deactivated successfully. 
Nov 23 04:50:31 localhost podman[294639]: 2025-11-23 09:50:31.01912489 +0000 UTC m=+0.213100045 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 23 04:50:31 localhost podman[294638]: 2025-11-23 09:50:31.043649357 +0000 UTC m=+0.241208514 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent) Nov 23 04:50:31 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:50:31 localhost podman[294639]: 2025-11-23 09:50:31.065400207 +0000 UTC m=+0.259375372 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.) 
Nov 23 04:50:31 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd/host:np0005532583", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd/host:np0005532583", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: Adjusting osd_memory_target on np0005532585.localdomain to 836.6M Nov 23 04:50:31 localhost ceph-mon[289735]: Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 23 
04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd/host:np0005532582", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: Adjusting osd_memory_target on np0005532586.localdomain to 836.6M Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd/host:np0005532582", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:31 localhost ceph-mon[289735]: 
from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: Adjusting osd_memory_target on np0005532584.localdomain to 836.6M Nov 23 04:50:31 localhost ceph-mon[289735]: Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 23 04:50:31 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:50:31 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:31 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:31 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:31 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:31 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:32 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:50:32 localhost ceph-mon[289735]: Updating 
np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:50:32 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:50:32 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:50:32 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:50:32 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:50:33 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:50:33 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:50:33 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:50:33 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:50:33 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:50:33 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:50:33 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:50:33 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:50:33 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:50:33 localhost ceph-mon[289735]: Updating 
np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:50:33 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:33 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:33 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:34 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:34 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:34 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:34 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:34 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:34 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:34 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:34 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:34 localhost ceph-mon[289735]: Reconfiguring crash.np0005532582 (monmap changed)... 
Nov 23 04:50:34 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532582.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:50:34 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532582.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:50:34 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532582 on np0005532582.localdomain Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0. Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.871676) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19 Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891434871720, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2648, "num_deletes": 255, "total_data_size": 8169082, "memory_usage": 8738464, "flush_reason": "Manual Compaction"} Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891434898863, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 4911183, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12415, "largest_seqno": 15058, 
"table_properties": {"data_size": 4900044, "index_size": 6876, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3205, "raw_key_size": 28832, "raw_average_key_size": 22, "raw_value_size": 4875651, "raw_average_value_size": 3854, "num_data_blocks": 297, "num_entries": 1265, "num_filter_entries": 1265, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891379, "oldest_key_time": 1763891379, "file_creation_time": 1763891434, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}} Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 27267 microseconds, and 6918 cpu microseconds. Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.898940) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 4911183 bytes OK Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.898965) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.900446) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.900462) EVENT_LOG_v1 {"time_micros": 1763891434900457, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.900481) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 8156195, prev total WAL file size 8156195, number of live WAL files 2. Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.901612) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. 
'7061786F73003130373934' seq:0, type:0; will stop at (end) Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(4796KB)], [18(11MB)] Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891434901641, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17125019, "oldest_snapshot_seqno": -1} Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 10114 keys, 15126695 bytes, temperature: kUnknown Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891434975365, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 15126695, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15066385, "index_size": 33905, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25349, "raw_key_size": 270406, "raw_average_key_size": 26, "raw_value_size": 14890600, "raw_average_value_size": 1472, "num_data_blocks": 1306, "num_entries": 10114, "num_filter_entries": 10114, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891318, "oldest_key_time": 0, "file_creation_time": 1763891434, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}} Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.975682) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 15126695 bytes Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.981044) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 231.9 rd, 204.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.7, 11.6 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(6.6) write-amplify(3.1) OK, records in: 10666, records dropped: 552 output_compression: NoCompression Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.981079) EVENT_LOG_v1 {"time_micros": 1763891434981064, "job": 8, "event": "compaction_finished", "compaction_time_micros": 73838, "compaction_time_cpu_micros": 21403, "output_level": 6, "num_output_files": 1, "total_output_size": 15126695, "num_input_records": 10666, "num_output_records": 10114, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005532585/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891434981928, "job": 8, "event": "table_file_deletion", "file_number": 20} Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891434983725, "job": 8, "event": "table_file_deletion", "file_number": 18} Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.901541) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.983788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.984075) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.984082) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.984085) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:50:34 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:34.984088) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:50:34 localhost nova_compute[281952]: 2025-11-23 09:50:34.996 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:34 localhost nova_compute[281952]: 2025-11-23 09:50:34.998 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:34 localhost nova_compute[281952]: 2025-11-23 09:50:34.998 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:50:34 localhost nova_compute[281952]: 2025-11-23 09:50:34.998 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:35 localhost nova_compute[281952]: 2025-11-23 09:50:35.025 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:50:35 localhost nova_compute[281952]: 2025-11-23 09:50:35.026 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:35 localhost ceph-mon[289735]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Nov 23 04:50:35 localhost ceph-mon[289735]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Nov 23 04:50:35 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:35 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:35 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532582.gilwrz (monmap changed)... 
Nov 23 04:50:35 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532582.gilwrz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:50:35 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532582.gilwrz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:50:35 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532582.gilwrz on np0005532582.localdomain Nov 23 04:50:36 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:36 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:36 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532583.orhywt (monmap changed)... Nov 23 04:50:36 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532583.orhywt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:50:36 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532583.orhywt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:50:36 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532583.orhywt on np0005532583.localdomain Nov 23 04:50:36 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:36 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:36 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532583.localdomain", "caps": ["mon", "profile crash", 
"mgr", "profile crash"]} : dispatch Nov 23 04:50:36 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532583.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:50:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:50:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:50:38 localhost ceph-mon[289735]: Reconfiguring crash.np0005532583 (monmap changed)... Nov 23 04:50:38 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain Nov 23 04:50:38 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:38 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:38 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:38 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:50:38 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:50:38 localhost podman[295339]: 2025-11-23 09:50:38.037090927 +0000 UTC m=+0.085130567 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:50:38 localhost podman[295339]: 2025-11-23 09:50:38.050247583 +0000 UTC m=+0.098287233 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', 
'--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:50:38 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 04:50:38 localhost podman[295338]: 2025-11-23 09:50:38.016311956 +0000 UTC m=+0.070113124 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 23 04:50:38 localhost podman[295338]: 2025-11-23 09:50:38.100413111 +0000 UTC m=+0.154214239 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 04:50:38 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:50:38 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:50:39 localhost ceph-mon[289735]: Reconfiguring crash.np0005532584 (monmap changed)... Nov 23 04:50:39 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain Nov 23 04:50:39 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:39 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:39 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0. 
Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.856744) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22 Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891439856811, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 459, "num_deletes": 254, "total_data_size": 475331, "memory_usage": 486184, "flush_reason": "Manual Compaction"} Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891439860555, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 290260, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15063, "largest_seqno": 15517, "table_properties": {"data_size": 287465, "index_size": 842, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6400, "raw_average_key_size": 18, "raw_value_size": 281734, "raw_average_value_size": 795, "num_data_blocks": 33, "num_entries": 354, "num_filter_entries": 354, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; 
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891435, "oldest_key_time": 1763891435, "file_creation_time": 1763891439, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}} Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 3859 microseconds, and 1773 cpu microseconds. Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.860605) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 290260 bytes OK Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.860631) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.862091) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.862114) EVENT_LOG_v1 {"time_micros": 1763891439862107, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.862134) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 472392, prev total WAL file size 472392, number of 
live WAL files 2. Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.862932) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303130' seq:72057594037927935, type:22 .. '6B760031323635' seq:0, type:0; will stop at (end) Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(283KB)], [21(14MB)] Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891439863010, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 15416955, "oldest_snapshot_seqno": -1} Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 9939 keys, 14425287 bytes, temperature: kUnknown Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891439940422, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 14425287, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14367161, "index_size": 32169, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24901, "raw_key_size": 268375, "raw_average_key_size": 27, "raw_value_size": 14195328, "raw_average_value_size": 1428, 
"num_data_blocks": 1215, "num_entries": 9939, "num_filter_entries": 9939, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891318, "oldest_key_time": 0, "file_creation_time": 1763891439, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.940988) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 14425287 bytes Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.942636) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 198.4 rd, 185.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 14.4 +0.0 blob) out(13.8 +0.0 blob), read-write-amplify(102.8) write-amplify(49.7) OK, records in: 10468, records dropped: 529 output_compression: NoCompression Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.942668) EVENT_LOG_v1 {"time_micros": 1763891439942654, "job": 10, "event": "compaction_finished", "compaction_time_micros": 77712, "compaction_time_cpu_micros": 41715, "output_level": 6, "num_output_files": 1, "total_output_size": 14425287, "num_input_records": 10468, "num_output_records": 9939, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891439942856, "job": 10, "event": "table_file_deletion", "file_number": 23} Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891439945203, 
"job": 10, "event": "table_file_deletion", "file_number": 21} Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.862790) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.945253) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.945258) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.945261) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.945263) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:50:39 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:50:39.945266) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:50:40 localhost nova_compute[281952]: 2025-11-23 09:50:40.026 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:40 localhost nova_compute[281952]: 2025-11-23 09:50:40.028 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:40 localhost nova_compute[281952]: 2025-11-23 09:50:40.028 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:50:40 localhost nova_compute[281952]: 2025-11-23 09:50:40.029 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:40 localhost nova_compute[281952]: 2025-11-23 09:50:40.065 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:50:40 localhost nova_compute[281952]: 2025-11-23 09:50:40.065 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:40 localhost ceph-mon[289735]: Reconfiguring osd.2 (monmap changed)... Nov 23 04:50:40 localhost ceph-mon[289735]: Reconfiguring daemon osd.2 on np0005532584.localdomain Nov 23 04:50:40 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:40 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:40 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:40 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:40 localhost ceph-mon[289735]: Reconfiguring osd.5 (monmap changed)... Nov 23 04:50:40 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 23 04:50:40 localhost ceph-mon[289735]: Reconfiguring daemon osd.5 on np0005532584.localdomain Nov 23 04:50:41 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:41 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:41 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:41 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:41 localhost ceph-mon[289735]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)... 
Nov 23 04:50:41 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:50:41 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:50:41 localhost ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain Nov 23 04:50:41 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:41 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:41 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:41 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:50:41 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:50:41 localhost podman[240668]: time="2025-11-23T09:50:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:50:41 localhost podman[240668]: @ - - [23/Nov/2025:09:50:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 04:50:41 localhost podman[240668]: @ - - [23/Nov/2025:09:50:41 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18701 "" "Go-http-client/1.1" Nov 23 04:50:42 localhost ceph-mon[289735]: Saving service mon spec with placement label:mon Nov 23 04:50:42 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)... Nov 23 04:50:42 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain Nov 23 04:50:42 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:42 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:42 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:50:43 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:50:43 localhost ceph-mon[289735]: Reconfiguring mon.np0005532584 (monmap changed)... 
Nov 23 04:50:43 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain Nov 23 04:50:43 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:43 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:43 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:50:43 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:50:43 localhost podman[295432]: Nov 23 04:50:43 localhost podman[295432]: 2025-11-23 09:50:43.957810066 +0000 UTC m=+0.063935554 container create 9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_lovelace, RELEASE=main, maintainer=Guillaume Abrioux , GIT_CLEAN=True, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, release=553, 
distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64) Nov 23 04:50:43 localhost systemd[1]: Started libpod-conmon-9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f.scope. Nov 23 04:50:44 localhost systemd[1]: Started libcrun container. Nov 23 04:50:44 localhost podman[295432]: 2025-11-23 09:50:43.925231871 +0000 UTC m=+0.031357329 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:50:44 localhost podman[295432]: 2025-11-23 09:50:44.028865688 +0000 UTC m=+0.134991116 container init 9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_lovelace, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, RELEASE=main, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, architecture=x86_64, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, release=553, vcs-type=git, ceph=True) Nov 23 04:50:44 localhost systemd[1]: tmp-crun.w25BR2.mount: Deactivated successfully. 
Nov 23 04:50:44 localhost podman[295432]: 2025-11-23 09:50:44.043965064 +0000 UTC m=+0.150090482 container start 9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_lovelace, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, maintainer=Guillaume Abrioux , release=553, ceph=True, GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 23 04:50:44 localhost podman[295432]: 2025-11-23 09:50:44.044949114 +0000 UTC m=+0.151074562 container attach 9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_lovelace, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_BRANCH=main, release=553, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, 
com.redhat.component=rhceph-container, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:50:44 localhost laughing_lovelace[295448]: 167 167 Nov 23 04:50:44 localhost systemd[1]: libpod-9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f.scope: Deactivated successfully. Nov 23 04:50:44 localhost podman[295432]: 2025-11-23 09:50:44.051118734 +0000 UTC m=+0.157244232 container died 9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_lovelace, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, RELEASE=main, vendor=Red Hat, 
Inc., version=7, ceph=True) Nov 23 04:50:44 localhost podman[295453]: 2025-11-23 09:50:44.152617955 +0000 UTC m=+0.089152090 container remove 9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_lovelace, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, maintainer=Guillaume Abrioux , architecture=x86_64, build-date=2025-09-24T08:57:55, vcs-type=git, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, release=553) Nov 23 04:50:44 localhost systemd[1]: libpod-conmon-9d6f7f9458aaa026f1506edbb8da0208f36744f2900a80f3e05104abc72e2d1f.scope: Deactivated successfully. Nov 23 04:50:44 localhost ceph-mon[289735]: Reconfiguring crash.np0005532585 (monmap changed)... 
Nov 23 04:50:44 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain Nov 23 04:50:44 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:44 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:44 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 23 04:50:44 localhost podman[295523]: Nov 23 04:50:44 localhost podman[295523]: 2025-11-23 09:50:44.814190586 +0000 UTC m=+0.065110320 container create 6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_liskov, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, name=rhceph, CEPH_POINT_RELEASE=, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, distribution-scope=public, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, release=553) Nov 23 04:50:44 localhost systemd[1]: Started libpod-conmon-6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef.scope. Nov 23 04:50:44 localhost systemd[1]: Started libcrun container. 
Nov 23 04:50:44 localhost podman[295523]: 2025-11-23 09:50:44.882272166 +0000 UTC m=+0.133191910 container init 6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_liskov, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, release=553, RELEASE=main, ceph=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.buildah.version=1.33.12, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, version=7, io.openshift.expose-services=) Nov 23 04:50:44 localhost podman[295523]: 2025-11-23 09:50:44.78319702 +0000 UTC m=+0.034116774 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:50:44 localhost podman[295523]: 2025-11-23 09:50:44.894879405 +0000 UTC m=+0.145799139 container start 6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_liskov, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, maintainer=Guillaume Abrioux , name=rhceph, io.openshift.expose-services=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55) Nov 23 04:50:44 localhost podman[295523]: 2025-11-23 09:50:44.895148134 +0000 UTC m=+0.146067888 container attach 6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_liskov, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.33.12, name=rhceph, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, vcs-type=git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:50:44 
localhost nifty_liskov[295539]: 167 167 Nov 23 04:50:44 localhost systemd[1]: libpod-6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef.scope: Deactivated successfully. Nov 23 04:50:44 localhost podman[295523]: 2025-11-23 09:50:44.897756114 +0000 UTC m=+0.148675878 container died 6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_liskov, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., release=553, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 04:50:44 localhost systemd[1]: var-lib-containers-storage-overlay-917432b2277fe1b1fa97d837ab7ed86a34cd6fd4d8ff5631b4888c907f975a36-merged.mount: Deactivated successfully. Nov 23 04:50:44 localhost systemd[1]: var-lib-containers-storage-overlay-dd0aba95c9dc7660b2050c506df3a775fc72ef170499f6c3180ebd13d2879ebe-merged.mount: Deactivated successfully. 
Nov 23 04:50:44 localhost podman[295544]: 2025-11-23 09:50:44.998914365 +0000 UTC m=+0.088689747 container remove 6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_liskov, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., distribution-scope=public, GIT_BRANCH=main, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, architecture=x86_64, build-date=2025-09-24T08:57:55, ceph=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=) Nov 23 04:50:45 localhost systemd[1]: libpod-conmon-6e60904f008ad3b195e9e729fd8f07ec55748cb7795d95176ffee97dab8eb3ef.scope: Deactivated successfully. 
Nov 23 04:50:45 localhost nova_compute[281952]: 2025-11-23 09:50:45.065 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:50:45 localhost podman[295622]: Nov 23 04:50:45 localhost podman[295622]: 2025-11-23 09:50:45.824250637 +0000 UTC m=+0.067498823 container create 41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_sanderson, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_BRANCH=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, version=7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:50:45 localhost ceph-mon[289735]: Reconfiguring osd.0 (monmap changed)... 
Nov 23 04:50:45 localhost ceph-mon[289735]: Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:50:45 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:45 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:45 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:45 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:45 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 23 04:50:45 localhost systemd[1]: Started libpod-conmon-41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d.scope. Nov 23 04:50:45 localhost systemd[1]: Started libcrun container. Nov 23 04:50:45 localhost podman[295622]: 2025-11-23 09:50:45.788186524 +0000 UTC m=+0.031434840 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:50:45 localhost podman[295622]: 2025-11-23 09:50:45.896372412 +0000 UTC m=+0.139620598 container init 41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_sanderson, version=7, architecture=x86_64, io.openshift.expose-services=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , name=rhceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main) Nov 23 04:50:45 localhost podman[295622]: 2025-11-23 09:50:45.903643437 +0000 UTC m=+0.146891613 container start 41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_sanderson, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_CLEAN=True, version=7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux , io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-type=git, distribution-scope=public) Nov 23 04:50:45 localhost podman[295622]: 2025-11-23 09:50:45.903925016 +0000 UTC m=+0.147173252 container attach 41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_sanderson, description=Red Hat Ceph Storage 7, release=553, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, 
io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, vcs-type=git, RELEASE=main, distribution-scope=public, GIT_BRANCH=main) Nov 23 04:50:45 localhost friendly_sanderson[295637]: 167 167 Nov 23 04:50:45 localhost systemd[1]: libpod-41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d.scope: Deactivated successfully. 
Nov 23 04:50:45 localhost podman[295622]: 2025-11-23 09:50:45.906595198 +0000 UTC m=+0.149843394 container died 41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_sanderson, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux , version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, distribution-scope=public, GIT_BRANCH=main, release=553, io.buildah.version=1.33.12, GIT_CLEAN=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph) Nov 23 04:50:45 localhost systemd[1]: var-lib-containers-storage-overlay-478051fc5f52bdfc203655a6c47efb3e04afb9e5da28239d050065c7dbd05235-merged.mount: Deactivated successfully. 
Nov 23 04:50:46 localhost podman[295642]: 2025-11-23 09:50:46.002853907 +0000 UTC m=+0.087575562 container remove 41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_sanderson, distribution-scope=public, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, architecture=x86_64, vendor=Red Hat, Inc., ceph=True, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=) Nov 23 04:50:46 localhost systemd[1]: libpod-conmon-41efd900f7c07fec4e18e618ce67be892bbb8f137a70eb7e77f3b72964626e2d.scope: Deactivated successfully. 
Nov 23 04:50:46 localhost podman[295720]: Nov 23 04:50:46 localhost podman[295720]: 2025-11-23 09:50:46.81297057 +0000 UTC m=+0.062662614 container create 9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_benz, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=553, RELEASE=main, version=7, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 04:50:46 localhost systemd[1]: Started libpod-conmon-9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a.scope. Nov 23 04:50:46 localhost systemd[1]: Started libcrun container. Nov 23 04:50:46 localhost ceph-mon[289735]: Reconfiguring osd.3 (monmap changed)... 
Nov 23 04:50:46 localhost ceph-mon[289735]: Reconfiguring daemon osd.3 on np0005532585.localdomain Nov 23 04:50:46 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:46 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:46 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:46 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:46 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:50:46 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:50:46 localhost podman[295720]: 2025-11-23 09:50:46.879936536 +0000 UTC m=+0.129628580 container init 9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_benz, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.33.12, vcs-type=git, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, version=7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat 
Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main) Nov 23 04:50:46 localhost podman[295720]: 2025-11-23 09:50:46.890017537 +0000 UTC m=+0.139709551 container start 9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_benz, vcs-type=git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, release=553, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:50:46 localhost podman[295720]: 2025-11-23 09:50:46.890242384 +0000 UTC m=+0.139934428 container attach 9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_benz, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., 
build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, RELEASE=main, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, release=553) Nov 23 04:50:46 localhost sharp_benz[295735]: 167 167 Nov 23 04:50:46 localhost systemd[1]: libpod-9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a.scope: Deactivated successfully. 
Nov 23 04:50:46 localhost podman[295720]: 2025-11-23 09:50:46.793488079 +0000 UTC m=+0.043180113 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:50:46 localhost podman[295720]: 2025-11-23 09:50:46.892985549 +0000 UTC m=+0.142677603 container died 9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_benz, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, ceph=True) Nov 23 04:50:46 localhost systemd[1]: var-lib-containers-storage-overlay-5a97c8649ddad270819df4c60c1a350127d60bddbf3b62ad28ce97e19a5a9013-merged.mount: Deactivated successfully. 
Nov 23 04:50:46 localhost podman[295740]: 2025-11-23 09:50:46.988349581 +0000 UTC m=+0.084846479 container remove 9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_benz, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, RELEASE=main, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , distribution-scope=public, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, name=rhceph, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 23 04:50:46 localhost systemd[1]: libpod-conmon-9185c66721650058eea89df7377c1c221eb6fb38ac03304dbfded8bcfbbe9d1a.scope: Deactivated successfully. 
Nov 23 04:50:47 localhost podman[295809]: Nov 23 04:50:47 localhost podman[295809]: 2025-11-23 09:50:47.685506919 +0000 UTC m=+0.074979475 container create e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mirzakhani, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, version=7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main) Nov 23 04:50:47 localhost systemd[1]: Started libpod-conmon-e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec.scope. Nov 23 04:50:47 localhost systemd[1]: Started libcrun container. 
Nov 23 04:50:47 localhost podman[295809]: 2025-11-23 09:50:47.741071972 +0000 UTC m=+0.130544538 container init e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mirzakhani, io.openshift.expose-services=, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, release=553, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, name=rhceph, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_BRANCH=main) Nov 23 04:50:47 localhost podman[295809]: 2025-11-23 09:50:47.749813613 +0000 UTC m=+0.139286179 container start e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mirzakhani, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc.) Nov 23 04:50:47 localhost podman[295809]: 2025-11-23 09:50:47.750137562 +0000 UTC m=+0.139610118 container attach e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mirzakhani, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux , RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True) Nov 23 04:50:47 localhost practical_mirzakhani[295824]: 167 167 Nov 23 04:50:47 localhost systemd[1]: 
libpod-e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec.scope: Deactivated successfully. Nov 23 04:50:47 localhost podman[295809]: 2025-11-23 09:50:47.752432042 +0000 UTC m=+0.141904618 container died e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mirzakhani, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, name=rhceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, RELEASE=main, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , version=7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, release=553, vendor=Red Hat, Inc.) 
Nov 23 04:50:47 localhost podman[295809]: 2025-11-23 09:50:47.654727479 +0000 UTC m=+0.044200065 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:50:47 localhost podman[295829]: 2025-11-23 09:50:47.848159967 +0000 UTC m=+0.082482597 container remove e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mirzakhani, ceph=True, com.redhat.component=rhceph-container, distribution-scope=public, io.buildah.version=1.33.12, version=7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 23 04:50:47 localhost systemd[1]: libpod-conmon-e172850c7f74bc9a306bb1ada88e94a5dd81f27c4be5c2a3d6efb1fa223ff2ec.scope: Deactivated successfully. Nov 23 04:50:47 localhost ceph-mon[289735]: Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)... 
Nov 23 04:50:47 localhost ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain Nov 23 04:50:47 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:47 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:47 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:50:47 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:50:47 localhost systemd[1]: var-lib-containers-storage-overlay-d3572bb347915371c7593debeb562cd783f888af3a100444ab4b71a2119ae1a9-merged.mount: Deactivated successfully. 
Nov 23 04:50:48 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:50:48 localhost sshd[295882]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:50:48 localhost podman[295901]: Nov 23 04:50:48 localhost podman[295901]: 2025-11-23 09:50:48.572612156 +0000 UTC m=+0.078216144 container create cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_almeida, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, RELEASE=main, name=rhceph, version=7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, ceph=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, distribution-scope=public) Nov 23 04:50:48 localhost systemd[1]: Started libpod-conmon-cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a.scope. Nov 23 04:50:48 localhost systemd[1]: Started libcrun container. 
Nov 23 04:50:48 localhost podman[295901]: 2025-11-23 09:50:48.638926172 +0000 UTC m=+0.144530120 container init cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_almeida, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, release=553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, version=7, GIT_CLEAN=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:50:48 localhost podman[295901]: 2025-11-23 09:50:48.541669641 +0000 UTC m=+0.047273629 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:50:48 localhost podman[295901]: 2025-11-23 09:50:48.647746814 +0000 UTC m=+0.153350772 container start cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_almeida, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux , GIT_CLEAN=True, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, release=553, distribution-scope=public) Nov 23 04:50:48 localhost podman[295901]: 2025-11-23 09:50:48.648083464 +0000 UTC m=+0.153687472 container attach cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_almeida, description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, version=7, release=553, maintainer=Guillaume Abrioux ) Nov 23 04:50:48 
localhost romantic_almeida[295916]: 167 167 Nov 23 04:50:48 localhost systemd[1]: libpod-cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a.scope: Deactivated successfully. Nov 23 04:50:48 localhost podman[295901]: 2025-11-23 09:50:48.652512411 +0000 UTC m=+0.158116709 container died cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_almeida, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, ceph=True, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.openshift.expose-services=, version=7, release=553) Nov 23 04:50:48 localhost podman[295921]: 2025-11-23 09:50:48.766504467 +0000 UTC m=+0.098151249 container remove cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_almeida, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, release=553, vendor=Red Hat, Inc., version=7, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-type=git) Nov 23 04:50:48 localhost systemd[1]: libpod-conmon-cf3222dab7dab73f6c812d1e6e62c24d6f6c82ee5b7732efc3c95135535dc39a.scope: Deactivated successfully. Nov 23 04:50:48 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)... Nov 23 04:50:48 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain Nov 23 04:50:48 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:48 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:48 localhost ceph-mon[289735]: Reconfiguring mon.np0005532585 (monmap changed)... 
Nov 23 04:50:48 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:50:48 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532585 on np0005532585.localdomain Nov 23 04:50:48 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:48 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:48 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 23 04:50:48 localhost systemd[1]: var-lib-containers-storage-overlay-e12745678485b3c6ddaf9e0b8d28ee823299482ab090dcc5737606e5e23bdca7-merged.mount: Deactivated successfully. Nov 23 04:50:49 localhost ceph-mon[289735]: Reconfiguring daemon osd.1 on np0005532586.localdomain Nov 23 04:50:50 localhost nova_compute[281952]: 2025-11-23 09:50:50.067 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:50 localhost nova_compute[281952]: 2025-11-23 09:50:50.070 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:50 localhost nova_compute[281952]: 2025-11-23 09:50:50.070 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:50:50 localhost nova_compute[281952]: 2025-11-23 09:50:50.071 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:50 localhost nova_compute[281952]: 2025-11-23 09:50:50.113 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:50:50 localhost nova_compute[281952]: 2025-11-23 09:50:50.114 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:50 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:50 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:50 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:50 localhost ceph-mon[289735]: from='mgr.14196 ' entity='mgr.np0005532582.gilwrz' Nov 23 04:50:50 localhost ceph-mon[289735]: from='mgr.14196 172.18.0.104:0/3739487529' entity='mgr.np0005532582.gilwrz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 23 04:50:51 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e84 e84: 6 total, 6 up, 6 in Nov 23 04:50:51 localhost systemd[1]: session-65.scope: Deactivated successfully. Nov 23 04:50:51 localhost systemd[1]: session-65.scope: Consumed 9.604s CPU time. Nov 23 04:50:51 localhost systemd-logind[761]: Session 65 logged out. Waiting for processes to exit. Nov 23 04:50:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:50:51 localhost systemd-logind[761]: Removed session 65. 
Nov 23 04:50:51 localhost podman[295938]: 2025-11-23 09:50:51.216772059 +0000 UTC m=+0.064218442 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm) Nov 23 04:50:51 localhost podman[295938]: 2025-11-23 09:50:51.227593013 +0000 UTC m=+0.075039386 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 23 04:50:51 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 04:50:51 localhost sshd[295958]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:50:51 localhost systemd-logind[761]: New session 66 of user ceph-admin. Nov 23 04:50:51 localhost systemd[1]: Started Session 66 of User ceph-admin. Nov 23 04:50:51 localhost ceph-mon[289735]: Reconfiguring daemon osd.4 on np0005532586.localdomain Nov 23 04:50:51 localhost ceph-mon[289735]: from='client.? 172.18.0.200:0/3363667457' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 23 04:50:51 localhost ceph-mon[289735]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 23 04:50:51 localhost ceph-mon[289735]: Activating manager daemon np0005532584.naxwxy Nov 23 04:50:51 localhost ceph-mon[289735]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 23 04:50:51 localhost ceph-mon[289735]: Manager daemon np0005532584.naxwxy is now available Nov 23 04:50:51 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532584.naxwxy/mirror_snapshot_schedule"} : dispatch Nov 23 04:50:51 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532584.naxwxy/trash_purge_schedule"} : dispatch Nov 23 04:50:52 localhost systemd[1]: tmp-crun.0L7NXx.mount: Deactivated successfully. 
Nov 23 04:50:52 localhost podman[296071]: 2025-11-23 09:50:52.520401497 +0000 UTC m=+0.098439888 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, com.redhat.component=rhceph-container, release=553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, maintainer=Guillaume Abrioux , architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., ceph=True, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:50:52 localhost podman[296071]: 2025-11-23 09:50:52.648027855 +0000 UTC m=+0.226066246 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, architecture=x86_64, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, maintainer=Guillaume Abrioux , 
RELEASE=main, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, version=7, ceph=True, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph) Nov 23 04:50:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:50:52 localhost podman[296140]: 2025-11-23 09:50:52.977455468 +0000 UTC m=+0.072769096 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:50:53 localhost podman[296140]: 2025-11-23 09:50:53.064526285 +0000 UTC m=+0.159840003 container exec_died 
1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:50:53 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. 
Nov 23 04:50:53 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:50:54 localhost ceph-mon[289735]: [23/Nov/2025:09:50:52] ENGINE Bus STARTING Nov 23 04:50:54 localhost ceph-mon[289735]: [23/Nov/2025:09:50:52] ENGINE Serving on https://172.18.0.106:7150 Nov 23 04:50:54 localhost ceph-mon[289735]: [23/Nov/2025:09:50:52] ENGINE Client ('172.18.0.106', 58208) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Nov 23 04:50:54 localhost ceph-mon[289735]: [23/Nov/2025:09:50:52] ENGINE Serving on http://172.18.0.106:8765 Nov 23 04:50:54 localhost ceph-mon[289735]: [23/Nov/2025:09:50:52] ENGINE Bus STARTED Nov 23 04:50:54 localhost ceph-mon[289735]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Nov 23 04:50:54 localhost ceph-mon[289735]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm) Nov 23 04:50:54 localhost ceph-mon[289735]: Cluster is now healthy Nov 23 04:50:54 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:54 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:54 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:54 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:54 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:54 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:54 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' 
entity='mgr.np0005532584.naxwxy' Nov 23 04:50:54 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:54 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:54 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:55 localhost nova_compute[281952]: 2025-11-23 09:50:55.115 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:55 localhost nova_compute[281952]: 2025-11-23 09:50:55.117 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:50:55 localhost nova_compute[281952]: 2025-11-23 09:50:55.117 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:50:55 localhost nova_compute[281952]: 2025-11-23 09:50:55.117 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:55 localhost nova_compute[281952]: 2025-11-23 09:50:55.158 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:50:55 localhost nova_compute[281952]: 2025-11-23 09:50:55.159 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:55 localhost 
ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd/host:np0005532582", "name": "osd_memory_target"} : dispatch Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd/host:np0005532583", "name": "osd_memory_target"} : dispatch Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 23 04:50:55 localhost ceph-mon[289735]: 
Adjusting osd_memory_target on np0005532585.localdomain to 836.6M Nov 23 04:50:55 localhost ceph-mon[289735]: Adjusting osd_memory_target on np0005532586.localdomain to 836.6M Nov 23 04:50:55 localhost ceph-mon[289735]: Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 23 04:50:55 localhost ceph-mon[289735]: Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 23 04:50:55 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:50:55 localhost ceph-mon[289735]: Adjusting osd_memory_target on np0005532584.localdomain to 836.6M Nov 23 04:50:55 localhost ceph-mon[289735]: Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 23 04:50:55 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:55 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:55 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:55 localhost 
ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:55 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf Nov 23 04:50:56 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:50:56 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:50:56 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:50:56 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:50:56 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:50:57 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:50:57 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:50:57 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:50:57 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:50:57 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:50:57 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:50:57 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:50:57 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:50:57 localhost ceph-mon[289735]: 
Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:50:57 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:50:57 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:57 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:57 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:57 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:57 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:57 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:57 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:57 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:57 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:57 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:57 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:57 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:50:58 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:50:58 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' 
entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:50:58 localhost ceph-mon[289735]: Reconfiguring mon.np0005532582 (monmap changed)... Nov 23 04:50:58 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532582 on np0005532582.localdomain Nov 23 04:50:59 localhost openstack_network_exporter[242668]: ERROR 09:50:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:50:59 localhost openstack_network_exporter[242668]: ERROR 09:50:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:50:59 localhost openstack_network_exporter[242668]: ERROR 09:50:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:50:59 localhost openstack_network_exporter[242668]: ERROR 09:50:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:50:59 localhost openstack_network_exporter[242668]: Nov 23 04:50:59 localhost openstack_network_exporter[242668]: ERROR 09:50:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:50:59 localhost openstack_network_exporter[242668]: Nov 23 04:51:00 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:00 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:00 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:51:00 localhost ceph-mon[289735]: Reconfiguring mon.np0005532583 (monmap changed)... 
Nov 23 04:51:00 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532583 on np0005532583.localdomain Nov 23 04:51:00 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:00 localhost nova_compute[281952]: 2025-11-23 09:51:00.158 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:51:00 localhost nova_compute[281952]: 2025-11-23 09:51:00.161 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:51:01 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:01 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 23 04:51:01 localhost ceph-mon[289735]: Reconfiguring daemon osd.4 on np0005532586.localdomain Nov 23 04:51:01 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:01 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:01 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:01 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:01 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:51:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:51:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:51:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:51:02 localhost systemd[1]: tmp-crun.kVJ3Zd.mount: Deactivated successfully. Nov 23 04:51:02 localhost podman[296994]: 2025-11-23 09:51:02.027551929 +0000 UTC m=+0.076574553 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., version=9.6) Nov 23 04:51:02 localhost podman[296994]: 2025-11-23 09:51:02.06515066 +0000 UTC m=+0.114173334 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, version=9.6, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.) Nov 23 04:51:02 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:51:02 localhost podman[296992]: 2025-11-23 09:51:02.067305886 +0000 UTC m=+0.118692273 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS) Nov 23 04:51:02 localhost ceph-mon[289735]: Reconfiguring mon.np0005532586 (monmap changed)... 
Nov 23 04:51:02 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532586 on np0005532586.localdomain Nov 23 04:51:02 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:02 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:02 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:02 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:51:02 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:02 localhost podman[296992]: 2025-11-23 09:51:02.152310239 +0000 UTC m=+0.203696636 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 23 04:51:02 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:51:02 localhost podman[296993]: 2025-11-23 09:51:02.122720676 +0000 UTC m=+0.173464573 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 04:51:02 localhost podman[296993]: 2025-11-23 09:51:02.202088524 +0000 UTC m=+0.252832371 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent) Nov 23 04:51:02 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:51:03 localhost ceph-mon[289735]: mon.np0005532585@3(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:51:03 localhost nova_compute[281952]: 2025-11-23 09:51:03.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:51:05 localhost nova_compute[281952]: 2025-11-23 09:51:05.162 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:51:05 localhost nova_compute[281952]: 2025-11-23 09:51:05.206 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:51:05 localhost nova_compute[281952]: 2025-11-23 09:51:05.207 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5045 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:51:05 localhost nova_compute[281952]: 2025-11-23 09:51:05.207 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:51:05 localhost nova_compute[281952]: 2025-11-23 09:51:05.208 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:51:05 localhost nova_compute[281952]: 2025-11-23 09:51:05.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:51:05 localhost nova_compute[281952]: 2025-11-23 09:51:05.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:51:05 localhost nova_compute[281952]: 2025-11-23 09:51:05.213 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:51:05 localhost nova_compute[281952]: 2025-11-23 09:51:05.213 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:51:06 localhost nova_compute[281952]: 2025-11-23 09:51:06.057 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:51:06 localhost nova_compute[281952]: 2025-11-23 09:51:06.057 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:51:06 localhost nova_compute[281952]: 2025-11-23 09:51:06.058 281956 DEBUG nova.network.neutron [None 
req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:51:06 localhost nova_compute[281952]: 2025-11-23 09:51:06.058 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.064 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.092 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.093 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.093 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.093 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.093 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.094 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.115 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.116 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.116 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.116 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 
09:51:07.116 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:51:07 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:07 localhost ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb158f20 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0 Nov 23 04:51:07 localhost ceph-mon[289735]: mon.np0005532585@3(peon) e10 my rank is now 2 (was 3) Nov 23 04:51:07 localhost ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0 Nov 23 04:51:07 localhost ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0 Nov 23 04:51:07 localhost ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb1591e0 mon_map magic: 0 from mon.2 v2:172.18.0.107:3300/0 Nov 23 04:51:07 localhost ceph-mon[289735]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election Nov 23 04:51:07 localhost ceph-mon[289735]: paxos.2).electionLogic(40) init, last seen epoch 40 Nov 23 04:51:07 localhost ceph-mon[289735]: mon.np0005532585@2(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:51:07 localhost ceph-mon[289735]: mon.np0005532585@2(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.556 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.621 281956 DEBUG 
nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.622 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.776 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.776 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11693MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", 
"numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.777 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.777 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:51:07 localhost 
nova_compute[281952]: 2025-11-23 09:51:07.852 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.852 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.853 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:51:07 localhost nova_compute[281952]: 2025-11-23 09:51:07.895 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:51:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:51:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:51:09 localhost podman[297106]: 2025-11-23 09:51:09.015070857 +0000 UTC m=+0.069099822 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 04:51:09 localhost podman[297107]: 2025-11-23 09:51:09.029381099 +0000 UTC m=+0.076424809 container health_status 
a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 04:51:09 localhost podman[297107]: 2025-11-23 09:51:09.040161571 +0000 UTC m=+0.087205271 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', 
'--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:51:09 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:51:09 localhost podman[297106]: 2025-11-23 09:51:09.054338368 +0000 UTC m=+0.108367363 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 04:51:09 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:51:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:51:09.287 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:51:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:51:09.288 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:51:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:51:09.288 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:51:09 localhost ceph-mon[289735]: mon.np0005532585@2(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:51:09 localhost ceph-mon[289735]: mon.np0005532585@2(peon) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:51:09 localhost ceph-mon[289735]: Remove daemons mon.np0005532582 Nov 23 04:51:09 localhost ceph-mon[289735]: Safe to remove mon.np0005532582: new quorum should be ['np0005532583', 'np0005532586', 'np0005532585', 'np0005532584'] (from ['np0005532583', 'np0005532586', 'np0005532585', 'np0005532584']) Nov 23 04:51:09 localhost ceph-mon[289735]: Removing monitor np0005532582 from monmap... 
Nov 23 04:51:09 localhost ceph-mon[289735]: Removing daemon mon.np0005532582 from np0005532582.localdomain -- ports [] Nov 23 04:51:09 localhost ceph-mon[289735]: mon.np0005532586 calling monitor election Nov 23 04:51:09 localhost ceph-mon[289735]: mon.np0005532584 calling monitor election Nov 23 04:51:09 localhost ceph-mon[289735]: mon.np0005532585 calling monitor election Nov 23 04:51:09 localhost ceph-mon[289735]: mon.np0005532583 calling monitor election Nov 23 04:51:09 localhost ceph-mon[289735]: mon.np0005532583 is new leader, mons np0005532583,np0005532586,np0005532585,np0005532584 in quorum (ranks 0,1,2,3) Nov 23 04:51:09 localhost ceph-mon[289735]: overall HEALTH_OK Nov 23 04:51:09 localhost ceph-mon[289735]: mon.np0005532585@2(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:51:09 localhost ceph-mon[289735]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1036816168' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:51:09 localhost nova_compute[281952]: 2025-11-23 09:51:09.775 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.880s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:51:09 localhost nova_compute[281952]: 2025-11-23 09:51:09.783 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:51:09 localhost nova_compute[281952]: 2025-11-23 09:51:09.808 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory 
data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:51:09 localhost nova_compute[281952]: 2025-11-23 09:51:09.811 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:51:09 localhost nova_compute[281952]: 2025-11-23 09:51:09.811 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:51:09 localhost nova_compute[281952]: 2025-11-23 09:51:09.932 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:51:09 localhost nova_compute[281952]: 2025-11-23 09:51:09.932 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:51:09 localhost nova_compute[281952]: 2025-11-23 09:51:09.956 281956 DEBUG oslo_service.periodic_task [None 
req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:51:09 localhost nova_compute[281952]: 2025-11-23 09:51:09.956 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:51:09 localhost nova_compute[281952]: 2025-11-23 09:51:09.956 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:51:10 localhost nova_compute[281952]: 2025-11-23 09:51:10.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:51:10 localhost nova_compute[281952]: 2025-11-23 09:51:10.211 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:51:10 localhost nova_compute[281952]: 2025-11-23 09:51:10.212 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:51:10 localhost nova_compute[281952]: 2025-11-23 09:51:10.212 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:51:10 localhost nova_compute[281952]: 2025-11-23 09:51:10.250 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:51:10 localhost 
nova_compute[281952]: 2025-11-23 09:51:10.251 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.805 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.806 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.807 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.807 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.821 12 DEBUG ceilometer.compute.pollsters [-] 
355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.822 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69a573f0-6b66-4e6b-aaa1-8ff41655e050', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.807355', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0e95f5a-c851-11f0-bde4-fa163e72a351', 
'monotonic_time': 11413.984983742, 'message_signature': 'e6a50e1018370596dedffe2c113df05fe7208268793dfef62a4129a91249c8e5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.807355', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0e9777e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11413.984983742, 'message_signature': '1dcee196c66de7a06e246a5b2ee1574f4c1de7b49ea619097a3937abc7bf4ce7'}]}, 'timestamp': '2025-11-23 09:51:10.822696', '_unique_id': '7986bf046f74410389bafba07e51a27e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 
23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.824 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.826 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify 
/usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.826 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.830 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '601f3146-abd9-4c30-a6ff-e604269168b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.826658', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0eabe54-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': '2d140033d85caac5eeda3009d8da6a7ebf3cc9a1a4e6be8c80add60d3c4620d8'}]}, 'timestamp': '2025-11-23 09:51:10.831050', '_unique_id': '1e186175852348c6b393bb6dfb97e480'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 
04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging 
self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging return 
self._send(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging return 
self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.832 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.833 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.863 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.863 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49d9d6eb-ff91-479e-b86d-e2008edc835c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.834138', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0efc1ce-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': '789b10708935340e399dd8b339824807c5d50cc339d35d5c9a069be8ea592dd3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.834138', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0efd830-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': 'e6501661dc2c62852b6eb01e3f40f317bdc046ffa477c9e7e61bf17390fe501b'}]}, 'timestamp': '2025-11-23 09:51:10.864446', '_unique_id': '50cf2f7612cc4e4487dd264676fa5e22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.865 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.867 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.887 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29d0cf6d-2c5f-49ae-9611-21d712cd0824', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:51:10.867554', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'f0f3673e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.064666941, 'message_signature': '5fd6a625a714766fb5af35958e01826ed84f1f66d715d703051d9f30272046c1'}]}, 'timestamp': '2025-11-23 09:51:10.887772', '_unique_id': '427437a78f2c4ba89f4226a2538b8217'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.889 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.890 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.890 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d3daf3d-02d0-4c94-9be7-9d09469f96e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.890500', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f3e5ba-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': '3fd6da6fe57d1c2c205c5b41cfe50687265e5f574b8aa6c3a0542b8ecee1a28b'}]}, 'timestamp': '2025-11-23 09:51:10.891023', '_unique_id': 'f69ae16018224fb2b8000299f694c9e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.892 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.893 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.893 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'f5356448-ed75-4964-8d49-ffe452c81782', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.893471', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f45982-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': 'a537bdf488741e5b91e5a9bdcdcd1298329f89e6edaa0e37edd66bb2c552e1ff'}]}, 'timestamp': '2025-11-23 09:51:10.893984', '_unique_id': '66e6e1302a014f9984d90b9e211c47b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.894 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.896 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.896 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.896 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65ab2133-5371-48ee-b85c-2e0c058ef6bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.896183', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0f4c4da-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': 'b584ea59c482d52b152a8dc7e4301b591172990599571d9f6885fb80ba3e5a28'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.896183', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0f4d8da-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': '856ee90b8dc8ecaef7626af316036de1c8638559e519919d37c451d3a0a5189d'}]}, 'timestamp': '2025-11-23 09:51:10.897249', '_unique_id': 'fe8a8f05fba4454793fb4d2a281e7767'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.898 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.899 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.899 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69a42a68-b7c7-4dbe-953b-ad5d39a7221b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.899733', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f54f7c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': 'eba635c764846702664af2ef3f4456754261356fa082ca05b6ee245f85834123'}]}, 'timestamp': '2025-11-23 09:51:10.900245', '_unique_id': '38fd3d0b2cb743f0bf6d75f4c220bdc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '694340f5-9741-4816-a880-29e01367cf19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.902654', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0f5c29a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': '6f8acb5223d71c3aa7e3c7941c262417a32a2923bf6362e5a67aae096f77cae3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.902654', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0f5d3de-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': 'd933e89c9767edf91c786bf178d9a5a9db4e09c3823ffb87f8c8e67fdac743df'}]}, 'timestamp': '2025-11-23 09:51:10.903603', '_unique_id': 'ebf351737b17480b844e05173bd963e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.905 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.905 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '747aa262-5fff-4914-8617-1f217c61f493', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.905854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0f63e3c-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11413.984983742, 'message_signature': '3a63689698f9ce479e5479887afcefaa7cbde51d5b6aba5f5d350d16db12301c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': 
'2025-11-23T09:51:10.905854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0f65070-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11413.984983742, 'message_signature': '1e28cc3e66e5cca0fcb7d542fb5884412fbc3725ebe63e7ee2755b64b5f88ac6'}]}, 'timestamp': '2025-11-23 09:51:10.906795', '_unique_id': 'b7d9b2d155eb4457906c3b58c234b09e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.907 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.909 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '853c0b40-3b26-45c7-811c-6e6658257e4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.909177', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0f6c0b4-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11413.984983742, 'message_signature': 'a368c7833b3363dfee606bd9b78dca7f1feb4fa43b23b623586421307fa130b9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': 
'1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.909177', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0f6d2f2-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11413.984983742, 'message_signature': '9b6e747a5f12ee6cb71dca749bd948efbe9f12fb3a1b719a3e9ab8351aa4f158'}]}, 'timestamp': '2025-11-23 09:51:10.910137', '_unique_id': '7bdcd83a05f442b7b5e6b37204f2f31d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.911 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.912 12 DEBUG ceilometer.compute.pollsters [-] 
355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '334c423a-9315-49a9-8b63-140f9866ef52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.912736', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f74bec-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': '8569f6e62fac87a2ca45ff54bd75a1d1b36174c7bc8480a845851847e6541b32'}]}, 'timestamp': '2025-11-23 
09:51:10.913263', '_unique_id': '8d4cc1c93f5748f780d88cfe33040503'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in 
_establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR 
oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:51:10.914 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.914 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.916 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '82d07b94-ed25-40fc-826a-cbe9c22234c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.915580', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0f7ba28-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': 'c28477c3c047ff47ab1e446b08a83f7e6db2f025452158fc48f5ca76f0f7d9db'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.915580', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0f7d076-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': 'b8bb84872df4be6d64efb4d777e0ae0b87074f70cfbbeb3270972e7fefc10e8c'}]}, 'timestamp': '2025-11-23 09:51:10.916628', '_unique_id': '533c7cf35ceb49a0ba9d36b432359b2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23
04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:51:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98,
in get
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:51:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 13490000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'eda64f0c-ba1f-4145-ba95-2a75926d0de0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13490000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:51:10.918052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'f0f81608-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.064666941, 'message_signature': '2898c540613a23299f8a4e9d1d756a647616f0fccc667c191966b4e922575cc1'}]}, 'timestamp': '2025-11-23 09:51:10.918333', '_unique_id': '0bd755fdf3084fa0b607d77669d655c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in
_reraise_as_library_errors
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12
ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]:
2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23
04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:51:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.918 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 04:51:10
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5636759-62cb-45f6-9b44-33560dd6dde6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.919652', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f8547e-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729,
'message_signature': 'a99d45e11a7d3ce264ac59c9f2198684d5d2e8594d2476a83c4cc579437fb17a'}]}, 'timestamp': '2025-11-23 09:51:10.919963', '_unique_id': '58af22b5b93c44f1bac61e5a4e969f5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:51:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py",
line 653, in _send
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.920 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4a98e950-7e91-4287-85c9-f61cbdeb6147', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.921401', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f89916-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': '5cb5fef50b3b39d5b09ec3b7de4bb9d67349a0c9e3c172ad8934308e2bfa3bf5'}]}, 'timestamp': '2025-11-23 09:51:10.921699', '_unique_id': '4e3596895c594787b17852b9bc6ca590'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fce0e00e-aaee-4404-9832-b73868dbcd48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.923160', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0f8dd90-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': '1cd41788db4d2934381ac29e6a36403b52fcfbd3cd4e15e28fa4156682182869'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.923160', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0f8e8ee-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': '27d1f57f8a4fef1e9e1119d9d98a97d50fee9b67e44417839f11723cbed93b29'}]}, 'timestamp': '2025-11-23 09:51:10.923727', '_unique_id': '329c3c20a5224545863157faf6e31f70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:51:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.924 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '262abf73-2b9b-46e3-ae77-9da8b3c7d347', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.925127', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f92a7a-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': '4210aff083d9660d31edfeb4552d779cbff2cbe6804337f134714eec3cbe3686'}]}, 'timestamp': '2025-11-23 09:51:10.925420', '_unique_id': '409135656b394b72b595640768bb073b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.927 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2b046b14-a0cd-43fb-891d-0fe024ee23d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:51:10.927242', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f0f97d18-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': 'd7680ee9d4825b751f0ce32014eefb8d006370904ff9dde678f792f95587aaeb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:51:10.927242', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f0f98916-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.011783299, 'message_signature': '8fcd23080fe7bcb6af579d930944df7ba8b7efbf85f727aaf3cc82dc39efbd6b'}]}, 'timestamp': '2025-11-23 09:51:10.927828', '_unique_id': 'b182b52deb2c40c38ad006887e5556e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.928 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0040a4c1-897f-4bd4-b0ed-1ebff5575028', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.929273', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0f9cca0-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': '7d6ed2c09aa486defc547aa4c1558ddde6c6aa6bee60ce7f4d408efd13bbecf5'}]}, 'timestamp': '2025-11-23 09:51:10.929575', '_unique_id': '8f02a1b991a04c888b810b4e7242ed4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.930 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf86e683-1b41-4e52-8293-902dbd2a0446', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:51:10.931010', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'f0fa1048-c851-11f0-bde4-fa163e72a351', 'monotonic_time': 11414.004314729, 'message_signature': '31efdb954757a95d8ca930947272468951e17ee11ab52e2015145a09e3ff0f67'}]}, 'timestamp': '2025-11-23 09:51:10.931304', '_unique_id': '70f03cf8dd9646fb8b1c599138f23d75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:51:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 
04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:51:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:51:10.931 12 ERROR oslo_messaging.notify.messaging Nov 23 04:51:11 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:11 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:11 localhost ceph-mon[289735]: Removed label mon from host np0005532582.localdomain Nov 23 04:51:11 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:11 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:51:11 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/etc/ceph/ceph.conf Nov 23 04:51:11 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf Nov 23 04:51:11 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf Nov 23 04:51:11 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf Nov 23 04:51:11 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf Nov 23 04:51:11 localhost podman[240668]: time="2025-11-23T09:51:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:51:11 localhost podman[240668]: @ - - [23/Nov/2025:09:51:11 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 04:51:11 localhost podman[240668]: @ - - [23/Nov/2025:09:51:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18698 "" "Go-http-client/1.1" Nov 23 04:51:12 localhost ceph-mon[289735]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:51:12 localhost ceph-mon[289735]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:51:12 localhost ceph-mon[289735]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:51:12 localhost ceph-mon[289735]: Updating np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:51:12 localhost ceph-mon[289735]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:51:12 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:12 localhost ceph-mon[289735]: Removed label mgr from host np0005532582.localdomain Nov 23 04:51:12 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:12 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:12 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:12 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:12 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:12 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' 
entity='mgr.np0005532584.naxwxy' Nov 23 04:51:12 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:12 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:12 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:12 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:12 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:12 localhost ceph-mon[289735]: Removing daemon mgr.np0005532582.gilwrz from np0005532582.localdomain -- ports [8765] Nov 23 04:51:13 localhost ceph-mon[289735]: mon.np0005532585@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:51:14 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:14 localhost ceph-mon[289735]: Removed label _admin from host np0005532582.localdomain Nov 23 04:51:15 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "mgr.np0005532582.gilwrz"} : dispatch Nov 23 04:51:15 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005532582.gilwrz"}]': finished Nov 23 04:51:15 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:15 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:15 localhost nova_compute[281952]: 2025-11-23 09:51:15.252 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:51:16 localhost ceph-mon[289735]: Removing key for mgr.np0005532582.gilwrz Nov 23 04:51:16 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:16 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:16 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:51:17 localhost ceph-mon[289735]: Removing np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:51:17 localhost ceph-mon[289735]: Removing np0005532582.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:51:17 localhost ceph-mon[289735]: Removing np0005532582.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:51:17 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:17 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:17 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:17 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:17 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532582.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:51:18 localhost ceph-mon[289735]: Reconfiguring crash.np0005532582 (monmap changed)... 
Nov 23 04:51:18 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532582 on np0005532582.localdomain Nov 23 04:51:18 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:18 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:18 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:51:18 localhost ceph-mon[289735]: mon.np0005532585@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:51:19 localhost ceph-mon[289735]: Reconfiguring mon.np0005532583 (monmap changed)... Nov 23 04:51:19 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532583 on np0005532583.localdomain Nov 23 04:51:19 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:19 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:19 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532583.orhywt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:51:20 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532583.orhywt (monmap changed)... 
Nov 23 04:51:20 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532583.orhywt on np0005532583.localdomain Nov 23 04:51:20 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:20 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:20 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532583.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:51:20 localhost nova_compute[281952]: 2025-11-23 09:51:20.254 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:51:20 localhost nova_compute[281952]: 2025-11-23 09:51:20.255 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:51:20 localhost nova_compute[281952]: 2025-11-23 09:51:20.255 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:51:20 localhost nova_compute[281952]: 2025-11-23 09:51:20.255 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:51:20 localhost nova_compute[281952]: 2025-11-23 09:51:20.256 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:51:20 localhost nova_compute[281952]: 2025-11-23 09:51:20.259 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:51:21 localhost 
ceph-mon[289735]: Reconfiguring crash.np0005532583 (monmap changed)... Nov 23 04:51:21 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain Nov 23 04:51:21 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:21 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:21 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:51:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:51:22 localhost systemd[1]: tmp-crun.ctg3Jb.mount: Deactivated successfully. Nov 23 04:51:22 localhost podman[297514]: 2025-11-23 09:51:22.02726085 +0000 UTC m=+0.083885619 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 04:51:22 localhost podman[297514]: 2025-11-23 09:51:22.042289164 +0000 UTC m=+0.098913893 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 23 04:51:22 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:51:22 localhost ceph-mon[289735]: Reconfiguring crash.np0005532584 (monmap changed)... Nov 23 04:51:22 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain Nov 23 04:51:22 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:22 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:22 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 23 04:51:23 localhost ceph-mon[289735]: Reconfiguring osd.2 (monmap changed)... 
Nov 23 04:51:23 localhost ceph-mon[289735]: Reconfiguring daemon osd.2 on np0005532584.localdomain Nov 23 04:51:23 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:23 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:23 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 23 04:51:23 localhost ceph-mon[289735]: Reconfiguring osd.5 (monmap changed)... Nov 23 04:51:23 localhost ceph-mon[289735]: Reconfiguring daemon osd.5 on np0005532584.localdomain Nov 23 04:51:23 localhost ceph-mon[289735]: mon.np0005532585@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:51:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 04:51:24 localhost podman[297533]: 2025-11-23 09:51:24.017499841 +0000 UTC m=+0.073911702 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:51:24 localhost podman[297533]: 2025-11-23 09:51:24.050397476 +0000 UTC m=+0.106809337 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:51:24 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:51:24 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:24 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:24 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:51:24 localhost ceph-mon[289735]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)... Nov 23 04:51:24 localhost ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain Nov 23 04:51:24 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:24 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:24 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:51:25 localhost nova_compute[281952]: 2025-11-23 09:51:25.257 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:51:25 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)... 
Nov 23 04:51:25 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain Nov 23 04:51:25 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:25 localhost ceph-mon[289735]: Added label _no_schedule to host np0005532582.localdomain Nov 23 04:51:25 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:25 localhost ceph-mon[289735]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005532582.localdomain Nov 23 04:51:25 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:25 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:25 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:51:26 localhost podman[297608]: Nov 23 04:51:26 localhost podman[297608]: 2025-11-23 09:51:26.874007595 +0000 UTC m=+0.074711626 container create 0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_black, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64) Nov 23 04:51:26 localhost ceph-mon[289735]: Reconfiguring mon.np0005532584 (monmap changed)... Nov 23 04:51:26 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain Nov 23 04:51:26 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:26 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:26 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:51:26 localhost systemd[1]: Started libpod-conmon-0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b.scope. Nov 23 04:51:26 localhost systemd[1]: Started libcrun container. 
Nov 23 04:51:26 localhost podman[297608]: 2025-11-23 09:51:26.843518515 +0000 UTC m=+0.044222566 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:51:26 localhost podman[297608]: 2025-11-23 09:51:26.943413236 +0000 UTC m=+0.144117267 container init 0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_black, io.openshift.tags=rhceph ceph, version=7, release=553, distribution-scope=public, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git) Nov 23 04:51:26 localhost podman[297608]: 2025-11-23 09:51:26.955715176 +0000 UTC m=+0.156419197 container start 0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_black, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, vcs-type=git, architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.tags=rhceph ceph, distribution-scope=public, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux ) Nov 23 04:51:26 localhost podman[297608]: 2025-11-23 09:51:26.956114838 +0000 UTC m=+0.156818899 container attach 0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_black, io.buildah.version=1.33.12, GIT_CLEAN=True, GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, name=rhceph, release=553, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git) Nov 23 04:51:26 localhost 
jolly_black[297623]: 167 167 Nov 23 04:51:26 localhost systemd[1]: libpod-0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b.scope: Deactivated successfully. Nov 23 04:51:26 localhost podman[297608]: 2025-11-23 09:51:26.972805613 +0000 UTC m=+0.173509604 container died 0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_black, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, com.redhat.component=rhceph-container, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, version=7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 04:51:27 localhost podman[297628]: 2025-11-23 09:51:27.055611517 +0000 UTC m=+0.071617969 container remove 0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_black, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, build-date=2025-09-24T08:57:55, release=553, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, 
io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, GIT_CLEAN=True, name=rhceph, distribution-scope=public, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 04:51:27 localhost systemd[1]: libpod-conmon-0cb7b3b3df718a54ec5817bf4807d24260a4dcdecd1c478415aeb07a14597f1b.scope: Deactivated successfully. Nov 23 04:51:27 localhost podman[297697]: Nov 23 04:51:27 localhost podman[297697]: 2025-11-23 09:51:27.65471325 +0000 UTC m=+0.052070116 container create f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_hamilton, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2025-09-24T08:57:55, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, version=7, 
io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:51:27 localhost systemd[1]: Started libpod-conmon-f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc.scope. Nov 23 04:51:27 localhost systemd[1]: Started libcrun container. Nov 23 04:51:27 localhost podman[297697]: 2025-11-23 09:51:27.705014902 +0000 UTC m=+0.102371788 container init f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_hamilton, vendor=Red Hat, Inc., version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True) Nov 23 04:51:27 localhost podman[297697]: 2025-11-23 09:51:27.713985919 +0000 UTC m=+0.111342795 container start f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_hamilton, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., ceph=True, architecture=x86_64, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 23 04:51:27 localhost jolly_hamilton[297712]: 167 167 Nov 23 04:51:27 localhost systemd[1]: libpod-f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc.scope: Deactivated successfully. 
Nov 23 04:51:27 localhost podman[297697]: 2025-11-23 09:51:27.714201325 +0000 UTC m=+0.111558201 container attach f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_hamilton, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, RELEASE=main, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., ceph=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-type=git, release=553, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 23 04:51:27 localhost podman[297697]: 2025-11-23 09:51:27.719153088 +0000 UTC m=+0.116509994 container died f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_hamilton, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph 
Storage 7, architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, RELEASE=main, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55) Nov 23 04:51:27 localhost podman[297697]: 2025-11-23 09:51:27.631270978 +0000 UTC m=+0.028627834 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:51:27 localhost podman[297717]: 2025-11-23 09:51:27.810432445 +0000 UTC m=+0.082604710 container remove f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_hamilton, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55) Nov 23 04:51:27 
localhost systemd[1]: libpod-conmon-f982188de59e7a512faeac8791bef11ddfb2ea96049021231af72258f3d5dcfc.scope: Deactivated successfully. Nov 23 04:51:27 localhost systemd[1]: var-lib-containers-storage-overlay-de045266c0b70436cf46a9f4dbb8c09f353a59902278a1e1bd6eb1745b0fd095-merged.mount: Deactivated successfully. Nov 23 04:51:28 localhost ceph-mon[289735]: Reconfiguring crash.np0005532585 (monmap changed)... Nov 23 04:51:28 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain Nov 23 04:51:28 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:28 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:28 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 23 04:51:28 localhost ceph-mon[289735]: Reconfiguring osd.0 (monmap changed)... 
Nov 23 04:51:28 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:28 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532582.localdomain"} : dispatch Nov 23 04:51:28 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532582.localdomain"}]': finished Nov 23 04:51:28 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:28 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:28 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 23 04:51:28 localhost ceph-mon[289735]: mon.np0005532585@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:51:28 localhost podman[297795]: Nov 23 04:51:28 localhost podman[297795]: 2025-11-23 09:51:28.60927435 +0000 UTC m=+0.075881862 container create dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_bose, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux , version=7, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.tags=rhceph 
ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, name=rhceph, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, ceph=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 23 04:51:28 localhost systemd[1]: Started libpod-conmon-dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399.scope. Nov 23 04:51:28 localhost systemd[1]: Started libcrun container. Nov 23 04:51:28 localhost podman[297795]: 2025-11-23 09:51:28.677875046 +0000 UTC m=+0.144482528 container init dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_bose, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, version=7, name=rhceph, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 04:51:28 localhost podman[297795]: 2025-11-23 
09:51:28.578573862 +0000 UTC m=+0.045181424 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:51:28 localhost podman[297795]: 2025-11-23 09:51:28.683409396 +0000 UTC m=+0.150016878 container start dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_bose, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git, ceph=True, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, name=rhceph, RELEASE=main) Nov 23 04:51:28 localhost podman[297795]: 2025-11-23 09:51:28.68351701 +0000 UTC m=+0.150124492 container attach dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_bose, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, 
com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 04:51:28 localhost flamboyant_bose[297810]: 167 167 Nov 23 04:51:28 localhost systemd[1]: libpod-dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399.scope: Deactivated successfully. Nov 23 04:51:28 localhost podman[297795]: 2025-11-23 09:51:28.687108631 +0000 UTC m=+0.153716173 container died dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_bose, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, name=rhceph, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, ceph=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, 
GIT_CLEAN=True, version=7, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553) Nov 23 04:51:28 localhost podman[297815]: 2025-11-23 09:51:28.787124816 +0000 UTC m=+0.086164599 container remove dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_bose, GIT_BRANCH=main, GIT_CLEAN=True, version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-09-24T08:57:55, name=rhceph, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, ceph=True, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 04:51:28 localhost systemd[1]: libpod-conmon-dc6dc574e6f58f83ef5c35f649f5fef073a6172cde1851fa348af4bc419ec399.scope: Deactivated successfully. Nov 23 04:51:28 localhost systemd[1]: tmp-crun.TRFUem.mount: Deactivated successfully. Nov 23 04:51:28 localhost systemd[1]: var-lib-containers-storage-overlay-479a262231933904238feb63bcc0dc71b5c814c0bc6e19ad648ea609f66ddc6a-merged.mount: Deactivated successfully. 
Nov 23 04:51:29 localhost ceph-mon[289735]: Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:51:29 localhost ceph-mon[289735]: Removed host np0005532582.localdomain Nov 23 04:51:29 localhost ceph-mon[289735]: Reconfiguring osd.3 (monmap changed)... Nov 23 04:51:29 localhost ceph-mon[289735]: Reconfiguring daemon osd.3 on np0005532585.localdomain Nov 23 04:51:29 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:29 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:29 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:51:29 localhost podman[297892]: Nov 23 04:51:29 localhost podman[297892]: 2025-11-23 09:51:29.596924529 +0000 UTC m=+0.066101870 container create 7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kalam, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, vcs-type=git, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, 
GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7) Nov 23 04:51:29 localhost systemd[1]: Started libpod-conmon-7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc.scope. Nov 23 04:51:29 localhost systemd[1]: Started libcrun container. Nov 23 04:51:29 localhost podman[297892]: 2025-11-23 09:51:29.657477797 +0000 UTC m=+0.126655138 container init 7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kalam, vcs-type=git, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, io.openshift.expose-services=) Nov 23 04:51:29 localhost podman[297892]: 2025-11-23 09:51:29.667371173 +0000 UTC m=+0.136548524 container start 7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kalam, RELEASE=main, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, release=553, GIT_CLEAN=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, distribution-scope=public, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:51:29 localhost podman[297892]: 2025-11-23 09:51:29.667971931 +0000 UTC m=+0.137149342 container attach 7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kalam, description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, GIT_BRANCH=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:51:29 localhost kind_kalam[297907]: 167 167 Nov 23 04:51:29 localhost systemd[1]: libpod-7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc.scope: Deactivated successfully. Nov 23 04:51:29 localhost podman[297892]: 2025-11-23 09:51:29.572194447 +0000 UTC m=+0.041371778 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:51:29 localhost podman[297892]: 2025-11-23 09:51:29.67183865 +0000 UTC m=+0.141016011 container died 7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kalam, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, architecture=x86_64, RELEASE=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main) Nov 23 04:51:29 localhost sshd[297913]: main: sshd: ssh-rsa algorithm is disabled Nov 23 
04:51:29 localhost podman[297914]: 2025-11-23 09:51:29.771699751 +0000 UTC m=+0.088338446 container remove 7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kalam, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.expose-services=, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux , release=553) Nov 23 04:51:29 localhost systemd[1]: libpod-conmon-7790a267a5f4f32793410a7f9144801d3ba478abc846827964bfa2543d78facc.scope: Deactivated successfully. Nov 23 04:51:29 localhost systemd-logind[761]: New session 67 of user tripleo-admin. Nov 23 04:51:29 localhost systemd[1]: Created slice User Slice of UID 1003. Nov 23 04:51:29 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Nov 23 04:51:29 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Nov 23 04:51:29 localhost systemd[1]: Starting User Manager for UID 1003... Nov 23 04:51:29 localhost systemd[1]: var-lib-containers-storage-overlay-fe6ea3d5aaa49791904d8d0a71b2aeed2bf21b87520fe710a00ce138e37436f8-merged.mount: Deactivated successfully. 
Nov 23 04:51:29 localhost openstack_network_exporter[242668]: ERROR 09:51:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:51:29 localhost openstack_network_exporter[242668]: ERROR 09:51:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:51:29 localhost openstack_network_exporter[242668]: ERROR 09:51:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:51:29 localhost openstack_network_exporter[242668]: ERROR 09:51:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:51:29 localhost openstack_network_exporter[242668]: Nov 23 04:51:29 localhost openstack_network_exporter[242668]: ERROR 09:51:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:51:29 localhost openstack_network_exporter[242668]: Nov 23 04:51:30 localhost systemd[297934]: Queued start job for default target Main User Target. Nov 23 04:51:30 localhost systemd[297934]: Created slice User Application Slice. Nov 23 04:51:30 localhost systemd[297934]: Started Mark boot as successful after the user session has run 2 minutes. Nov 23 04:51:30 localhost systemd[297934]: Started Daily Cleanup of User's Temporary Directories. Nov 23 04:51:30 localhost systemd[297934]: Reached target Paths. Nov 23 04:51:30 localhost systemd[297934]: Reached target Timers. Nov 23 04:51:30 localhost systemd[297934]: Starting D-Bus User Message Bus Socket... Nov 23 04:51:30 localhost systemd[297934]: Starting Create User's Volatile Files and Directories... Nov 23 04:51:30 localhost systemd[297934]: Listening on D-Bus User Message Bus Socket. Nov 23 04:51:30 localhost systemd[297934]: Reached target Sockets. Nov 23 04:51:30 localhost systemd[297934]: Finished Create User's Volatile Files and Directories. 
Nov 23 04:51:30 localhost systemd[297934]: Reached target Basic System. Nov 23 04:51:30 localhost systemd[1]: Started User Manager for UID 1003. Nov 23 04:51:30 localhost systemd[297934]: Reached target Main User Target. Nov 23 04:51:30 localhost systemd[297934]: Startup finished in 162ms. Nov 23 04:51:30 localhost systemd[1]: Started Session 67 of User tripleo-admin. Nov 23 04:51:30 localhost ceph-mon[289735]: Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)... Nov 23 04:51:30 localhost ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain Nov 23 04:51:30 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:30 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:30 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:51:30 localhost nova_compute[281952]: 2025-11-23 09:51:30.261 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:51:30 localhost podman[298077]: Nov 23 04:51:30 localhost podman[298077]: 2025-11-23 09:51:30.478668022 +0000 UTC m=+0.076466970 container create 8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_chandrasekhar, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph 
Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vcs-type=git, name=rhceph, io.buildah.version=1.33.12) Nov 23 04:51:30 localhost systemd[1]: Started libpod-conmon-8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704.scope. Nov 23 04:51:30 localhost podman[298077]: 2025-11-23 09:51:30.447790669 +0000 UTC m=+0.045589627 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:51:30 localhost systemd[1]: Started libcrun container. Nov 23 04:51:30 localhost podman[298077]: 2025-11-23 09:51:30.577217712 +0000 UTC m=+0.175016660 container init 8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_chandrasekhar, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.openshift.expose-services=, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 23 04:51:30 localhost podman[298077]: 2025-11-23 09:51:30.587752797 +0000 UTC m=+0.185551745 container start 8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_chandrasekhar, version=7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=553, distribution-scope=public, name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, ceph=True, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux ) Nov 23 04:51:30 localhost podman[298077]: 2025-11-23 09:51:30.588084627 +0000 UTC m=+0.185883625 container attach 8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_chandrasekhar, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
maintainer=Guillaume Abrioux , ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, release=553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.expose-services=) Nov 23 04:51:30 localhost vigorous_chandrasekhar[298109]: 167 167 Nov 23 04:51:30 localhost systemd[1]: libpod-8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704.scope: Deactivated successfully. 
Nov 23 04:51:30 localhost podman[298077]: 2025-11-23 09:51:30.593012839 +0000 UTC m=+0.190811837 container died 8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_chandrasekhar, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, release=553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, io.buildah.version=1.33.12, RELEASE=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph) Nov 23 04:51:30 localhost podman[298130]: 2025-11-23 09:51:30.700813226 +0000 UTC m=+0.094329852 container remove 8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_chandrasekhar, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., release=553, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., version=7, distribution-scope=public, name=rhceph, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:51:30 localhost systemd[1]: libpod-conmon-8848252e5c1ab4a9617f5c76a972e7e261e44fe37b1da02b49dbd517fab6c704.scope: Deactivated successfully. Nov 23 04:51:30 localhost python3[298167]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line= - ip_netmask: 172.18.0.104/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 23 04:51:30 localhost systemd[1]: var-lib-containers-storage-overlay-0869e6f5bd71a32330c1fe830274780e13e8dc09eb2283d10474c8ffc37805ab-merged.mount: Deactivated successfully. Nov 23 04:51:31 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)... 
Nov 23 04:51:31 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain Nov 23 04:51:31 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:31 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:31 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:51:31 localhost ceph-mon[289735]: Reconfiguring mon.np0005532585 (monmap changed)... Nov 23 04:51:31 localhost ceph-mon[289735]: Reconfiguring daemon mon.np0005532585 on np0005532585.localdomain Nov 23 04:51:31 localhost podman[298295]: Nov 23 04:51:31 localhost podman[298295]: 2025-11-23 09:51:31.402355358 +0000 UTC m=+0.084679463 container create 9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_shtern, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:51:31 localhost systemd[1]: Started libpod-conmon-9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32.scope. Nov 23 04:51:31 localhost systemd[1]: Started libcrun container. Nov 23 04:51:31 localhost podman[298295]: 2025-11-23 09:51:31.455917011 +0000 UTC m=+0.138241146 container init 9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_shtern, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, architecture=x86_64, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, ceph=True, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vendor=Red Hat, Inc.) 
Nov 23 04:51:31 localhost podman[298295]: 2025-11-23 09:51:31.462932397 +0000 UTC m=+0.145256492 container start 9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_shtern, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, release=553, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux ) Nov 23 04:51:31 localhost busy_shtern[298338]: 167 167 Nov 23 04:51:31 localhost podman[298295]: 2025-11-23 09:51:31.463156184 +0000 UTC m=+0.145480329 container attach 9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_shtern, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.buildah.version=1.33.12, RELEASE=main, io.openshift.tags=rhceph ceph, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, 
io.openshift.expose-services=, ceph=True, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container) Nov 23 04:51:31 localhost systemd[1]: libpod-9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32.scope: Deactivated successfully. Nov 23 04:51:31 localhost podman[298295]: 2025-11-23 09:51:31.466193908 +0000 UTC m=+0.148518033 container died 9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_shtern, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, release=553, RELEASE=main, io.buildah.version=1.33.12, GIT_CLEAN=True, name=rhceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, distribution-scope=public, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 04:51:31 localhost podman[298295]: 2025-11-23 09:51:31.371324311 +0000 UTC m=+0.053648496 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:51:31 localhost podman[298349]: 2025-11-23 09:51:31.531281976 +0000 UTC m=+0.061732206 container remove 9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_shtern, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, name=rhceph, ceph=True, build-date=2025-09-24T08:57:55, release=553, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, com.redhat.component=rhceph-container) Nov 23 04:51:31 localhost systemd[1]: libpod-conmon-9f54aec98eddba62d22aeb8a26e4e37b1a30ef440e2bf4bd26f416f1ec9dcb32.scope: Deactivated successfully. 
Nov 23 04:51:31 localhost python3[298403]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.104/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:51:31 localhost systemd[1]: var-lib-containers-storage-overlay-01e53491aa1cafe287592c7c2fbb8b8ac2528f37f535f1d3c7046663c850de0c-merged.mount: Deactivated successfully. Nov 23 04:51:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:51:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:51:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:51:32 localhost podman[298511]: 2025-11-23 09:51:32.297905427 +0000 UTC m=+0.085681735 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:51:32 localhost podman[298511]: 2025-11-23 09:51:32.336245569 +0000 UTC m=+0.124021827 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 04:51:32 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:51:32 localhost podman[298512]: 2025-11-23 09:51:32.35409701 +0000 UTC m=+0.138769202 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 23 04:51:32 localhost podman[298512]: 2025-11-23 09:51:32.369223157 +0000 UTC m=+0.153895379 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 
'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-type=git, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly.) Nov 23 04:51:32 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:51:32 localhost podman[298530]: 2025-11-23 09:51:32.413279106 +0000 UTC m=+0.184089910 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:51:32 localhost podman[298530]: 2025-11-23 09:51:32.447347167 +0000 UTC m=+0.218157931 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team) Nov 23 04:51:32 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:51:32 localhost python3[298597]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.104 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0. Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.611693) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25 Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891492611751, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2583, "num_deletes": 254, "total_data_size": 8106438, "memory_usage": 8660384, "flush_reason": "Manual Compaction"} Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891492631483, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 4896659, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15522, "largest_seqno": 18100, "table_properties": {"data_size": 4885941, "index_size": 6583, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27662, 
"raw_average_key_size": 22, "raw_value_size": 4862418, "raw_average_value_size": 3969, "num_data_blocks": 286, "num_entries": 1225, "num_filter_entries": 1225, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891440, "oldest_key_time": 1763891440, "file_creation_time": 1763891492, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 19849 microseconds, and 9831 cpu microseconds. Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.631543) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 4896659 bytes OK Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.631571) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.633076) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.633097) EVENT_LOG_v1 {"time_micros": 1763891492633090, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.633120) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 8093887, prev total WAL file size 8093887, number of live WAL files 2. Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.634593) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. 
'7061786F73003131303435' seq:0, type:0; will stop at (end) Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(4781KB)], [24(13MB)] Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891492634656, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 19321946, "oldest_snapshot_seqno": -1} Nov 23 04:51:32 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:32 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:32 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:51:32 localhost ceph-mon[289735]: Reconfiguring crash.np0005532586 (monmap changed)... 
Nov 23 04:51:32 localhost ceph-mon[289735]: Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain Nov 23 04:51:32 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:32 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:32 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 10612 keys, 16121559 bytes, temperature: kUnknown Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891492716461, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 16121559, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16058820, "index_size": 35118, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26565, "raw_key_size": 284598, "raw_average_key_size": 26, "raw_value_size": 15875185, "raw_average_value_size": 1495, "num_data_blocks": 1346, "num_entries": 10612, "num_filter_entries": 10612, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891318, "oldest_key_time": 0, 
"file_creation_time": 1763891492, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "f90877de-8e0c-4aa9-bd89-60d6d2f6e09f", "db_session_id": "8ON8PRI8V1RJ4RVNWHFL", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}} Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.716940) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 16121559 bytes Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.718580) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 235.7 rd, 196.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.7, 13.8 +0.0 blob) out(15.4 +0.0 blob), read-write-amplify(7.2) write-amplify(3.3) OK, records in: 11164, records dropped: 552 output_compression: NoCompression Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.718614) EVENT_LOG_v1 {"time_micros": 1763891492718600, "job": 12, "event": "compaction_finished", "compaction_time_micros": 81977, "compaction_time_cpu_micros": 49452, "output_level": 6, "num_output_files": 1, "total_output_size": 16121559, "num_input_records": 11164, "num_output_records": 10612, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: 
EVENT_LOG_v1 {"time_micros": 1763891492719489, "job": 12, "event": "table_file_deletion", "file_number": 26} Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891492721835, "job": 12, "event": "table_file_deletion", "file_number": 24} Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.634526) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.721945) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.721952) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.721955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.721958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:51:32 localhost ceph-mon[289735]: rocksdb: (Original Log Time 2025/11/23-09:51:32.721961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:51:32 localhost systemd[1]: tmp-crun.vxo10U.mount: Deactivated successfully. Nov 23 04:51:33 localhost ceph-mon[289735]: mon.np0005532585@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:51:33 localhost ceph-mon[289735]: Reconfiguring osd.1 (monmap changed)... 
Nov 23 04:51:33 localhost ceph-mon[289735]: Reconfiguring daemon osd.1 on np0005532586.localdomain Nov 23 04:51:33 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:33 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:33 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 23 04:51:34 localhost ceph-mon[289735]: Reconfiguring osd.4 (monmap changed)... Nov 23 04:51:34 localhost ceph-mon[289735]: Reconfiguring daemon osd.4 on np0005532586.localdomain Nov 23 04:51:35 localhost nova_compute[281952]: 2025-11-23 09:51:35.263 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:51:35 localhost nova_compute[281952]: 2025-11-23 09:51:35.267 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:51:35 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:35 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:35 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:51:35 localhost ceph-mon[289735]: Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)... 
Nov 23 04:51:35 localhost ceph-mon[289735]: Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain Nov 23 04:51:35 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:35 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:35 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:51:36 localhost ceph-mon[289735]: Reconfiguring mgr.np0005532586.thmvqb (monmap changed)... Nov 23 04:51:36 localhost ceph-mon[289735]: Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain Nov 23 04:51:36 localhost ceph-mon[289735]: Saving service mon spec with placement label:mon Nov 23 04:51:36 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:36 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:36 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:36 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:51:36 localhost ceph-mon[289735]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:38 localhost ceph-mon[289735]: mon.np0005532585@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:51:38 localhost sshd[298644]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:51:38 localhost ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb159600 mon_map magic: 0 from 
mon.2 v2:172.18.0.107:3300/0 Nov 23 04:51:38 localhost ceph-mon[289735]: mon.np0005532585@2(peon) e11 removed from monmap, suicide. Nov 23 04:51:38 localhost ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb159080 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0 Nov 23 04:51:38 localhost ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x55eecb158f20 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0 Nov 23 04:51:38 localhost podman[298659]: 2025-11-23 09:51:38.737587005 +0000 UTC m=+0.053740169 container died 3181a32eddec18c5a28b6225f78da2b1d77c1a7c16c3ef6ab437e2c19f4ee803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, release=553, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., GIT_BRANCH=main, RELEASE=main, ceph=True, name=rhceph, io.openshift.expose-services=, version=7, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:51:38 localhost systemd[1]: tmp-crun.zznJAV.mount: Deactivated successfully. 
Nov 23 04:51:38 localhost systemd[1]: var-lib-containers-storage-overlay-d7e5200175f7b50f5204c53bc0525154e5a28b304a579162ed27351a6796afc1-merged.mount: Deactivated successfully. Nov 23 04:51:38 localhost podman[298659]: 2025-11-23 09:51:38.777547819 +0000 UTC m=+0.093700943 container remove 3181a32eddec18c5a28b6225f78da2b1d77c1a7c16c3ef6ab437e2c19f4ee803 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, io.buildah.version=1.33.12, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-type=git, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:51:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:51:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:51:39 localhost podman[298760]: 2025-11-23 09:51:39.410081912 +0000 UTC m=+0.144365375 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 04:51:39 localhost podman[298760]: 2025-11-23 09:51:39.420460532 +0000 UTC m=+0.154743955 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:51:39 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:51:39 localhost podman[298758]: 2025-11-23 09:51:39.508125336 +0000 UTC m=+0.246102972 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118) Nov 23 04:51:39 localhost podman[298758]: 2025-11-23 09:51:39.525238545 +0000 UTC m=+0.263216181 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:51:39 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:51:39 localhost systemd[1]: tmp-crun.AMozBn.mount: Deactivated successfully. 
Nov 23 04:51:39 localhost systemd[1]: ceph-46550e70-79cb-5f55-bf6d-1204b97e083b@mon.np0005532585.service: Deactivated successfully. Nov 23 04:51:39 localhost systemd[1]: Stopped Ceph mon.np0005532585 for 46550e70-79cb-5f55-bf6d-1204b97e083b. Nov 23 04:51:39 localhost systemd[1]: ceph-46550e70-79cb-5f55-bf6d-1204b97e083b@mon.np0005532585.service: Consumed 8.465s CPU time. Nov 23 04:51:39 localhost systemd[1]: Reloading. Nov 23 04:51:40 localhost systemd-rc-local-generator[298856]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:51:40 localhost systemd-sysv-generator[298860]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 23 04:51:40 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:40 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:40 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:40 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 23 04:51:40 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:40 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:40 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:40 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:40 localhost nova_compute[281952]: 2025-11-23 09:51:40.265 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:51:41 localhost systemd[1]: tmp-crun.ygnu2R.mount: Deactivated successfully. Nov 23 04:51:41 localhost podman[298980]: 2025-11-23 09:51:41.371098551 +0000 UTC m=+0.096035604 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, RELEASE=main, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, name=rhceph, architecture=x86_64) Nov 23 04:51:41 localhost podman[298980]: 2025-11-23 09:51:41.506291292 +0000 UTC m=+0.231228345 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, distribution-scope=public, ceph=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.33.12, RELEASE=main, build-date=2025-09-24T08:57:55) Nov 23 04:51:41 localhost podman[240668]: time="2025-11-23T09:51:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:51:41 localhost podman[240668]: @ - - [23/Nov/2025:09:51:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 151667 "" "Go-http-client/1.1" Nov 23 04:51:41 localhost podman[240668]: @ - - 
[23/Nov/2025:09:51:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18220 "" "Go-http-client/1.1" Nov 23 04:51:45 localhost nova_compute[281952]: 2025-11-23 09:51:45.268 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:51:50 localhost nova_compute[281952]: 2025-11-23 09:51:50.273 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:51:50 localhost nova_compute[281952]: 2025-11-23 09:51:50.275 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:51:50 localhost nova_compute[281952]: 2025-11-23 09:51:50.275 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:51:50 localhost nova_compute[281952]: 2025-11-23 09:51:50.275 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:51:50 localhost nova_compute[281952]: 2025-11-23 09:51:50.281 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:51:50 localhost nova_compute[281952]: 2025-11-23 09:51:50.282 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:51:51 localhost sshd[299416]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:51:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:51:52 localhost systemd[1]: tmp-crun.13zmaT.mount: Deactivated successfully. 
Nov 23 04:51:52 localhost podman[299468]: 2025-11-23 09:51:52.441087047 +0000 UTC m=+0.099437898 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute) Nov 23 04:51:52 localhost podman[299468]: 2025-11-23 09:51:52.452302143 +0000 UTC m=+0.110652944 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute) Nov 23 04:51:52 localhost podman[299477]: Nov 23 04:51:52 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 04:51:52 localhost podman[299477]: 2025-11-23 09:51:52.473374404 +0000 UTC m=+0.098865402 container create ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_cartwright, RELEASE=main, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_CLEAN=True, GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:51:52 localhost systemd[1]: Started libpod-conmon-ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b.scope. Nov 23 04:51:52 localhost podman[299477]: 2025-11-23 09:51:52.436348631 +0000 UTC m=+0.061839639 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:51:52 localhost systemd[1]: Started libcrun container. 
Nov 23 04:51:52 localhost podman[299477]: 2025-11-23 09:51:52.55367307 +0000 UTC m=+0.179164048 container init ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_cartwright, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.openshift.expose-services=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, version=7, RELEASE=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55) Nov 23 04:51:52 localhost podman[299477]: 2025-11-23 09:51:52.567562099 +0000 UTC m=+0.193053077 container start ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_cartwright, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.33.12, release=553, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , distribution-scope=public, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, RELEASE=main, version=7) Nov 23 04:51:52 localhost podman[299477]: 2025-11-23 09:51:52.567858558 +0000 UTC m=+0.193349546 container attach ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_cartwright, GIT_CLEAN=True, RELEASE=main, name=rhceph, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, ceph=True, version=7, release=553, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12) Nov 23 04:51:52 localhost great_cartwright[299504]: 167 167 Nov 23 04:51:52 localhost systemd[1]: 
libpod-ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b.scope: Deactivated successfully. Nov 23 04:51:52 localhost podman[299477]: 2025-11-23 09:51:52.571648315 +0000 UTC m=+0.197139333 container died ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_cartwright, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, release=553, maintainer=Guillaume Abrioux , name=rhceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, CEPH_POINT_RELEASE=) Nov 23 04:51:52 localhost podman[299509]: 2025-11-23 09:51:52.666390168 +0000 UTC m=+0.085973803 container remove ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_cartwright, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc., ceph=True, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , version=7, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, release=553, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 23 04:51:52 localhost systemd[1]: libpod-conmon-ca057daed2b9a90232f4cb4665f701e6821cbb92050ce04b7620b72a7c425e8b.scope: Deactivated successfully. Nov 23 04:51:53 localhost podman[299579]: Nov 23 04:51:53 localhost systemd[1]: var-lib-containers-storage-overlay-23ae74c38d711b8b9a8ca0a30471e2bc5566d22040d2eaf932254cefbb30d7b4-merged.mount: Deactivated successfully. 
Nov 23 04:51:53 localhost podman[299579]: 2025-11-23 09:51:53.428224941 +0000 UTC m=+0.082959610 container create 2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_goldstine, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_BRANCH=main, RELEASE=main, version=7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 04:51:53 localhost systemd[1]: Started libpod-conmon-2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7.scope. Nov 23 04:51:53 localhost systemd[1]: Started libcrun container. 
Nov 23 04:51:53 localhost podman[299579]: 2025-11-23 09:51:53.390860498 +0000 UTC m=+0.045595247 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:51:53 localhost podman[299579]: 2025-11-23 09:51:53.49269341 +0000 UTC m=+0.147428079 container init 2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_goldstine, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, release=553, build-date=2025-09-24T08:57:55, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, vcs-type=git, io.buildah.version=1.33.12, RELEASE=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:51:53 localhost podman[299579]: 2025-11-23 09:51:53.503806592 +0000 UTC m=+0.158541261 container start 2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_goldstine, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, architecture=x86_64, RELEASE=main, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux , release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, CEPH_POINT_RELEASE=) Nov 23 04:51:53 localhost podman[299579]: 2025-11-23 09:51:53.50406723 +0000 UTC m=+0.158801899 container attach 2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_goldstine, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, name=rhceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, ceph=True, version=7, vcs-type=git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 
04:51:53 localhost sleepy_goldstine[299594]: 167 167 Nov 23 04:51:53 localhost systemd[1]: libpod-2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7.scope: Deactivated successfully. Nov 23 04:51:53 localhost podman[299579]: 2025-11-23 09:51:53.507367693 +0000 UTC m=+0.162102392 container died 2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_goldstine, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, distribution-scope=public, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main) Nov 23 04:51:53 localhost podman[299599]: 2025-11-23 09:51:53.600190447 +0000 UTC m=+0.085439458 container remove 2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_goldstine, release=553, io.openshift.tags=rhceph ceph, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, version=7, 
GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, distribution-scope=public, GIT_BRANCH=main) Nov 23 04:51:53 localhost systemd[1]: libpod-conmon-2eace93fc46c74b09376c89992d986f3144ddf6c520f251ecfbe1588f1ac9aa7.scope: Deactivated successfully. Nov 23 04:51:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 04:51:54 localhost podman[299660]: 2025-11-23 09:51:54.275723847 +0000 UTC m=+0.083065884 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:51:54 localhost podman[299660]: 2025-11-23 09:51:54.291269616 +0000 UTC m=+0.098611603 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:51:54 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:51:54 localhost podman[299699]: Nov 23 04:51:54 localhost systemd[1]: tmp-crun.KnNkxO.mount: Deactivated successfully. Nov 23 04:51:54 localhost systemd[1]: var-lib-containers-storage-overlay-8f101ae6276431bad24401ca6c7eed7501adeb83fe683350b44a57b67c6504b2-merged.mount: Deactivated successfully. Nov 23 04:51:54 localhost podman[299699]: 2025-11-23 09:51:54.427142608 +0000 UTC m=+0.078493702 container create c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mcclintock, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, name=rhceph, ceph=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, distribution-scope=public) Nov 23 04:51:54 localhost systemd[1]: Started 
libpod-conmon-c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4.scope. Nov 23 04:51:54 localhost systemd[1]: Started libcrun container. Nov 23 04:51:54 localhost podman[299699]: 2025-11-23 09:51:54.394575193 +0000 UTC m=+0.045926357 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:51:54 localhost podman[299699]: 2025-11-23 09:51:54.506327181 +0000 UTC m=+0.157678275 container init c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mcclintock, version=7, io.openshift.expose-services=, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, name=rhceph, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., release=553, vcs-type=git, architecture=x86_64, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55) Nov 23 04:51:54 localhost podman[299699]: 2025-11-23 09:51:54.518613771 +0000 UTC m=+0.169964865 container start c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mcclintock, com.redhat.component=rhceph-container, release=553, architecture=x86_64, vendor=Red Hat, Inc., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vcs-type=git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True) Nov 23 04:51:54 localhost podman[299699]: 2025-11-23 09:51:54.518884559 +0000 UTC m=+0.170235673 container attach c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mcclintock, io.openshift.expose-services=, name=rhceph, vendor=Red Hat, Inc., release=553, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, ceph=True, 
com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True) Nov 23 04:51:54 localhost clever_mcclintock[299714]: 167 167 Nov 23 04:51:54 localhost systemd[1]: libpod-c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4.scope: Deactivated successfully. Nov 23 04:51:54 localhost podman[299699]: 2025-11-23 09:51:54.521593873 +0000 UTC m=+0.172944967 container died c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mcclintock, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, maintainer=Guillaume Abrioux , vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.tags=rhceph ceph) Nov 23 04:51:54 localhost podman[299719]: 2025-11-23 09:51:54.618392908 +0000 UTC m=+0.084296131 container remove c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_mcclintock, GIT_CLEAN=True, name=rhceph, architecture=x86_64, 
vcs-type=git, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, version=7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553) Nov 23 04:51:54 localhost systemd[1]: libpod-conmon-c18181ccb1170a75f95380d6b7824458762eea3e39e229cbd6fbe2a2a69503e4.scope: Deactivated successfully. Nov 23 04:51:55 localhost nova_compute[281952]: 2025-11-23 09:51:55.283 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:51:55 localhost podman[299838]: Nov 23 04:51:55 localhost systemd[1]: var-lib-containers-storage-overlay-0a686a22fcbc80824f12418328d51472afb4a2c25dbb4a8c0bba9f01058edc01-merged.mount: Deactivated successfully. 
Nov 23 04:51:55 localhost podman[299838]: 2025-11-23 09:51:55.423612151 +0000 UTC m=+0.067848995 container create 65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_noyce, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, release=553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, distribution-scope=public, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64) Nov 23 04:51:55 localhost systemd[1]: Started libpod-conmon-65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593.scope. Nov 23 04:51:55 localhost systemd[1]: Started libcrun container. 
Nov 23 04:51:55 localhost podman[299838]: 2025-11-23 09:51:55.482393774 +0000 UTC m=+0.126630648 container init 65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_noyce, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , release=553, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.33.12) Nov 23 04:51:55 localhost podman[299838]: 2025-11-23 09:51:55.495192828 +0000 UTC m=+0.139429702 container start 65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_noyce, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_BRANCH=main, 
CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, com.redhat.component=rhceph-container, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph) Nov 23 04:51:55 localhost podman[299838]: 2025-11-23 09:51:55.495514718 +0000 UTC m=+0.139751592 container attach 65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_noyce, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., release=553, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.openshift.expose-services=, version=7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public) Nov 23 04:51:55 localhost zealous_noyce[299859]: 167 167 Nov 23 04:51:55 localhost systemd[1]: libpod-65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593.scope: 
Deactivated successfully. Nov 23 04:51:55 localhost podman[299838]: 2025-11-23 09:51:55.398556138 +0000 UTC m=+0.042793082 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:51:55 localhost podman[299838]: 2025-11-23 09:51:55.498072327 +0000 UTC m=+0.142309231 container died 65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_noyce, distribution-scope=public, io.buildah.version=1.33.12, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, architecture=x86_64, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.openshift.expose-services=) Nov 23 04:51:55 localhost podman[299864]: 2025-11-23 09:51:55.588979002 +0000 UTC m=+0.078344158 container remove 65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_noyce, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, maintainer=Guillaume 
Abrioux , vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_CLEAN=True) Nov 23 04:51:55 localhost systemd[1]: libpod-conmon-65d11b7488e1e99856c1ba3eaa00b41e58e4221c29b6190b1e25829c636c8593.scope: Deactivated successfully. Nov 23 04:51:55 localhost podman[299948]: Nov 23 04:51:55 localhost podman[299948]: 2025-11-23 09:51:55.922405249 +0000 UTC m=+0.077471042 container create ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_nightingale, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, release=553, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, RELEASE=main, GIT_BRANCH=main) Nov 23 04:51:55 localhost systemd[1]: Started libpod-conmon-ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236.scope. Nov 23 04:51:55 localhost systemd[1]: Started libcrun container. Nov 23 04:51:55 localhost podman[299948]: 2025-11-23 09:51:55.977458676 +0000 UTC m=+0.132524479 container init ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_nightingale, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, release=553, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , distribution-scope=public, version=7, io.buildah.version=1.33.12, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph) Nov 23 04:51:55 localhost podman[299948]: 2025-11-23 09:51:55.986473155 +0000 UTC m=+0.141538948 container start ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_nightingale, 
com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, name=rhceph, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_BRANCH=main) Nov 23 04:51:55 localhost podman[299948]: 2025-11-23 09:51:55.986806735 +0000 UTC m=+0.141872548 container attach ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_nightingale, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, release=553, name=rhceph, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.33.12, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
vcs-type=git, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=) Nov 23 04:51:55 localhost priceless_nightingale[299964]: 167 167 Nov 23 04:51:55 localhost systemd[1]: libpod-ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236.scope: Deactivated successfully. Nov 23 04:51:55 localhost podman[299948]: 2025-11-23 09:51:55.990092017 +0000 UTC m=+0.145157860 container died ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_nightingale, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, RELEASE=main, version=7, release=553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:51:55 localhost podman[299948]: 2025-11-23 09:51:55.892287839 +0000 UTC m=+0.047353652 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:51:56 localhost podman[299969]: 
2025-11-23 09:51:56.089189183 +0000 UTC m=+0.091948077 container remove ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_nightingale, version=7, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_CLEAN=True) Nov 23 04:51:56 localhost systemd[1]: libpod-conmon-ab38be55198225b4573dce54e1f0819196d4f49fd4f8451560aef387c2b87236.scope: Deactivated successfully. 
Nov 23 04:51:56 localhost podman[299985]: Nov 23 04:51:56 localhost podman[299985]: 2025-11-23 09:51:56.197153295 +0000 UTC m=+0.070319491 container create 8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_poitras, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_BRANCH=main, maintainer=Guillaume Abrioux , ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main) Nov 23 04:51:56 localhost systemd[1]: Started libpod-conmon-8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9.scope. Nov 23 04:51:56 localhost systemd[1]: Started libcrun container. 
Nov 23 04:51:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/721aca86e3f52d984b53caa7b23302e07b2baf6ff16b1aef6298de407cbbb65e/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff) Nov 23 04:51:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/721aca86e3f52d984b53caa7b23302e07b2baf6ff16b1aef6298de407cbbb65e/merged/tmp/config supports timestamps until 2038 (0x7fffffff) Nov 23 04:51:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/721aca86e3f52d984b53caa7b23302e07b2baf6ff16b1aef6298de407cbbb65e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 04:51:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/721aca86e3f52d984b53caa7b23302e07b2baf6ff16b1aef6298de407cbbb65e/merged/var/lib/ceph/mon/ceph-np0005532585 supports timestamps until 2038 (0x7fffffff) Nov 23 04:51:56 localhost podman[299985]: 2025-11-23 09:51:56.252194112 +0000 UTC m=+0.125360328 container init 8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_poitras, com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, description=Red Hat Ceph Storage 7, 
CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, vendor=Red Hat, Inc., release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True) Nov 23 04:51:56 localhost podman[299985]: 2025-11-23 09:51:56.26022199 +0000 UTC m=+0.133388186 container start 8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_poitras, GIT_CLEAN=True, ceph=True, RELEASE=main, GIT_BRANCH=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:51:56 localhost podman[299985]: 2025-11-23 09:51:56.260418996 +0000 UTC m=+0.133585212 container attach 8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_poitras, architecture=x86_64, version=7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-09-24T08:57:55, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, release=553) Nov 23 04:51:56 localhost podman[299985]: 2025-11-23 09:51:56.172556666 +0000 UTC m=+0.045722892 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:51:56 localhost systemd[1]: libpod-8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9.scope: Deactivated successfully. 
Nov 23 04:51:56 localhost podman[299985]: 2025-11-23 09:51:56.359742981 +0000 UTC m=+0.232909207 container died 8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_poitras, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, build-date=2025-09-24T08:57:55, release=553, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.expose-services=, GIT_CLEAN=True, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, name=rhceph, maintainer=Guillaume Abrioux ) Nov 23 04:51:56 localhost systemd[1]: tmp-crun.7n1Igb.mount: Deactivated successfully. Nov 23 04:51:56 localhost systemd[1]: var-lib-containers-storage-overlay-0b66821f6b0214633642870aee82d6d400b109c83764b6fa0093be26ceba0836-merged.mount: Deactivated successfully. Nov 23 04:51:56 localhost systemd[1]: var-lib-containers-storage-overlay-721aca86e3f52d984b53caa7b23302e07b2baf6ff16b1aef6298de407cbbb65e-merged.mount: Deactivated successfully. 
Nov 23 04:51:56 localhost podman[300036]: 2025-11-23 09:51:56.465405201 +0000 UTC m=+0.090573536 container remove 8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_poitras, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, ceph=True, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, release=553, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:51:56 localhost systemd[1]: libpod-conmon-8830e5bd1cf312033cc57da2ca1f5b04f94d21aa9b1c9261c3189ebbd582bdf9.scope: Deactivated successfully. Nov 23 04:51:56 localhost systemd[1]: Reloading. Nov 23 04:51:56 localhost systemd-rc-local-generator[300076]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:51:56 localhost systemd-sysv-generator[300080]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:51:56 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:56 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:56 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:56 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:51:56 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:56 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:56 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:56 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:56 localhost systemd[1]: Reloading. Nov 23 04:51:57 localhost systemd-rc-local-generator[300116]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 23 04:51:57 localhost systemd-sysv-generator[300124]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 23 04:51:57 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:57 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:57 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:57 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 23 04:51:57 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:57 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:57 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:57 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 23 04:51:57 localhost systemd[1]: Starting Ceph mon.np0005532585 for 46550e70-79cb-5f55-bf6d-1204b97e083b... 
Nov 23 04:51:57 localhost podman[300181]: Nov 23 04:51:57 localhost podman[300181]: 2025-11-23 09:51:57.671227641 +0000 UTC m=+0.078558655 container create 9e3a5ec92a4f096878390cbd9e2ae953970f8c699defd39b76a780c5b2350abf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, RELEASE=main, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , version=7, distribution-scope=public, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 23 04:51:57 localhost systemd[1]: tmp-crun.lXn0m3.mount: Deactivated successfully. 
Nov 23 04:51:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cccf77edc478feb73d30b00dc60701d2f9957b0673f6a3721dcda4ac313171de/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 23 04:51:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cccf77edc478feb73d30b00dc60701d2f9957b0673f6a3721dcda4ac313171de/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 23 04:51:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cccf77edc478feb73d30b00dc60701d2f9957b0673f6a3721dcda4ac313171de/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 23 04:51:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cccf77edc478feb73d30b00dc60701d2f9957b0673f6a3721dcda4ac313171de/merged/var/lib/ceph/mon/ceph-np0005532585 supports timestamps until 2038 (0x7fffffff) Nov 23 04:51:57 localhost podman[300181]: 2025-11-23 09:51:57.637949784 +0000 UTC m=+0.045280788 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:51:57 localhost podman[300181]: 2025-11-23 09:51:57.737869607 +0000 UTC m=+0.145200611 container init 9e3a5ec92a4f096878390cbd9e2ae953970f8c699defd39b76a780c5b2350abf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, architecture=x86_64, version=7, release=553, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red 
Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vcs-type=git, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.) Nov 23 04:51:57 localhost systemd[1]: tmp-crun.UQmXQ3.mount: Deactivated successfully. Nov 23 04:51:57 localhost podman[300181]: 2025-11-23 09:51:57.753285162 +0000 UTC m=+0.160616176 container start 9e3a5ec92a4f096878390cbd9e2ae953970f8c699defd39b76a780c5b2350abf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mon-np0005532585, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, release=553, io.buildah.version=1.33.12) Nov 23 04:51:57 localhost bash[300181]: 9e3a5ec92a4f096878390cbd9e2ae953970f8c699defd39b76a780c5b2350abf Nov 23 04:51:57 localhost systemd[1]: Started Ceph 
mon.np0005532585 for 46550e70-79cb-5f55-bf6d-1204b97e083b. Nov 23 04:51:57 localhost ceph-mon[300199]: set uid:gid to 167:167 (ceph:ceph) Nov 23 04:51:57 localhost ceph-mon[300199]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2 Nov 23 04:51:57 localhost ceph-mon[300199]: pidfile_write: ignore empty --pid-file Nov 23 04:51:57 localhost ceph-mon[300199]: load: jerasure load: lrc Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: RocksDB version: 7.9.2 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Git sha 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Compile date 2025-09-23 00:00:00 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: DB SUMMARY Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: DB Session ID: R30MDH64VRAWCJ1C6PRG Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: CURRENT file: CURRENT Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: IDENTITY file: IDENTITY Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005532585/store.db dir, Total Num: 0, files: Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005532585/store.db: 000004.log size: 761 ; Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.error_if_exists: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.create_if_missing: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.paranoid_checks: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.flush_verify_memtable_count: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.env: 0x56515fa7d9e0 Nov 23 04:51:57 localhost 
ceph-mon[300199]: rocksdb: Options.fs: PosixFileSystem Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.info_log: 0x5651615bcd20 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_file_opening_threads: 16 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.statistics: (nil) Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.use_fsync: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_log_file_size: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_manifest_file_size: 1073741824 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.log_file_time_to_roll: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.keep_log_file_num: 1000 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.recycle_log_file_num: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.allow_fallocate: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.allow_mmap_reads: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.allow_mmap_writes: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.use_direct_reads: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.create_missing_column_families: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.db_log_dir: Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.wal_dir: Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.table_cache_numshardbits: 6 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.WAL_ttl_seconds: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.WAL_size_limit_MB: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.manifest_preallocation_size: 4194304 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: 
Options.is_fd_close_on_exec: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.advise_random_on_open: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.db_write_buffer_size: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.write_buffer_manager: 0x5651615cd540 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.access_hint_on_compaction_start: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.random_access_max_buffer_size: 1048576 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.use_adaptive_mutex: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.rate_limiter: (nil) Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.wal_recovery_mode: 2 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.enable_thread_tracking: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.enable_pipelined_write: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.unordered_write: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.allow_concurrent_memtable_write: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.write_thread_max_yield_usec: 100 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.write_thread_slow_yield_usec: 3 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.row_cache: None Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.wal_filter: None Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.avoid_flush_during_recovery: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.allow_ingest_behind: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.two_write_queues: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.manual_wal_flush: 0 Nov 23 04:51:57 localhost 
ceph-mon[300199]: rocksdb: Options.wal_compression: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.atomic_flush: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.persist_stats_to_disk: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.write_dbid_to_manifest: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.log_readahead_size: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.file_checksum_gen_factory: Unknown Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.best_efforts_recovery: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.allow_data_in_errors: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.db_host_id: __hostname__ Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.enforce_single_del_contracts: true Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_background_jobs: 2 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_background_compactions: -1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_subcompactions: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.avoid_flush_during_shutdown: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.writable_file_max_buffer_size: 1048576 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.delayed_write_rate : 16777216 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_total_wal_size: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.stats_dump_period_sec: 600 Nov 23 04:51:57 localhost ceph-mon[300199]: 
rocksdb: Options.stats_persist_period_sec: 600 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.stats_history_buffer_size: 1048576 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_open_files: -1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.bytes_per_sync: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.wal_bytes_per_sync: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.strict_bytes_per_sync: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compaction_readahead_size: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_background_flushes: -1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Compression algorithms supported: Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: #011kZSTD supported: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: #011kXpressCompression supported: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: #011kBZip2Compression supported: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: #011kLZ4Compression supported: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: #011kZlibCompression supported: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: #011kLZ4HCCompression supported: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: #011kSnappyCompression supported: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Fast CRC32 supported: Supported on x86 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: DMutex implementation: pthread_mutex_t Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005532585/store.db/MANIFEST-000005 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.comparator: 
leveldb.BytewiseComparator Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.merge_operator: Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compaction_filter: None Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compaction_filter_factory: None Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.sst_partitioner_factory: None Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.memtable_factory: SkipListFactory Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.table_factory: BlockBasedTable Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5651615bc980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5651615b9350#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.write_buffer_size: 33554432 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: 
Options.max_write_buffer_number: 2 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compression: NoCompression Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.bottommost_compression: Disabled Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.prefix_extractor: nullptr Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.num_levels: 7 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.min_write_buffer_number_to_merge: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compression_opts.window_bits: -14 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compression_opts.level: 32767 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: 
Options.compression_opts.strategy: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compression_opts.enabled: false Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.level0_file_num_compaction_trigger: 4 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.target_file_size_base: 67108864 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.target_file_size_multiplier: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_base: 268435456 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.arena_block_size: 1048576 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.disable_auto_compactions: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 23 04:51:57 
localhost ceph-mon[300199]: rocksdb: Options.table_properties_collectors: Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.inplace_update_support: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.memtable_huge_page_size: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.bloom_locality: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.max_successive_merges: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.paranoid_file_checks: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.force_consistency_checks: 1 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.report_bg_io_stats: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.ttl: 2592000 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.enable_blob_files: false Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.min_blob_size: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.blob_file_size: 268435456 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.blob_compression_type: NoCompression Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.enable_blob_garbage_collection: false Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 23 04:51:57 localhost 
ceph-mon[300199]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.blob_file_starting_level: 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005532585/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 4d2c9233-e977-47c6-b4f9-0c301abf625f Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891517810836, "job": 1, "event": "recovery_started", "wal_files": [4]} Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891517813616, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 773, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 651, "raw_average_value_size": 130, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 
0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891517813735, "job": 1, "event": "recovery_finished"} Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5651615e0e00 Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: DB pointer 0x5651616d6000 Nov 23 04:51:57 localhost ceph-mon[300199]: mon.np0005532585 does not exist in monmap, will attempt to join an existing cluster Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 04:51:57 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 
MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.84 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Sum 1/0 1.84 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 
0.00 GB write, 0.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5651615b9350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 6.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Nov 23 04:51:57 localhost ceph-mon[300199]: using public_addr v2:172.18.0.104:0/0 -> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] Nov 23 04:51:57 localhost ceph-mon[300199]: starting mon.np0005532585 rank -1 at public addrs [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] at bind addrs [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005532585 fsid 46550e70-79cb-5f55-bf6d-1204b97e083b Nov 23 04:51:57 localhost ceph-mon[300199]: mon.np0005532585@-1(???) 
e0 preinit fsid 46550e70-79cb-5f55-bf6d-1204b97e083b Nov 23 04:51:57 localhost ceph-mon[300199]: mon.np0005532585@-1(synchronizing) e11 sync_obtain_latest_monmap Nov 23 04:51:57 localhost ceph-mon[300199]: mon.np0005532585@-1(synchronizing) e11 sync_obtain_latest_monmap obtained monmap e11 Nov 23 04:51:57 localhost podman[300242]: Nov 23 04:51:57 localhost podman[300242]: 2025-11-23 09:51:57.952816429 +0000 UTC m=+0.091043311 container create 67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_thompson, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, RELEASE=main, architecture=x86_64, build-date=2025-09-24T08:57:55) Nov 23 04:51:58 localhost systemd[1]: Started libpod-conmon-67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd.scope. Nov 23 04:51:58 localhost podman[300242]: 2025-11-23 09:51:57.914255799 +0000 UTC m=+0.052482711 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:51:58 localhost systemd[1]: Started libcrun container. 
Nov 23 04:51:58 localhost podman[300242]: 2025-11-23 09:51:58.040690989 +0000 UTC m=+0.178917861 container init 67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_thompson, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, build-date=2025-09-24T08:57:55, vcs-type=git, description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , release=553, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main) Nov 23 04:51:58 localhost podman[300242]: 2025-11-23 09:51:58.053296658 +0000 UTC m=+0.191523570 container start 67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_thompson, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, ceph=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, RELEASE=main) Nov 23 04:51:58 localhost podman[300242]: 2025-11-23 09:51:58.053760683 +0000 UTC m=+0.191987605 container attach 67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_thompson, distribution-scope=public, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., ceph=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:51:58 localhost eager_thompson[300256]: 167 167 Nov 23 04:51:58 localhost systemd[1]: 
libpod-67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd.scope: Deactivated successfully. Nov 23 04:51:58 localhost podman[300242]: 2025-11-23 09:51:58.061120839 +0000 UTC m=+0.199347791 container died 67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_thompson, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 04:51:58 localhost podman[300261]: 2025-11-23 09:51:58.135782053 +0000 UTC m=+0.065805412 container remove 67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_thompson, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, 
GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , name=rhceph, GIT_CLEAN=True, io.buildah.version=1.33.12) Nov 23 04:51:58 localhost systemd[1]: libpod-conmon-67720cd9c091cabf94a0976ebcc8fff4a96a19ff3862022eeb07c2b6b5a491dd.scope: Deactivated successfully. Nov 23 04:51:58 localhost ceph-mon[300199]: mon.np0005532585@-1(synchronizing).mds e16 new map Nov 23 04:51:58 localhost ceph-mon[300199]: mon.np0005532585@-1(synchronizing).mds e16 print_map#012e16#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-23T08:00:26.486221+0000#012modified#0112025-11-23T09:47:19.846415+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01179#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned 
encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=26392}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[6]#012metadata_pool#0117#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 26392 members: 26392#012[mds.mds.np0005532586.mfohsb{0:26392} state up:active seq 12 addr [v2:172.18.0.108:6808/2718449296,v1:172.18.0.108:6809/2718449296] compat {c=[1],r=[1],i=[17ff]}]#012 #012 #012Standby daemons:#012 #012[mds.mds.np0005532585.jcltnl{-1:17133} state up:standby seq 1 addr [v2:172.18.0.107:6808/563301557,v1:172.18.0.107:6809/563301557] compat {c=[1],r=[1],i=[17ff]}]#012[mds.mds.np0005532584.aoxjmw{-1:17139} state up:standby seq 1 addr [v2:172.18.0.106:6808/2261302276,v1:172.18.0.106:6809/2261302276] compat {c=[1],r=[1],i=[17ff]}] Nov 23 04:51:58 localhost ceph-mon[300199]: mon.np0005532585@-1(synchronizing).osd e84 crush map has features 3314933000852226048, adjusting msgr requires Nov 23 04:51:58 localhost ceph-mon[300199]: mon.np0005532585@-1(synchronizing).osd e84 crush map has features 288514051259236352, adjusting msgr requires Nov 23 04:51:58 localhost ceph-mon[300199]: mon.np0005532585@-1(synchronizing).osd e84 crush map has features 288514051259236352, adjusting msgr requires Nov 23 04:51:58 localhost ceph-mon[300199]: mon.np0005532585@-1(synchronizing).osd e84 crush map has features 288514051259236352, adjusting msgr requires Nov 23 04:51:58 localhost ceph-mon[300199]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)... 
Nov 23 04:51:58 localhost ceph-mon[300199]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain Nov 23 04:51:58 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:58 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:51:58 localhost ceph-mon[300199]: mon.np0005532585@-1(synchronizing).paxosservice(auth 1..38) refresh upgraded, format 0 -> 3 Nov 23 04:51:59 localhost openstack_network_exporter[242668]: ERROR 09:51:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:51:59 localhost openstack_network_exporter[242668]: ERROR 09:51:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:51:59 localhost openstack_network_exporter[242668]: ERROR 09:51:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:51:59 localhost openstack_network_exporter[242668]: ERROR 09:51:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:51:59 localhost openstack_network_exporter[242668]: Nov 23 04:51:59 localhost openstack_network_exporter[242668]: ERROR 09:51:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:51:59 localhost openstack_network_exporter[242668]: Nov 23 04:52:00 localhost nova_compute[281952]: 2025-11-23 09:52:00.286 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:52:00 localhost nova_compute[281952]: 2025-11-23 09:52:00.288 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:52:00 localhost nova_compute[281952]: 2025-11-23 09:52:00.288 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:52:00 localhost nova_compute[281952]: 2025-11-23 09:52:00.288 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:52:00 localhost nova_compute[281952]: 2025-11-23 09:52:00.288 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:52:00 localhost nova_compute[281952]: 2025-11-23 09:52:00.289 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:52:00 localhost nova_compute[281952]: 2025-11-23 09:52:00.292 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:52:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 04:52:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 4944 writes, 22K keys, 4944 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4944 writes, 665 syncs, 7.43 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 100 writes, 292 keys, 100 commit groups, 1.0 writes per commit group, ingest: 0.32 MB, 0.00 MB/s#012Interval WAL: 100 writes, 47 syncs, 2.13 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' 
entity='mgr.np0005532584.naxwxy' Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:52:02 localhost ceph-mon[300199]: Reconfiguring crash.np0005532586 (monmap changed)... Nov 23 04:52:02 localhost ceph-mon[300199]: Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 23 04:52:02 localhost ceph-mon[300199]: Reconfiguring osd.1 (monmap changed)... Nov 23 04:52:02 localhost ceph-mon[300199]: Reconfiguring daemon osd.1 on np0005532586.localdomain Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 23 04:52:02 localhost ceph-mon[300199]: Reconfiguring osd.4 (monmap changed)... 
Nov 23 04:52:02 localhost ceph-mon[300199]: Reconfiguring daemon osd.4 on np0005532586.localdomain Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:52:02 localhost ceph-mon[300199]: Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)... Nov 23 04:52:02 localhost ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:52:02 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:52:02 localhost ceph-mon[300199]: Reconfiguring mgr.np0005532586.thmvqb (monmap changed)... Nov 23 04:52:02 localhost ceph-mon[300199]: Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain Nov 23 04:52:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:52:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:52:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:52:03 localhost systemd[1]: tmp-crun.zvmW8A.mount: Deactivated successfully. Nov 23 04:52:03 localhost podman[300295]: 2025-11-23 09:52:03.01630571 +0000 UTC m=+0.077751750 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:52:03 localhost podman[300302]: 2025-11-23 09:52:03.0341549 +0000 UTC m=+0.085382355 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 23 04:52:03 localhost podman[300302]: 2025-11-23 09:52:03.069713278 +0000 UTC m=+0.120940752 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container) Nov 23 04:52:03 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:52:03 localhost podman[300295]: 2025-11-23 09:52:03.103008755 +0000 UTC m=+0.164454805 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller) Nov 23 04:52:03 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:52:03 localhost podman[300296]: 2025-11-23 09:52:03.071252856 +0000 UTC m=+0.127817025 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 04:52:03 localhost podman[300296]: 2025-11-23 09:52:03.154237175 +0000 UTC 
m=+0.210801314 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118) Nov 23 04:52:03 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:52:03 localhost podman[300444]: 2025-11-23 09:52:03.859178113 +0000 UTC m=+0.100845783 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, maintainer=Guillaume Abrioux , version=7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_BRANCH=main, RELEASE=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, CEPH_POINT_RELEASE=, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 23 04:52:03 localhost podman[300444]: 2025-11-23 09:52:03.975492921 +0000 UTC m=+0.217160580 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, architecture=x86_64, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported 
base image., vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph) Nov 23 04:52:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 04:52:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 5910 writes, 25K keys, 5910 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5910 writes, 864 syncs, 6.84 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 174 writes, 477 keys, 174 commit groups, 1.0 writes per commit group, ingest: 0.65 MB, 0.00 MB/s#012Interval WAL: 174 writes, 76 syncs, 2.29 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 04:52:05 localhost nova_compute[281952]: 2025-11-23 09:52:05.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:52:05 localhost nova_compute[281952]: 2025-11-23 09:52:05.291 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:52:05 localhost 
nova_compute[281952]: 2025-11-23 09:52:05.294 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:52:06 localhost nova_compute[281952]: 2025-11-23 09:52:06.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:52:06 localhost nova_compute[281952]: 2025-11-23 09:52:06.213 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:52:06 localhost nova_compute[281952]: 2025-11-23 09:52:06.213 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:52:06 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:52:06 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:52:06 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:52:06 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' Nov 23 04:52:06 localhost nova_compute[281952]: 2025-11-23 09:52:06.443 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:52:06 localhost nova_compute[281952]: 2025-11-23 09:52:06.444 281956 DEBUG 
oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:52:06 localhost nova_compute[281952]: 2025-11-23 09:52:06.445 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:52:06 localhost nova_compute[281952]: 2025-11-23 09:52:06.445 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:52:06 localhost nova_compute[281952]: 2025-11-23 09:52:06.780 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": 
"system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:52:06 localhost nova_compute[281952]: 2025-11-23 09:52:06.797 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:52:06 localhost nova_compute[281952]: 2025-11-23 09:52:06.798 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:52:06 localhost nova_compute[281952]: 2025-11-23 09:52:06.798 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:52:06 localhost nova_compute[281952]: 2025-11-23 09:52:06.799 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:52:06 localhost nova_compute[281952]: 2025-11-23 09:52:06.799 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.267 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.268 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.268 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.268 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.269 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:52:07 localhost ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.729 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.795 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.795 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path 
_get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.966 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.967 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11705MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": 
"7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.967 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:52:07 localhost nova_compute[281952]: 2025-11-23 09:52:07.967 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:52:08 localhost nova_compute[281952]: 2025-11-23 09:52:08.221 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:52:08 localhost nova_compute[281952]: 2025-11-23 09:52:08.222 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:52:08 localhost nova_compute[281952]: 2025-11-23 09:52:08.222 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:52:08 localhost nova_compute[281952]: 2025-11-23 09:52:08.265 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:52:08 localhost ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id Nov 23 04:52:08 localhost nova_compute[281952]: 2025-11-23 09:52:08.687 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:52:08 localhost nova_compute[281952]: 2025-11-23 09:52:08.693 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory 
/usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:52:08 localhost nova_compute[281952]: 2025-11-23 09:52:08.712 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:52:08 localhost nova_compute[281952]: 2025-11-23 09:52:08.714 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:52:08 localhost nova_compute[281952]: 2025-11-23 09:52:08.715 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.748s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:52:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:52:09.289 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:52:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:52:09.290 160439 DEBUG oslo_concurrency.lockutils [-] Lock 
"_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:52:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:52:09.290 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:52:09 localhost nova_compute[281952]: 2025-11-23 09:52:09.715 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:52:09 localhost nova_compute[281952]: 2025-11-23 09:52:09.715 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:52:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:52:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:52:10 localhost systemd[1]: tmp-crun.L8BmlU.mount: Deactivated successfully. 
Nov 23 04:52:10 localhost podman[300694]: 2025-11-23 09:52:10.0294073 +0000 UTC m=+0.084701704 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:52:10 localhost podman[300694]: 2025-11-23 09:52:10.044155956 +0000 UTC m=+0.099450400 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 04:52:10 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:52:10 localhost podman[300693]: 2025-11-23 09:52:10.127802006 +0000 UTC m=+0.183858583 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 04:52:10 localhost podman[300693]: 2025-11-23 09:52:10.168352697 +0000 UTC m=+0.224409244 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 23 04:52:10 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 04:52:10 localhost nova_compute[281952]: 2025-11-23 09:52:10.294 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:52:10 localhost nova_compute[281952]: 2025-11-23 09:52:10.297 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:52:11 localhost podman[240668]: time="2025-11-23T09:52:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 04:52:11 localhost podman[240668]: @ - - [23/Nov/2025:09:52:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 04:52:11 localhost podman[240668]: @ - - [23/Nov/2025:09:52:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18701 "" "Go-http-client/1.1"
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: Reconfig service osd.default_drive_group
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:12 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 04:52:12 localhost ceph-mon[300199]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr handle_mgr_map Activating!
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr handle_mgr_map I am now activating
Nov 23 04:52:12 localhost systemd[1]: session-66.scope: Deactivated successfully.
Nov 23 04:52:12 localhost systemd[1]: session-66.scope: Consumed 23.454s CPU time.
Nov 23 04:52:12 localhost ceph-mgr[288287]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr load Constructed class from module: balancer
Nov 23 04:52:12 localhost ceph-mgr[288287]: [balancer INFO root] Starting
Nov 23 04:52:12 localhost ceph-mgr[288287]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 04:52:12 localhost systemd-logind[761]: Session 66 logged out. Waiting for processes to exit.
Nov 23 04:52:12 localhost systemd-logind[761]: Removed session 66.
Nov 23 04:52:12 localhost ceph-mgr[288287]: [balancer INFO root] Optimize plan auto_2025-11-23_09:52:12
Nov 23 04:52:12 localhost ceph-mgr[288287]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 23 04:52:12 localhost ceph-mgr[288287]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Nov 23 04:52:12 localhost ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id
Nov 23 04:52:12 localhost ceph-mgr[288287]: [cephadm WARNING root] removing stray HostCache host record np0005532582.localdomain.devices.0
Nov 23 04:52:12 localhost ceph-mgr[288287]: log_channel(cephadm) log [WRN] : removing stray HostCache host record np0005532582.localdomain.devices.0
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr load Constructed class from module: cephadm
Nov 23 04:52:12 localhost ceph-mgr[288287]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr load Constructed class from module: crash
Nov 23 04:52:12 localhost ceph-mgr[288287]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr load Constructed class from module: devicehealth
Nov 23 04:52:12 localhost ceph-mgr[288287]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr load Constructed class from module: iostat
Nov 23 04:52:12 localhost ceph-mgr[288287]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr load Constructed class from module: nfs
Nov 23 04:52:12 localhost ceph-mgr[288287]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr load Constructed class from module: orchestrator
Nov 23 04:52:12 localhost ceph-mgr[288287]: [devicehealth INFO root] Starting
Nov 23 04:52:12 localhost ceph-mgr[288287]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr load Constructed class from module: pg_autoscaler
Nov 23 04:52:12 localhost ceph-mgr[288287]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr load Constructed class from module: progress
Nov 23 04:52:12 localhost ceph-mgr[288287]: [progress INFO root] Loading...
Nov 23 04:52:12 localhost ceph-mgr[288287]: [progress INFO root] Loaded [, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , ] historic events
Nov 23 04:52:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] _maybe_adjust
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 04:52:12 localhost ceph-mgr[288287]: [progress INFO root] Loaded OSDMap, ready.
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] recovery thread starting
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] starting setup
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr load Constructed class from module: rbd_support
Nov 23 04:52:12 localhost ceph-mgr[288287]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr load Constructed class from module: restful
Nov 23 04:52:12 localhost ceph-mgr[288287]: [restful INFO root] server_addr: :: server_port: 8003
Nov 23 04:52:12 localhost ceph-mgr[288287]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr load Constructed class from module: status
Nov 23 04:52:12 localhost ceph-mgr[288287]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr load Constructed class from module: telemetry
Nov 23 04:52:12 localhost ceph-mgr[288287]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 04:52:12 localhost ceph-mgr[288287]: [restful WARNING root] server not running: no certificate configured
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 23 04:52:12 localhost ceph-mgr[288287]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 23 04:52:12 localhost ceph-mgr[288287]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 23 04:52:12 localhost ceph-mgr[288287]: mgr load Constructed class from module: volumes
Nov 23 04:52:12 localhost ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.916+0000 7f5e9a17b640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.917+0000 7f5e95972640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.917+0000 7f5e95972640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.918+0000 7f5e9a17b640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.918+0000 7f5e9a17b640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.918+0000 7f5e9a17b640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.917+0000 7f5e95972640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.918+0000 7f5e9a17b640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.918+0000 7f5e95972640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-mgr[288287]: client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:12.918+0000 7f5e95972640 -1 client.0 error registering admin socket command: (17) File exists
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] PerfHandler: starting
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_task_task: vms, start_after=
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_task_task: volumes, start_after=
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_task_task: images, start_after=
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_task_task: backups, start_after=
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] TaskHandler: starting
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Nov 23 04:52:12 localhost ceph-mgr[288287]: [rbd_support INFO root] setup complete
Nov 23 04:52:13 localhost sshd[300894]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 04:52:13 localhost systemd-logind[761]: New session 69 of user ceph-admin.
Nov 23 04:52:13 localhost systemd[1]: Started Session 69 of User ceph-admin.
Nov 23 04:52:13 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:52:14 localhost ceph-mgr[288287]: [cephadm INFO cherrypy.error] [23/Nov/2025:09:52:14] ENGINE Bus STARTING
Nov 23 04:52:14 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : [23/Nov/2025:09:52:14] ENGINE Bus STARTING
Nov 23 04:52:14 localhost podman[301015]: 2025-11-23 09:52:14.112464576 +0000 UTC m=+0.079393820 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=553, ceph=True, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7)
Nov 23 04:52:14 localhost ceph-mgr[288287]: [cephadm INFO cherrypy.error] [23/Nov/2025:09:52:14] ENGINE Serving on http://172.18.0.107:8765
Nov 23 04:52:14 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : [23/Nov/2025:09:52:14] ENGINE Serving on http://172.18.0.107:8765
Nov 23 04:52:14 localhost podman[301015]: 2025-11-23 09:52:14.217063602 +0000 UTC m=+0.183992856 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, release=553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 04:52:14 localhost ceph-mgr[288287]: [cephadm INFO cherrypy.error] [23/Nov/2025:09:52:14] ENGINE Serving on https://172.18.0.107:7150
Nov 23 04:52:14 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : [23/Nov/2025:09:52:14] ENGINE Serving on https://172.18.0.107:7150
Nov 23 04:52:14 localhost ceph-mgr[288287]: [cephadm INFO cherrypy.error] [23/Nov/2025:09:52:14] ENGINE Client ('172.18.0.107', 35764) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 04:52:14 localhost ceph-mgr[288287]: [cephadm INFO cherrypy.error] [23/Nov/2025:09:52:14] ENGINE Bus STARTED
Nov 23 04:52:14 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : [23/Nov/2025:09:52:14] ENGINE Client ('172.18.0.107', 35764) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 04:52:14 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : [23/Nov/2025:09:52:14] ENGINE Bus STARTED
Nov 23 04:52:14 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:52:15 localhost ceph-mgr[288287]: [devicehealth INFO root] Check health
Nov 23 04:52:15 localhost nova_compute[281952]: 2025-11-23 09:52:15.297 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:52:15 localhost nova_compute[281952]: 2025-11-23 09:52:15.299 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:52:16 localhost ceph-mon[300199]: mon.np0005532585@-1(synchronizing).osd e84 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Nov 23 04:52:16 localhost ceph-mon[300199]: mon.np0005532585@-1(synchronizing).osd e84 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Nov 23 04:52:16 localhost ceph-mon[300199]: mon.np0005532585@-1(synchronizing).osd e85 e85: 6 total, 6 up, 6 in
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy'
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17301 172.18.0.106:0/4082313214' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 04:52:16 localhost ceph-mon[300199]: Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 04:52:16 localhost ceph-mon[300199]: from='client.? 172.18.0.200:0/885747258' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 04:52:16 localhost ceph-mon[300199]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 23 04:52:16 localhost ceph-mon[300199]: Activating manager daemon np0005532585.gzafiw
Nov 23 04:52:16 localhost ceph-mon[300199]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 23 04:52:16 localhost ceph-mon[300199]: Manager daemon np0005532585.gzafiw is now available
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532582.localdomain.devices.0"} : dispatch
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532582.localdomain.devices.0"}]': finished
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532582.localdomain.devices.0"} : dispatch
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532582.localdomain.devices.0"}]': finished
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532585.gzafiw/mirror_snapshot_schedule"} : dispatch
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532585.gzafiw/trash_purge_schedule"} : dispatch
Nov 23 04:52:16 localhost ceph-mon[300199]: removing stray HostCache host record np0005532582.localdomain.devices.0
Nov 23 04:52:16 localhost ceph-mon[300199]: [23/Nov/2025:09:52:14] ENGINE Bus STARTING
Nov 23 04:52:16 localhost ceph-mon[300199]: [23/Nov/2025:09:52:14] ENGINE Serving on http://172.18.0.107:8765
Nov 23 04:52:16 localhost ceph-mon[300199]: [23/Nov/2025:09:52:14] ENGINE Serving on https://172.18.0.107:7150
Nov 23 04:52:16 localhost ceph-mon[300199]: [23/Nov/2025:09:52:14] ENGINE Client ('172.18.0.107', 35764) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 04:52:16 localhost ceph-mon[300199]: [23/Nov/2025:09:52:14] ENGINE Bus STARTED
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config rm", "who": "osd/host:np0005532583", "name": "osd_memory_target"} : dispatch
Nov 23 04:52:16 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 04:52:16 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 04:52:16 localhost ceph-mgr[288287]: [cephadm INFO root] Adjusting osd_memory_target on np0005532585.localdomain to 836.6M
Nov 23 04:52:16 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005532585.localdomain to 836.6M
Nov 23 04:52:16 localhost ceph-mgr[288287]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 04:52:16 localhost ceph-mgr[288287]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 04:52:16 localhost ceph-mgr[288287]: [cephadm INFO root] Adjusting osd_memory_target on np0005532586.localdomain to 836.6M
Nov 23 04:52:16 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005532586.localdomain to 836.6M
Nov 23 04:52:16 localhost ceph-mgr[288287]: [cephadm INFO root] Adjusting osd_memory_target on np0005532584.localdomain to 836.6M
Nov 23 04:52:16 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005532584.localdomain to 836.6M
Nov 23 04:52:16 localhost ceph-mgr[288287]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Nov 23 04:52:16 localhost ceph-mgr[288287]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Nov 23 04:52:16 localhost ceph-mgr[288287]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 04:52:16 localhost ceph-mgr[288287]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 04:52:16 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:16 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:16 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:16 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:16 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:16 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:16 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:16 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:16 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:52:17 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:17 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:17 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:17 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:17 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:17 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:17 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:17 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:17 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 04:52:17 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 04:52:17 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mgr.np0005532584.naxwxy 172.18.0.106:0/4210916137; not ready for session (expect reconnect)
Nov 23 04:52:18 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 04:52:18 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 04:52:18 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 04:52:18 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 04:52:18 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 04:52:18 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 04:52:18 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 04:52:18 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 04:52:18 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect)
Nov 23 04:52:18 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory
Nov 23 04:52:18 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 04:52:18 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 04:52:18 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 04:52:18 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 04:52:18 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:52:18 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 04:52:18 localhost ceph-mgr[288287]: log_channel(cephadm)
log [INF] : Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:52:18 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:52:18 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:52:19 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:19 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:19 localhost ceph-mgr[288287]: [progress INFO root] update: starting ev 521307b1-a7a2-414b-b092-fbb546dc0500 (Updating node-proxy deployment (+4 -> 4)) Nov 23 04:52:19 localhost ceph-mgr[288287]: [progress INFO root] complete: finished ev 521307b1-a7a2-414b-b092-fbb546dc0500 (Updating node-proxy deployment (+4 -> 4)) Nov 23 04:52:19 localhost ceph-mgr[288287]: [progress INFO root] Completed event 521307b1-a7a2-414b-b092-fbb546dc0500 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Nov 23 04:52:19 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005532584.localdomain Nov 23 04:52:19 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005532584.localdomain Nov 23 04:52:20 localhost nova_compute[281952]: 2025-11-23 09:52:20.300 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:52:20 localhost nova_compute[281952]: 2025-11-23 09:52:20.301 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:52:20 localhost nova_compute[281952]: 2025-11-23 09:52:20.301 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:52:20 localhost nova_compute[281952]: 2025-11-23 09:52:20.301 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:52:20 localhost nova_compute[281952]: 2025-11-23 09:52:20.302 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:52:20 localhost nova_compute[281952]: 2025-11-23 09:52:20.304 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:52:20 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:20 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 localhost 
ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:52:20 localhost ceph-mon[300199]: Adjusting osd_memory_target on np0005532585.localdomain to 836.6M Nov 23 04:52:20 localhost ceph-mon[300199]: Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 23 04:52:20 localhost ceph-mon[300199]: Adjusting osd_memory_target on np0005532586.localdomain to 836.6M Nov 23 04:52:20 localhost ceph-mon[300199]: Adjusting osd_memory_target on np0005532584.localdomain to 836.6M Nov 23 04:52:20 localhost ceph-mon[300199]: Unable to set 
osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Nov 23 04:52:20 localhost ceph-mon[300199]: Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 
localhost ceph-mon[300199]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:52:20 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:20 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 23 04:52:20 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s Nov 23 04:52:20 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:52:20 localhost 
ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:52:21 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:21 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:21 localhost podman[301989]: Nov 23 04:52:21 localhost podman[301989]: 2025-11-23 09:52:21.485597713 +0000 UTC m=+0.065055498 container create ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_engelbart, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , RELEASE=main, release=553) Nov 23 04:52:21 localhost systemd[1]: Started libpod-conmon-ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697.scope. Nov 23 04:52:21 localhost systemd[1]: Started libcrun container. 
Nov 23 04:52:21 localhost podman[301989]: 2025-11-23 09:52:21.540357112 +0000 UTC m=+0.119814887 container init ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_engelbart, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux , ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2025-09-24T08:57:55, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_CLEAN=True) Nov 23 04:52:21 localhost podman[301989]: 2025-11-23 09:52:21.548048639 +0000 UTC m=+0.127506384 container start ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_engelbart, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, distribution-scope=public, RELEASE=main, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, maintainer=Guillaume Abrioux , version=7, release=553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container) Nov 23 04:52:21 localhost podman[301989]: 2025-11-23 09:52:21.548171283 +0000 UTC m=+0.127629108 container attach ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_engelbart, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, name=rhceph, vcs-type=git, maintainer=Guillaume Abrioux , release=553, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64) Nov 23 04:52:21 localhost pedantic_engelbart[302004]: 167 167 Nov 23 04:52:21 localhost podman[301989]: 2025-11-23 09:52:21.564797806 +0000 UTC 
m=+0.144255581 container died ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_engelbart, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, release=553, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.component=rhceph-container, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 23 04:52:21 localhost systemd[1]: libpod-ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697.scope: Deactivated successfully. 
Nov 23 04:52:21 localhost podman[301989]: 2025-11-23 09:52:21.466409981 +0000 UTC m=+0.045867766 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:52:21 localhost podman[302009]: 2025-11-23 09:52:21.627273143 +0000 UTC m=+0.054162102 container remove ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_engelbart, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , release=553, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.component=rhceph-container, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7) Nov 23 04:52:21 localhost systemd[1]: libpod-conmon-ad8c8600567e1fcd0afa446cc638b465a0d8d6409b9ed2770920f4a2fe46e697.scope: Deactivated successfully. 
Nov 23 04:52:21 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005532585.localdomain Nov 23 04:52:21 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005532585.localdomain Nov 23 04:52:22 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:22 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:22 localhost podman[302086]: Nov 23 04:52:22 localhost podman[302086]: 2025-11-23 09:52:22.457657852 +0000 UTC m=+0.077853554 container create 99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_mendeleev, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, release=553, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True) Nov 23 04:52:22 localhost ceph-mon[300199]: Reconfiguring daemon osd.5 on np0005532584.localdomain Nov 23 04:52:22 
localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:22 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:22 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:22 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:22 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 23 04:52:22 localhost ceph-mon[300199]: Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:52:22 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:22 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:22 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:22 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:22 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 23 04:52:22 localhost systemd[1]: Started libpod-conmon-99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d.scope. Nov 23 04:52:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:52:22 localhost systemd[1]: var-lib-containers-storage-overlay-9752fa5865112c512d3f7a5f95ecd2cb8913bae0943175cbf5dcbefa602d9cf4-merged.mount: Deactivated successfully. 
Nov 23 04:52:22 localhost podman[302086]: 2025-11-23 09:52:22.42615442 +0000 UTC m=+0.046350172 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:52:22 localhost systemd[1]: Started libcrun container. Nov 23 04:52:22 localhost podman[302086]: 2025-11-23 09:52:22.556843821 +0000 UTC m=+0.177039533 container init 99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_mendeleev, release=553, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , vcs-type=git, io.buildah.version=1.33.12, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7) Nov 23 04:52:22 localhost podman[302086]: 2025-11-23 09:52:22.569170612 +0000 UTC m=+0.189366324 container start 99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_mendeleev, io.openshift.tags=rhceph ceph, vcs-type=git, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red 
Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, distribution-scope=public, vendor=Red Hat, Inc., release=553, version=7) Nov 23 04:52:22 localhost podman[302086]: 2025-11-23 09:52:22.569562444 +0000 UTC m=+0.189758206 container attach 99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_mendeleev, io.openshift.tags=rhceph ceph, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
vendor=Red Hat, Inc., RELEASE=main, version=7) Nov 23 04:52:22 localhost funny_mendeleev[302101]: 167 167 Nov 23 04:52:22 localhost systemd[1]: libpod-99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d.scope: Deactivated successfully. Nov 23 04:52:22 localhost podman[302102]: 2025-11-23 09:52:22.609791025 +0000 UTC m=+0.092845236 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 04:52:22 localhost podman[302086]: 2025-11-23 09:52:22.622686393 +0000 UTC m=+0.242882145 container died 99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_mendeleev, architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, ceph=True, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_BRANCH=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, description=Red Hat Ceph Storage 7) Nov 23 04:52:22 localhost podman[302102]: 2025-11-23 09:52:22.677182924 +0000 UTC m=+0.160237095 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 
'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 04:52:22 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 04:52:22 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s Nov 23 04:52:22 localhost podman[302117]: 2025-11-23 09:52:22.730067345 +0000 UTC m=+0.142774815 container remove 99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_mendeleev, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, RELEASE=main, version=7, name=rhceph, GIT_BRANCH=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=) Nov 23 04:52:22 localhost systemd[1]: libpod-conmon-99a5cbd368161f45daf953ce38e31fe244f9e4af00836de9fd2184f71d87e89d.scope: Deactivated successfully. 
Nov 23 04:52:22 localhost ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events Nov 23 04:52:22 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005532586.localdomain Nov 23 04:52:22 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005532586.localdomain Nov 23 04:52:23 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:23 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:23 localhost systemd[1]: var-lib-containers-storage-overlay-ad075ef5d574535aa6ab2256e568bae2affa1278efe97261b9789d7ad7244932-merged.mount: Deactivated successfully. Nov 23 04:52:24 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005532586.localdomain Nov 23 04:52:24 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005532586.localdomain Nov 23 04:52:24 localhost ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id Nov 23 04:52:24 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.44396 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Nov 23 04:52:24 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:24 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:24 localhost ceph-mon[300199]: Reconfiguring daemon osd.3 on np0005532585.localdomain Nov 23 04:52:24 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:24 localhost 
ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:24 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:24 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:24 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:24 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 23 04:52:24 localhost ceph-mon[300199]: Reconfiguring daemon osd.1 on np0005532586.localdomain Nov 23 04:52:24 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s Nov 23 04:52:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 04:52:25 localhost podman[302148]: 2025-11-23 09:52:25.0270715 +0000 UTC m=+0.080652010 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:52:25 localhost podman[302148]: 2025-11-23 09:52:25.038558964 +0000 UTC m=+0.092139454 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:52:25 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:52:25 localhost ceph-mgr[288287]: [progress INFO root] update: starting ev 2ad7aaa5-10dd-4472-80a4-9813541370c6 (Updating node-proxy deployment (+4 -> 4)) Nov 23 04:52:25 localhost ceph-mgr[288287]: [progress INFO root] complete: finished ev 2ad7aaa5-10dd-4472-80a4-9813541370c6 (Updating node-proxy deployment (+4 -> 4)) Nov 23 04:52:25 localhost ceph-mgr[288287]: [progress INFO root] Completed event 2ad7aaa5-10dd-4472-80a4-9813541370c6 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Nov 23 04:52:25 localhost nova_compute[281952]: 2025-11-23 09:52:25.306 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:52:25 localhost nova_compute[281952]: 2025-11-23 09:52:25.308 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:52:25 localhost nova_compute[281952]: 2025-11-23 09:52:25.308 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:52:25 localhost nova_compute[281952]: 2025-11-23 09:52:25.308 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:52:25 localhost nova_compute[281952]: 2025-11-23 09:52:25.337 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:52:25 localhost nova_compute[281952]: 2025-11-23 09:52:25.338 281956 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:52:25 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:25 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:26 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:26 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:26 localhost ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id Nov 23 04:52:26 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s Nov 23 04:52:26 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.44402 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch Nov 23 04:52:26 localhost ceph-mgr[288287]: [cephadm INFO root] Saving service mon spec with placement label:mon Nov 23 04:52:26 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon Nov 23 04:52:27 localhost ceph-mgr[288287]: [progress INFO root] update: starting ev 9beaf830-fe6f-4101-965a-6cb54d294b00 (Updating node-proxy deployment (+4 -> 4)) Nov 23 04:52:27 localhost ceph-mgr[288287]: [progress INFO root] complete: finished ev 9beaf830-fe6f-4101-965a-6cb54d294b00 (Updating node-proxy deployment (+4 -> 4)) Nov 23 04:52:27 localhost ceph-mgr[288287]: [progress INFO root] Completed event 9beaf830-fe6f-4101-965a-6cb54d294b00 (Updating node-proxy deployment (+4 
-> 4)) in 0 seconds Nov 23 04:52:27 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005532583 (monmap changed)... Nov 23 04:52:27 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005532583 (monmap changed)... Nov 23 04:52:27 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005532583 on np0005532583.localdomain Nov 23 04:52:27 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005532583 on np0005532583.localdomain Nov 23 04:52:27 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:27 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:27 localhost ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events Nov 23 04:52:28 localhost ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id Nov 23 04:52:28 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005532584 (monmap changed)... Nov 23 04:52:28 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005532584 (monmap changed)... 
Nov 23 04:52:28 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain Nov 23 04:52:28 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain Nov 23 04:52:28 localhost ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id Nov 23 04:52:28 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:28 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 23 04:52:28 localhost ceph-mon[300199]: Reconfiguring daemon osd.4 on np0005532586.localdomain Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:28 localhost ceph-mon[300199]: 
from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0. Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:28 localhost ceph-mon[300199]: Saving service mon spec with placement label:mon Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:52:28 localhost ceph-mon[300199]: Reconfiguring mon.np0005532583 (monmap changed)... 
Nov 23 04:52:28 localhost ceph-mon[300199]: Reconfiguring daemon mon.np0005532583 on np0005532583.localdomain Nov 23 04:52:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.524024) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13 Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891548524149, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 12105, "num_deletes": 294, "total_data_size": 21098157, "memory_usage": 22079120, "flush_reason": "Manual Compaction"} Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891548594790, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 19536601, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 12110, "table_properties": {"data_size": 19467576, "index_size": 39159, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28805, "raw_key_size": 305052, "raw_average_key_size": 26, "raw_value_size": 19268000, "raw_average_value_size": 1673, "num_data_blocks": 1501, "num_entries": 11513, "num_filter_entries": 11513, "num_deletions": 268, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, 
"comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 1763891517, "file_creation_time": 1763891548, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}} Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 70907 microseconds, and 36020 cpu microseconds. Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.594880) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 19536601 bytes OK Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.594971) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.597460) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.597493) EVENT_LOG_v1 {"time_micros": 1763891548597482, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0} Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.597518) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50 Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 21013716, prev total WAL file size 
21013716, number of live WAL files 2. Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.601868) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323634' seq:72057594037927935, type:22 .. '6B760031353238' seq:0, type:0; will stop at (end) Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00 Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(18MB) 8(1887B)] Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891548602007, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 19538488, "oldest_snapshot_seqno": -1} Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 11250 keys, 19533230 bytes, temperature: kUnknown Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891548683464, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 19533230, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19464858, "index_size": 39151, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28165, "raw_key_size": 301020, "raw_average_key_size": 26, "raw_value_size": 19268651, "raw_average_value_size": 1712, 
"num_data_blocks": 1500, "num_entries": 11250, "num_filter_entries": 11250, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891548, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}} Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.683850) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 19533230 bytes Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.685711) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 239.4 rd, 239.4 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(18.6, 0.0 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 11518, records dropped: 268 output_compression: NoCompression Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.685743) EVENT_LOG_v1 {"time_micros": 1763891548685728, "job": 4, "event": "compaction_finished", "compaction_time_micros": 81608, "compaction_time_cpu_micros": 45489, "output_level": 6, "num_output_files": 1, "total_output_size": 19533230, "num_input_records": 11518, "num_output_records": 11250, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891548689107, "job": 4, "event": "table_file_deletion", "file_number": 14} Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891548689183, "job": 4, 
"event": "table_file_deletion", "file_number": 8} Nov 23 04:52:28 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:28.601731) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:52:28 localhost ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id Nov 23 04:52:28 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s Nov 23 04:52:29 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005532586 (monmap changed)... Nov 23 04:52:29 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005532586 (monmap changed)... Nov 23 04:52:29 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005532586 on np0005532586.localdomain Nov 23 04:52:29 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005532586 on np0005532586.localdomain Nov 23 04:52:29 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:29 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:29 localhost ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id Nov 23 04:52:29 localhost openstack_network_exporter[242668]: ERROR 09:52:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:52:29 localhost openstack_network_exporter[242668]: ERROR 09:52:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:52:29 localhost openstack_network_exporter[242668]: ERROR 09:52:29 appctl.go:144: Failed to get PID for ovn-northd: 
no control socket files found for ovn-northd Nov 23 04:52:29 localhost openstack_network_exporter[242668]: ERROR 09:52:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:52:29 localhost openstack_network_exporter[242668]: Nov 23 04:52:29 localhost openstack_network_exporter[242668]: ERROR 09:52:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:52:29 localhost openstack_network_exporter[242668]: Nov 23 04:52:30 localhost nova_compute[281952]: 2025-11-23 09:52:30.339 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:52:30 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:30 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:30 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s Nov 23 04:52:31 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.27030 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532585", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Nov 23 04:52:31 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:31 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:32 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:32 localhost ceph-mgr[288287]: mgr finish mon 
failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:32 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:32 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:32 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:52:32 localhost ceph-mon[300199]: Reconfiguring mon.np0005532584 (monmap changed)... Nov 23 04:52:32 localhost ceph-mon[300199]: Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain Nov 23 04:52:32 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:32 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:32 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:52:32 localhost ceph-mon[300199]: Reconfiguring mon.np0005532586 (monmap changed)... 
Nov 23 04:52:32 localhost ceph-mon[300199]: Reconfiguring daemon mon.np0005532586 on np0005532586.localdomain Nov 23 04:52:32 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:32 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:32 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:52:33 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:33 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:52:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:52:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 04:52:34 localhost podman[302209]: 2025-11-23 09:52:34.021816395 +0000 UTC m=+0.076825752 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 23 04:52:34 localhost podman[302209]: 2025-11-23 09:52:34.031267876 +0000 UTC 
m=+0.086277253 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 23 04:52:34 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:52:34 localhost podman[302208]: 2025-11-23 09:52:34.073974024 +0000 UTC m=+0.130175897 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2) Nov 23 04:52:34 localhost podman[302210]: 2025-11-23 09:52:34.137631317 +0000 UTC m=+0.187186966 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal 
rhel9, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41) Nov 23 04:52:34 localhost podman[302208]: 2025-11-23 09:52:34.150380851 +0000 UTC m=+0.206582784 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3) Nov 23 04:52:34 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:52:34 localhost podman[302210]: 2025-11-23 09:52:34.171725659 +0000 UTC m=+0.221281358 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers) Nov 23 04:52:34 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:52:34 localhost systemd[1]: session-67.scope: Deactivated successfully. Nov 23 04:52:34 localhost systemd[1]: session-67.scope: Consumed 1.543s CPU time. Nov 23 04:52:34 localhost systemd-logind[761]: Session 67 logged out. Waiting for processes to exit. Nov 23 04:52:34 localhost systemd-logind[761]: Removed session 67. 
Nov 23 04:52:34 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:34 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:34 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:52:35 localhost nova_compute[281952]: 2025-11-23 09:52:35.344 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:52:35 localhost nova_compute[281952]: 2025-11-23 09:52:35.346 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:52:35 localhost nova_compute[281952]: 2025-11-23 09:52:35.346 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:52:35 localhost nova_compute[281952]: 2025-11-23 09:52:35.346 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:52:35 localhost nova_compute[281952]: 2025-11-23 09:52:35.372 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:52:35 localhost nova_compute[281952]: 2025-11-23 09:52:35.373 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:52:35 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:35 
localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:36 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:36 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:36 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:52:37 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:37 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:38 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:38 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:38 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:52:39 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:39 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:39 localhost ceph-mon[300199]: mon.np0005532585@-1(probing) e11 handle_auth_request failed to assign global_id Nov 23 04:52:39 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.27051 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": 
"np0005532583", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Nov 23 04:52:40 localhost nova_compute[281952]: 2025-11-23 09:52:40.374 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:52:40 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:40 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0. Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.562670) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16 Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891560562739, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 391, "num_deletes": 254, "total_data_size": 231811, "memory_usage": 240056, "flush_reason": "Manual Compaction"} Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891560567549, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 230728, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12111, "largest_seqno": 12501, "table_properties": {"data_size": 228059, "index_size": 716, 
"index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 7357, "raw_average_key_size": 20, "raw_value_size": 222415, "raw_average_value_size": 624, "num_data_blocks": 28, "num_entries": 356, "num_filter_entries": 356, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891552, "oldest_key_time": 1763891552, "file_creation_time": 1763891560, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}} Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 4904 microseconds, and 1295 cpu microseconds. Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.567587) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 230728 bytes OK Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.567605) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.569290) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.569302) EVENT_LOG_v1 {"time_micros": 1763891560569299, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.569316) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 228880, prev total WAL file size 228962, number of live WAL files 2. Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.569742) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. 
'7061786F73003131323936' seq:0, type:0; will stop at (end) Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(225KB)], [15(18MB)] Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891560569767, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 19763958, "oldest_snapshot_seqno": -1} Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 11086 keys, 16905942 bytes, temperature: kUnknown Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891560635393, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 16905942, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16840908, "index_size": 36197, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27781, "raw_key_size": 298202, "raw_average_key_size": 26, "raw_value_size": 16649857, "raw_average_value_size": 1501, "num_data_blocks": 1373, "num_entries": 11086, "num_filter_entries": 11086, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891560, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}} Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.639686) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 16905942 bytes Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.641185) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 300.7 rd, 257.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 18.6 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(158.9) write-amplify(73.3) OK, records in: 11606, records dropped: 520 output_compression: NoCompression Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.641207) EVENT_LOG_v1 {"time_micros": 1763891560641197, "job": 6, "event": "compaction_finished", "compaction_time_micros": 65732, "compaction_time_cpu_micros": 24383, "output_level": 6, "num_output_files": 1, "total_output_size": 16905942, "num_input_records": 11606, "num_output_records": 11086, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005532585/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891560641328, "job": 6, "event": "table_file_deletion", "file_number": 17} Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891560643013, "job": 6, "event": "table_file_deletion", "file_number": 15} Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.569687) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.643125) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.643133) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.643136) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.643139) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:52:40 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:52:40.643142) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:52:40 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:52:40 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:52:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:52:41 localhost podman[302269]: 2025-11-23 09:52:41.042140145 +0000 UTC m=+0.096822448 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:52:41 localhost podman[302269]: 2025-11-23 09:52:41.052818505 +0000 UTC m=+0.107500778 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3) Nov 23 04:52:41 localhost systemd[1]: 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:52:41 localhost systemd[1]: tmp-crun.qxhc1i.mount: Deactivated successfully. Nov 23 04:52:41 localhost podman[302270]: 2025-11-23 09:52:41.146514055 +0000 UTC m=+0.196652798 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 04:52:41 localhost podman[302270]: 2025-11-23 09:52:41.161238339 +0000 UTC m=+0.211377082 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 04:52:41 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:52:41 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:41 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:41 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.27057 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005532583"], "force": true, "target": ["mon-mgr", ""]}]: dispatch Nov 23 04:52:41 localhost ceph-mgr[288287]: [cephadm INFO root] Remove daemons mon.np0005532583 Nov 23 04:52:41 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005532583 Nov 23 04:52:41 localhost ceph-mgr[288287]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005532583: new quorum should be ['np0005532586', 'np0005532584'] (from ['np0005532586', 'np0005532584']) Nov 23 04:52:41 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005532583: new quorum should be ['np0005532586', 'np0005532584'] (from ['np0005532586', 'np0005532584']) Nov 23 04:52:41 localhost ceph-mgr[288287]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005532583 from monmap... Nov 23 04:52:41 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing monitor np0005532583 from monmap... 
Nov 23 04:52:41 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005532583 from np0005532583.localdomain -- ports [] Nov 23 04:52:41 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005532583 from np0005532583.localdomain -- ports [] Nov 23 04:52:41 localhost ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0 Nov 23 04:52:41 localhost ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0 Nov 23 04:52:41 localhost ceph-mgr[288287]: --2- 172.18.0.107:0/1518513680 >> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] conn(0x55eed5af7c00 0x55eed596e680 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Nov 23 04:52:41 localhost ceph-osd[31905]: --2- [v2:172.18.0.107:6800/1293390152,v1:172.18.0.107:6801/1293390152] >> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] conn(0x55d7b9741000 0x55d7bb6a2100 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Nov 23 04:52:41 localhost ceph-mgr[288287]: --2- 172.18.0.107:0/1450606455 >> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] conn(0x55eed509d800 0x55eed48db700 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Nov 23 04:52:41 localhost ceph-mgr[288287]: --2- 172.18.0.107:0/487837919 >> [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] conn(0x55eed537a400 0x55eed5428b00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Nov 23 04:52:41 localhost ceph-mgr[288287]: client.27018 ms_handle_reset on v2:172.18.0.108:3300/0 Nov 23 04:52:41 localhost ceph-mgr[288287]: --2- 172.18.0.107:0/1151827140 >> [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] conn(0x55eed51f6800 0x55eed51fc580 unknown 
:-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Nov 23 04:52:41 localhost ceph-mgr[288287]: --2- 172.18.0.107:0/1334229557 >> [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] conn(0x55eed537a800 0x55eed5429b80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Nov 23 04:52:41 localhost ceph-mgr[288287]: client.27015 ms_handle_reset on v2:172.18.0.108:3300/0 Nov 23 04:52:41 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/etc/ceph/ceph.conf Nov 23 04:52:41 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/etc/ceph/ceph.conf Nov 23 04:52:41 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/etc/ceph/ceph.conf Nov 23 04:52:41 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/etc/ceph/ceph.conf Nov 23 04:52:41 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/etc/ceph/ceph.conf Nov 23 04:52:41 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/etc/ceph/ceph.conf Nov 23 04:52:41 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/etc/ceph/ceph.conf Nov 23 04:52:41 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/etc/ceph/ceph.conf Nov 23 04:52:41 localhost podman[240668]: time="2025-11-23T09:52:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:52:41 localhost podman[240668]: @ - - [23/Nov/2025:09:52:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 04:52:41 localhost podman[240668]: @ - - 
[23/Nov/2025:09:52:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18718 "" "Go-http-client/1.1" Nov 23 04:52:42 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:42 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (2) No such file or directory Nov 23 04:52:42 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:42 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:42 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:42 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:42 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:42 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:42 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument Nov 23 04:52:42 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:42 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf 
Nov 23 04:52:42 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:52:42 localhost ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections.. Nov 23 04:52:42 localhost ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: [] Nov 23 04:52:42 localhost ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections.. Nov 23 04:52:42 localhost ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: [] Nov 23 04:52:42 localhost ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections.. Nov 23 04:52:42 localhost ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: [] Nov 23 04:52:43 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:43 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument Nov 23 04:52:44 localhost systemd[1]: Stopping User Manager for UID 1003... Nov 23 04:52:44 localhost systemd[297934]: Activating special unit Exit the Session... Nov 23 04:52:44 localhost systemd[297934]: Stopped target Main User Target. Nov 23 04:52:44 localhost systemd[297934]: Stopped target Basic System. Nov 23 04:52:44 localhost systemd[297934]: Stopped target Paths. Nov 23 04:52:44 localhost systemd[297934]: Stopped target Sockets. Nov 23 04:52:44 localhost systemd[297934]: Stopped target Timers. Nov 23 04:52:44 localhost systemd[297934]: Stopped Mark boot as successful after the user session has run 2 minutes. Nov 23 04:52:44 localhost systemd[297934]: Stopped Daily Cleanup of User's Temporary Directories. Nov 23 04:52:44 localhost systemd[297934]: Closed D-Bus User Message Bus Socket. Nov 23 04:52:44 localhost systemd[297934]: Stopped Create User's Volatile Files and Directories. 
Nov 23 04:52:44 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:44 localhost systemd[297934]: Removed slice User Application Slice. Nov 23 04:52:44 localhost systemd[297934]: Reached target Shutdown. Nov 23 04:52:44 localhost systemd[297934]: Finished Exit the Session. Nov 23 04:52:44 localhost systemd[297934]: Reached target Exit the Session. Nov 23 04:52:44 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument Nov 23 04:52:44 localhost systemd[1]: user@1003.service: Deactivated successfully. Nov 23 04:52:44 localhost systemd[1]: Stopped User Manager for UID 1003. Nov 23 04:52:44 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Nov 23 04:52:44 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Nov 23 04:52:44 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Nov 23 04:52:44 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Nov 23 04:52:44 localhost systemd[1]: Removed slice User Slice of UID 1003. Nov 23 04:52:44 localhost systemd[1]: user-1003.slice: Consumed 2.080s CPU time. 
Nov 23 04:52:44 localhost ceph-mon[300199]: mon.np0005532585@-1(probing) e13 my rank is now 2 (was -1) Nov 23 04:52:44 localhost ceph-mon[300199]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election Nov 23 04:52:44 localhost ceph-mon[300199]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 Nov 23 04:52:44 localhost ceph-mon[300199]: mon.np0005532585@2(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:52:44 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:52:45 localhost nova_compute[281952]: 2025-11-23 09:52:45.375 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:52:45 localhost nova_compute[281952]: 2025-11-23 09:52:45.378 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:52:45 localhost nova_compute[281952]: 2025-11-23 09:52:45.378 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:52:45 localhost nova_compute[281952]: 2025-11-23 09:52:45.378 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:52:45 localhost nova_compute[281952]: 2025-11-23 09:52:45.379 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:52:45 localhost nova_compute[281952]: 2025-11-23 09:52:45.382 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Nov 23 04:52:45 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:45 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument Nov 23 04:52:46 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:46 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument Nov 23 04:52:46 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:52:47 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:47 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument Nov 23 04:52:47 localhost ceph-mon[300199]: mon.np0005532585@2(electing) e13 handle_auth_request failed to assign global_id Nov 23 04:52:47 localhost ceph-mgr[288287]: [progress INFO root] update: starting ev 9e93c160-06b9-44cd-a955-a2657d2a9665 (Updating mon deployment (+1 -> 4)) Nov 23 04:52:47 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005532583 on np0005532583.localdomain Nov 23 04:52:47 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005532583 on np0005532583.localdomain Nov 23 04:52:47 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.34482 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532583.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch Nov 23 04:52:47 localhost ceph-mgr[288287]: [cephadm INFO root] Removed label mon from host np0005532583.localdomain Nov 23 
04:52:47 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removed label mon from host np0005532583.localdomain Nov 23 04:52:48 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:48 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument Nov 23 04:52:48 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:52:49 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:49 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532585: (22) Invalid argument Nov 23 04:52:49 localhost ceph-mon[300199]: Remove daemons mon.np0005532583 Nov 23 04:52:49 localhost ceph-mon[300199]: Safe to remove mon.np0005532583: new quorum should be ['np0005532586', 'np0005532584'] (from ['np0005532586', 'np0005532584']) Nov 23 04:52:49 localhost ceph-mon[300199]: Removing monitor np0005532583 from monmap... 
Nov 23 04:52:49 localhost ceph-mon[300199]: Removing daemon mon.np0005532583 from np0005532583.localdomain -- ports [] Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532586 calling monitor election Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532584 calling monitor election Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532586 is new leader, mons np0005532586,np0005532584 in quorum (ranks 0,1) Nov 23 04:52:49 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:52:49 localhost ceph-mon[300199]: overall HEALTH_OK Nov 23 04:52:49 localhost ceph-mon[300199]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:49 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:49 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532586 calling monitor election Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532584 calling monitor election Nov 23 04:52:49 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532586 is new leader, mons np0005532586,np0005532584 in quorum (ranks 0,1) Nov 23 04:52:49 localhost ceph-mon[300199]: Health check failed: 1/3 mons down, quorum np0005532586,np0005532584 (MON_DOWN) Nov 23 04:52:49 localhost ceph-mon[300199]: Health detail: HEALTH_WARN 1/3 mons down, quorum np0005532586,np0005532584 Nov 23 04:52:49 localhost ceph-mon[300199]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005532586,np0005532584 Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532585 (rank 2) addr 
[v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum) Nov 23 04:52:49 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:49 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:49 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:49 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:49 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:49 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:49 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:49 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:49 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:49 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:52:49 localhost ceph-mon[300199]: Deploying daemon mon.np0005532583 on np0005532583.localdomain Nov 23 04:52:49 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:52:49 localhost ceph-mon[300199]: Removed label mon from host np0005532583.localdomain Nov 23 04:52:49 localhost ceph-mon[300199]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election Nov 23 04:52:49 localhost ceph-mon[300199]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532585@2(electing) e13 collect_metadata vda: no unique device 
id for vda: fallback method has no model nor serial Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532585@2(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532585@2(peon) e13 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code} Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532585@2(peon) e13 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout} Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532585@2(peon) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:52:49 localhost ceph-mon[300199]: mgrc update_daemon_metadata mon.np0005532585 metadata {addrs=[v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005532585.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.6 (Plow),distro_version=9.6,hostname=np0005532585.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux} Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532585 calling monitor election Nov 23 04:52:49 
localhost ceph-mon[300199]: mon.np0005532586 calling monitor election Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532584 calling monitor election Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532585 calling monitor election Nov 23 04:52:49 localhost ceph-mon[300199]: mon.np0005532586 is new leader, mons np0005532586,np0005532584,np0005532585 in quorum (ranks 0,1,2) Nov 23 04:52:49 localhost ceph-mon[300199]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005532586,np0005532584) Nov 23 04:52:49 localhost ceph-mon[300199]: Cluster is now healthy Nov 23 04:52:49 localhost ceph-mon[300199]: overall HEALTH_OK Nov 23 04:52:50 localhost ceph-mon[300199]: mon.np0005532585@2(peon) e13 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Nov 23 04:52:50 localhost ceph-mon[300199]: mon.np0005532585@2(peon) e13 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Nov 23 04:52:50 localhost ceph-mgr[288287]: [progress INFO root] complete: finished ev 9e93c160-06b9-44cd-a955-a2657d2a9665 (Updating mon deployment (+1 -> 4)) Nov 23 04:52:50 localhost ceph-mgr[288287]: [progress INFO root] Completed event 9e93c160-06b9-44cd-a955-a2657d2a9665 (Updating mon deployment (+1 -> 4)) in 3 seconds Nov 23 04:52:50 localhost ceph-mgr[288287]: [progress INFO root] update: starting ev 789950ff-e8d9-41e1-82ae-3a055add8a86 (Updating node-proxy deployment (+4 -> 4)) Nov 23 04:52:50 localhost ceph-mgr[288287]: [progress INFO root] complete: finished ev 789950ff-e8d9-41e1-82ae-3a055add8a86 (Updating node-proxy deployment (+4 -> 4)) Nov 23 04:52:50 localhost ceph-mgr[288287]: [progress INFO root] Completed event 789950ff-e8d9-41e1-82ae-3a055add8a86 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Nov 23 04:52:50 localhost nova_compute[281952]: 2025-11-23 09:52:50.381 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 
04:52:50 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532585 172.18.0.107:0/2529466979; not ready for session (expect reconnect) Nov 23 04:52:50 localhost ceph-mon[300199]: mon.np0005532585@2(peon) e13 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Nov 23 04:52:50 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532583 172.18.0.105:0/230996147; not ready for session (expect reconnect) Nov 23 04:52:50 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532583: (2) No such file or directory Nov 23 04:52:50 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532583: (22) Invalid argument Nov 23 04:52:50 localhost ceph-mon[300199]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election Nov 23 04:52:50 localhost ceph-mon[300199]: paxos.2).electionLogic(54) init, last seen epoch 54 Nov 23 04:52:50 localhost ceph-mon[300199]: mon.np0005532585@2(electing) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:52:50 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:52:51 localhost ceph-mgr[288287]: mgr.server handle_report got status from non-daemon mon.np0005532585 Nov 23 04:52:51 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:51.428+0000 7f5ec5b52640 -1 mgr.server handle_report got status from non-daemon mon.np0005532585 Nov 23 04:52:51 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532583 172.18.0.105:0/230996147; not ready for session (expect reconnect) Nov 23 04:52:51 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532583: (22) Invalid argument Nov 23 04:52:51 localhost ceph-mon[300199]: mon.np0005532585@2(electing) 
e14 handle_auth_request failed to assign global_id
Nov 23 04:52:52 localhost ceph-mon[300199]: mon.np0005532585@2(electing) e14 handle_auth_request failed to assign global_id
Nov 23 04:52:52 localhost ceph-mon[300199]: mon.np0005532585@2(electing) e14 handle_auth_request failed to assign global_id
Nov 23 04:52:52 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532583 172.18.0.105:0/230996147; not ready for session (expect reconnect)
Nov 23 04:52:52 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532583: (22) Invalid argument
Nov 23 04:52:52 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:52:52 localhost ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events
Nov 23 04:52:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 04:52:53 localhost podman[302650]: 2025-11-23 09:52:53.032188055 +0000 UTC m=+0.082304200 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 04:52:53 localhost podman[302650]: 2025-11-23 09:52:53.041204183 +0000 UTC m=+0.091320368 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 04:52:53 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 04:52:53 localhost ceph-mon[300199]: mon.np0005532585@2(electing) e14 handle_auth_request failed to assign global_id
Nov 23 04:52:53 localhost sshd[302670]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 04:52:53 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532583 172.18.0.105:0/230996147; not ready for session (expect reconnect)
Nov 23 04:52:53 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532583: (22) Invalid argument
Nov 23 04:52:54 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532583 172.18.0.105:0/230996147; not ready for session (expect reconnect)
Nov 23 04:52:54 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532583: (22) Invalid argument
Nov 23 04:52:54 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:52:55 localhost nova_compute[281952]: 2025-11-23 09:52:55.385 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:52:55 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532583 172.18.0.105:0/230996147; not ready for session (expect reconnect)
Nov 23 04:52:55 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532583: (22) Invalid argument
Nov 23 04:52:55 localhost ceph-mon[300199]: mon.np0005532585@2(electing) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:52:55 localhost ceph-mon[300199]: mon.np0005532585@2(peon) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:52:55 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:55 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:55 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:55 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:55 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:55 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:55 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:55 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:55 localhost ceph-mon[300199]: mon.np0005532586 calling monitor election
Nov 23 04:52:55 localhost ceph-mon[300199]: mon.np0005532584 calling monitor election
Nov 23 04:52:55 localhost ceph-mon[300199]: mon.np0005532585 calling monitor election
Nov 23 04:52:55 localhost ceph-mon[300199]: mon.np0005532583 calling monitor election
Nov 23 04:52:55 localhost ceph-mon[300199]: mon.np0005532586 is new leader, mons np0005532586,np0005532584,np0005532585,np0005532583 in quorum (ranks 0,1,2,3)
Nov 23 04:52:55 localhost ceph-mon[300199]: overall HEALTH_OK
Nov 23 04:52:55 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:55 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:55 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 04:52:55 localhost podman[302690]: 2025-11-23 09:52:55.907456407 +0000 UTC m=+0.087692606 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 04:52:55 localhost podman[302690]: 2025-11-23 09:52:55.923950116 +0000 UTC m=+0.104186305 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 04:52:55 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 04:52:56 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:56 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:56 localhost ceph-mon[300199]: mon.np0005532585@2(peon) e14 handle_auth_request failed to assign global_id
Nov 23 04:52:56 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:56 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:56 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:56 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:56 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:56 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:56 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.44437 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532583.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 04:52:56 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532583 172.18.0.105:0/230996147; not ready for session (expect reconnect)
Nov 23 04:52:56 localhost ceph-mgr[288287]: [cephadm INFO root] Removed label mgr from host np0005532583.localdomain
Nov 23 04:52:56 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removed label mgr from host np0005532583.localdomain
Nov 23 04:52:56 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 04:52:56 localhost ceph-mon[300199]: Updating np0005532583.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:56 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:56 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:56 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:52:56 localhost ceph-mon[300199]: Updating np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:56 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:56 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:56 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:52:57 localhost ceph-mgr[288287]: [progress INFO root] update: starting ev b9da0499-3f45-4785-8af4-dab2f930c5d1 (Updating mgr deployment (-1 -> 3))
Nov 23 04:52:57 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Removing daemon mgr.np0005532583.orhywt from np0005532583.localdomain -- ports [8765]
Nov 23 04:52:57 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing daemon mgr.np0005532583.orhywt from np0005532583.localdomain -- ports [8765]
Nov 23 04:52:57 localhost ceph-mgr[288287]: mgr.server handle_report got status from non-daemon mon.np0005532583
Nov 23 04:52:57 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:52:57.603+0000 7f5ec5b52640 -1 mgr.server handle_report got status from non-daemon mon.np0005532583
Nov 23 04:52:57 localhost ceph-mon[300199]: mon.np0005532585@2(peon).osd e85 _set_new_cache_sizes cache_size:1019640621 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 04:52:57 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54102 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005532583.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 04:52:57 localhost ceph-mgr[288287]: [cephadm INFO root] Removed label _admin from host np0005532583.localdomain
Nov 23 04:52:57 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removed label _admin from host np0005532583.localdomain
Nov 23 04:52:57 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:57 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:52:57 localhost ceph-mon[300199]: Removed label mgr from host np0005532583.localdomain
Nov 23 04:52:57 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:57 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:57 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:57 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:57 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:57 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:57 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:57 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:57 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:57 localhost ceph-mon[300199]: Removing daemon mgr.np0005532583.orhywt from np0005532583.localdomain -- ports [8765]
Nov 23 04:52:58 localhost sshd[303015]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 04:52:58 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:52:58 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:52:58 localhost ceph-mon[300199]: Removed label _admin from host np0005532583.localdomain
Nov 23 04:52:59 localhost ceph-mgr[288287]: [cephadm INFO cephadm.services.cephadmservice] Removing key for mgr.np0005532583.orhywt
Nov 23 04:52:59 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing key for mgr.np0005532583.orhywt
Nov 23 04:52:59 localhost ceph-mgr[288287]: [progress INFO root] complete: finished ev b9da0499-3f45-4785-8af4-dab2f930c5d1 (Updating mgr deployment (-1 -> 3))
Nov 23 04:52:59 localhost ceph-mgr[288287]: [progress INFO root] Completed event b9da0499-3f45-4785-8af4-dab2f930c5d1 (Updating mgr deployment (-1 -> 3)) in 2 seconds
Nov 23 04:52:59 localhost ceph-mgr[288287]: [progress INFO root] update: starting ev 577fc77b-eab4-40d2-9e0b-126d92a62b09 (Updating mon deployment (-1 -> 3))
Nov 23 04:52:59 localhost ceph-mgr[288287]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005532583: new quorum should be ['np0005532586', 'np0005532584', 'np0005532585'] (from ['np0005532586', 'np0005532584', 'np0005532585'])
Nov 23 04:52:59 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005532583: new quorum should be ['np0005532586', 'np0005532584', 'np0005532585'] (from ['np0005532586', 'np0005532584', 'np0005532585'])
Nov 23 04:52:59 localhost ceph-mgr[288287]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005532583 from monmap...
Nov 23 04:52:59 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing monitor np0005532583 from monmap...
Nov 23 04:52:59 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005532583 from np0005532583.localdomain -- ports []
Nov 23 04:52:59 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005532583 from np0005532583.localdomain -- ports []
Nov 23 04:52:59 localhost ceph-mon[300199]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 04:52:59 localhost ceph-mon[300199]: paxos.2).electionLogic(58) init, last seen epoch 58
Nov 23 04:52:59 localhost ceph-mon[300199]: mon.np0005532585@2(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:52:59 localhost ceph-mon[300199]: mon.np0005532585@2(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:52:59 localhost ceph-mon[300199]: mon.np0005532585@2(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:52:59 localhost ceph-mon[300199]: mon.np0005532585@2(peon) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:52:59 localhost openstack_network_exporter[242668]: ERROR 09:52:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:52:59 localhost openstack_network_exporter[242668]: ERROR 09:52:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 04:52:59 localhost openstack_network_exporter[242668]: ERROR 09:52:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:52:59 localhost openstack_network_exporter[242668]: ERROR 09:52:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 04:52:59 localhost openstack_network_exporter[242668]:
Nov 23 04:52:59 localhost openstack_network_exporter[242668]: ERROR 09:52:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 04:52:59 localhost openstack_network_exporter[242668]:
Nov 23 04:53:00 localhost ceph-mon[300199]: mon.np0005532585@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 04:53:00 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/636913553' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 04:53:00 localhost ceph-mon[300199]: mon.np0005532585@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 04:53:00 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/636913553' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 04:53:00 localhost nova_compute[281952]: 2025-11-23 09:53:00.387 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:53:00 localhost ceph-mon[300199]: Removing key for mgr.np0005532583.orhywt
Nov 23 04:53:00 localhost ceph-mon[300199]: Safe to remove mon.np0005532583: new quorum should be ['np0005532586', 'np0005532584', 'np0005532585'] (from ['np0005532586', 'np0005532584', 'np0005532585'])
Nov 23 04:53:00 localhost ceph-mon[300199]: Removing monitor np0005532583 from monmap...
Nov 23 04:53:00 localhost ceph-mon[300199]: Removing daemon mon.np0005532583 from np0005532583.localdomain -- ports []
Nov 23 04:53:00 localhost ceph-mon[300199]: mon.np0005532586 calling monitor election
Nov 23 04:53:00 localhost ceph-mon[300199]: mon.np0005532584 calling monitor election
Nov 23 04:53:00 localhost ceph-mon[300199]: mon.np0005532586 is new leader, mons np0005532586,np0005532584,np0005532585 in quorum (ranks 0,1,2)
Nov 23 04:53:00 localhost ceph-mon[300199]: mon.np0005532585 calling monitor election
Nov 23 04:53:00 localhost ceph-mon[300199]: overall HEALTH_OK
Nov 23 04:53:00 localhost ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events
Nov 23 04:53:00 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:53:01 localhost ceph-mgr[288287]: [progress INFO root] complete: finished ev 577fc77b-eab4-40d2-9e0b-126d92a62b09 (Updating mon deployment (-1 -> 3))
Nov 23 04:53:01 localhost ceph-mgr[288287]: [progress INFO root] Completed event 577fc77b-eab4-40d2-9e0b-126d92a62b09 (Updating mon deployment (-1 -> 3)) in 2 seconds
Nov 23 04:53:01 localhost ceph-mgr[288287]: [progress INFO root] update: starting ev 29a52c30-22cc-4956-a3b3-845bf23eb769 (Updating node-proxy deployment (+4 -> 4))
Nov 23 04:53:01 localhost ceph-mgr[288287]: [progress INFO root] complete: finished ev 29a52c30-22cc-4956-a3b3-845bf23eb769 (Updating node-proxy deployment (+4 -> 4))
Nov 23 04:53:01 localhost ceph-mgr[288287]: [progress INFO root] Completed event 29a52c30-22cc-4956-a3b3-845bf23eb769 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 23 04:53:01 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:01 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:01 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:02 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:53:02 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Removing np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:02 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:02 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:02 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:02 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:02 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:02 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:02 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:02 localhost ceph-mon[300199]: mon.np0005532585@2(peon).osd e85 _set_new_cache_sizes cache_size:1020047544 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 04:53:02 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Removing np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 04:53:02 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 04:53:02 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Removing np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 04:53:02 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 04:53:03 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:03 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:03 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:03 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:03 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:03 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:03 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:03 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:03 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 04:53:03 localhost ceph-mon[300199]: Removing np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:03 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:03 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:03 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:03 localhost ceph-mon[300199]: Removing np0005532583.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 04:53:03 localhost ceph-mon[300199]: Removing np0005532583.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 04:53:03 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:03 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 04:53:04 localhost podman[303355]: 2025-11-23 09:53:04.19034269 +0000 UTC m=+0.090574585 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 04:53:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 04:53:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 04:53:04 localhost nova_compute[281952]: 2025-11-23 09:53:04.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:53:04 localhost podman[303355]: 2025-11-23 09:53:04.226227078 +0000 UTC m=+0.126459033 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 04:53:04 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 04:53:04 localhost ceph-mgr[288287]: [progress INFO root] update: starting ev 24c6c7df-b5c3-4b98-be65-f132a49e64bd (Updating node-proxy deployment (+4 -> 4))
Nov 23 04:53:04 localhost ceph-mgr[288287]: [progress INFO root] complete: finished ev 24c6c7df-b5c3-4b98-be65-f132a49e64bd (Updating node-proxy deployment (+4 -> 4))
Nov 23 04:53:04 localhost ceph-mgr[288287]: [progress INFO root] Completed event 24c6c7df-b5c3-4b98-be65-f132a49e64bd (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Nov 23 04:53:04 localhost podman[303372]: 2025-11-23 09:53:04.309327081 +0000 UTC m=+0.101598045 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 23 04:53:04 localhost systemd[1]: tmp-crun.2XLYxt.mount: Deactivated successfully.
Nov 23 04:53:04 localhost podman[303373]: 2025-11-23 09:53:04.394513299 +0000 UTC m=+0.183919225 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products.
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 04:53:04 localhost podman[303372]: 2025-11-23 09:53:04.400861935 +0000 UTC m=+0.193132859 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:53:04 localhost podman[303373]: 2025-11-23 09:53:04.411324668 +0000 UTC m=+0.200730564 container exec_died 
ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter) Nov 23 04:53:04 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:53:04 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:53:04 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532583 (monmap changed)... Nov 23 04:53:04 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532583 (monmap changed)... 
Nov 23 04:53:04 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain Nov 23 04:53:04 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain Nov 23 04:53:04 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:05 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:53:05 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:53:05 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:53:05 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:05 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:05 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:05 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:05 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:05 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:05 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:05 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532583.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : 
dispatch Nov 23 04:53:05 localhost nova_compute[281952]: 2025-11-23 09:53:05.230 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:53:05 localhost nova_compute[281952]: 2025-11-23 09:53:05.389 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:53:05 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532584 (monmap changed)... Nov 23 04:53:05 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532584 (monmap changed)... Nov 23 04:53:05 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain Nov 23 04:53:05 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain Nov 23 04:53:05 localhost ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events Nov 23 04:53:06 localhost nova_compute[281952]: 2025-11-23 09:53:06.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:53:06 localhost nova_compute[281952]: 2025-11-23 09:53:06.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:53:06 localhost nova_compute[281952]: 2025-11-23 09:53:06.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of 
instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:53:06 localhost ceph-mon[300199]: Reconfiguring crash.np0005532583 (monmap changed)... Nov 23 04:53:06 localhost ceph-mon[300199]: Reconfiguring daemon crash.np0005532583 on np0005532583.localdomain Nov 23 04:53:06 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:06 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:06 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:53:06 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:06 localhost nova_compute[281952]: 2025-11-23 09:53:06.500 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:53:06 localhost nova_compute[281952]: 2025-11-23 09:53:06.501 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:53:06 localhost nova_compute[281952]: 2025-11-23 09:53:06.501 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:53:06 localhost 
nova_compute[281952]: 2025-11-23 09:53:06.502 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:53:06 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)... Nov 23 04:53:06 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)... Nov 23 04:53:06 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005532584.localdomain Nov 23 04:53:06 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005532584.localdomain Nov 23 04:53:06 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:06 localhost nova_compute[281952]: 2025-11-23 09:53:06.986 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:53:07 localhost nova_compute[281952]: 2025-11-23 09:53:07.001 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:53:07 localhost nova_compute[281952]: 2025-11-23 09:53:07.002 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:53:07 localhost nova_compute[281952]: 2025-11-23 09:53:07.003 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:53:07 localhost nova_compute[281952]: 2025-11-23 09:53:07.003 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:53:07 localhost nova_compute[281952]: 2025-11-23 09:53:07.003 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:53:07 localhost nova_compute[281952]: 2025-11-23 09:53:07.004 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 23 04:53:07 localhost nova_compute[281952]: 2025-11-23 09:53:07.020 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 23 04:53:07 localhost ceph-mon[300199]: Reconfiguring crash.np0005532584 (monmap changed)... Nov 23 04:53:07 localhost ceph-mon[300199]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain Nov 23 04:53:07 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:07 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:07 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 23 04:53:07 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)... Nov 23 04:53:07 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)... 
Nov 23 04:53:07 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005532584.localdomain Nov 23 04:53:07 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005532584.localdomain Nov 23 04:53:07 localhost ceph-mon[300199]: mon.np0005532585@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054607 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:53:08 localhost nova_compute[281952]: 2025-11-23 09:53:08.016 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:53:08 localhost nova_compute[281952]: 2025-11-23 09:53:08.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:53:08 localhost ceph-mon[300199]: Reconfiguring osd.2 (monmap changed)... Nov 23 04:53:08 localhost ceph-mon[300199]: Reconfiguring daemon osd.2 on np0005532584.localdomain Nov 23 04:53:08 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:08 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:08 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 23 04:53:08 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)... Nov 23 04:53:08 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)... 
Nov 23 04:53:08 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain Nov 23 04:53:08 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain Nov 23 04:53:08 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:09 localhost nova_compute[281952]: 2025-11-23 09:53:09.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:53:09 localhost nova_compute[281952]: 2025-11-23 09:53:09.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:53:09 localhost nova_compute[281952]: 2025-11-23 09:53:09.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:53:09 localhost nova_compute[281952]: 2025-11-23 09:53:09.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:53:09 localhost nova_compute[281952]: 2025-11-23 09:53:09.240 281956 DEBUG oslo_concurrency.lockutils [None 
req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:53:09 localhost nova_compute[281952]: 2025-11-23 09:53:09.241 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:53:09 localhost nova_compute[281952]: 2025-11-23 09:53:09.241 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:53:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:53:09.291 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:53:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:53:09.291 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:53:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:53:09.292 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m 
Nov 23 04:53:09 localhost ceph-mon[300199]: Reconfiguring osd.5 (monmap changed)... Nov 23 04:53:09 localhost ceph-mon[300199]: Reconfiguring daemon osd.5 on np0005532584.localdomain Nov 23 04:53:09 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:09 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:09 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:53:09 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.34497 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005532583.localdomain", "target": ["mon-mgr", ""]}]: dispatch Nov 23 04:53:09 localhost ceph-mgr[288287]: [cephadm INFO root] Added label _no_schedule to host np0005532583.localdomain Nov 23 04:53:09 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Added label _no_schedule to host np0005532583.localdomain Nov 23 04:53:09 localhost ceph-mgr[288287]: [cephadm INFO root] Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005532583.localdomain Nov 23 04:53:09 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005532583.localdomain Nov 23 04:53:09 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532584.naxwxy (monmap changed)... Nov 23 04:53:09 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532584.naxwxy (monmap changed)... 
Nov 23 04:53:09 localhost ceph-mon[300199]: mon.np0005532585@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:53:09 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1028998822' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:53:09 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain Nov 23 04:53:09 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain Nov 23 04:53:09 localhost nova_compute[281952]: 2025-11-23 09:53:09.706 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:53:09 localhost nova_compute[281952]: 2025-11-23 09:53:09.777 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:53:09 localhost nova_compute[281952]: 2025-11-23 09:53:09.778 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:53:09 localhost nova_compute[281952]: 2025-11-23 09:53:09.992 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:53:09 localhost nova_compute[281952]: 2025-11-23 09:53:09.993 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11665MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:53:09 localhost nova_compute[281952]: 2025-11-23 09:53:09.994 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:53:09 localhost nova_compute[281952]: 2025-11-23 09:53:09.994 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:53:10 localhost nova_compute[281952]: 2025-11-23 09:53:10.166 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:53:10 localhost nova_compute[281952]: 2025-11-23 09:53:10.167 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:53:10 localhost nova_compute[281952]: 2025-11-23 09:53:10.167 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:53:10 localhost nova_compute[281952]: 2025-11-23 09:53:10.376 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:53:10 localhost nova_compute[281952]: 2025-11-23 09:53:10.392 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:53:10 localhost ceph-mon[300199]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)... 
Nov 23 04:53:10 localhost ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain Nov 23 04:53:10 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:10 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:10 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:10 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:10 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:53:10 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.807 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:53:10.808 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.811 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04f4ed9c-0995-4b03-b3ea-dc7fd153d98f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.808442', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': 
None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '386e83a0-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': '30bf39fa25edec39fdabe77d5901037814038b1c81939aabcdd801b063df3823'}]}, 'timestamp': '2025-11-23 09:53:10.812847', '_unique_id': '4622551b5cc94ac1945f250f70eb4fea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging 
self._connection = self._establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.814 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.815 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 04:53:10 localhost ceph-mon[300199]: mon.np0005532585@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:53:10 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3035427830' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:53:10 localhost nova_compute[281952]: 2025-11-23 09:53:10.834 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:53:10 localhost nova_compute[281952]: 2025-11-23 09:53:10.841 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.844 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.845 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e78e6cc1-314c-4b5e-8f81-cb983af9867c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.815757', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '38736eba-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': 
'a324e1610bb2f64bc18f4bcafa9abad6db1d2276d7dc92e70f0aef1220c7aa9b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.815757', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '38738422-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': '0e5ffe4a9e30b10c9ca688bddfad946d57b0798b7639e0a48d41e89fc64e0b72'}]}, 'timestamp': '2025-11-23 09:53:10.845563', '_unique_id': '1e34a06bae194c149e7968b3f478603d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.846 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify 
/usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.848 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.848 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.849 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd873577e-aa58-4889-8c1b-31101ab82b73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.848970', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '38741be4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': '2884c44b5650ed78043411fceab1611b1206c7b85e63b6f18e6877475a2aee39'}]}, 'timestamp': '2025-11-23 09:53:10.849486', '_unique_id': 'ca0943d2368d4d06ac8c8d7f222f7757'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:53:10
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.850 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.851 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.851 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.852 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9f8b86b-1f7d-4570-9ae2-3f03e988ac02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.851729', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '38748796-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': '2cdb915e3f9ee9ab02c446481959ea4a964226ba54fdbab5f7701111b2ba4f4f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.851729', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '38749812-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': 'd5c74cb75117085aad28392c3dbd3744881bb85b7d681605a4024dbb854fa40f'}]}, 'timestamp': '2025-11-23 09:53:10.852613', '_unique_id': '19cdce8e606340f8a9357607810e681f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:53:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]:
2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.853 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.854 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 04:53:10 localhost nova_compute[281952]: 2025-11-23 09:53:10.862 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.864 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.865 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:53:10 localhost nova_compute[281952]: 2025-11-23 09:53:10.865 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 04:53:10 localhost
nova_compute[281952]: 2025-11-23 09:53:10.865 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5188e5d0-3be0-4d87-94c5-6bf5b9188149', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.854837', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '38767ed4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.032509299, 'message_signature': 
'dd7e5a02baa997ce9ad71a9f9145f435ce7ae3fa488637c20d9414771bc269cc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.854837', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '38769216-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.032509299, 'message_signature': 'bd91e5596d4b776b4cdd28eb20f795fedfb3ce90bce9d957fac116697c9165d6'}]}, 'timestamp': '2025-11-23 09:53:10.865583', '_unique_id': '11b98da210a84f1d877100b003b0f3ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23
09:53:10.866 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:53:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.866 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.868 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.868 
12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db6ad8e4-f74e-446d-9194-d5a369c92b8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.868302', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '38770f7a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': 
'b1094e5c30b8174733a73694c88f85b793d02ccfbc0b32c07ce66f422695e675'}]}, 'timestamp': '2025-11-23 09:53:10.868807', '_unique_id': '32c575f10aed43c9969909ef95707d5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.869 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.870 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.871 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'da3891a4-b3cf-4897-850b-949f77faf6ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.871119', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '38777cda-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': 'a03d46fd8c0e10440c0a0cf46f9152f46174a0c52479932fa297e7df9a08ed06'}]}, 'timestamp': '2025-11-23 09:53:10.871602', '_unique_id': '8b45316ecba442c7b38b7d1aebb973cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:53:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.872 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.873 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.874 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a78165c-df71-4b49-84d5-9be1eaf1c52c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.873808', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3877e698-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.032509299, 'message_signature': 'a77557c3bf8eaefadcfdbb32cc33d93e30b04f9d8a09a044b4efe07dac290722'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.873808', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3877f8ea-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.032509299, 'message_signature': 'd2c613e824e6353dcb84c9d6961b1cb7d247d986366cd050eb1a8a3cad79dc62'}]}, 'timestamp': '2025-11-23 09:53:10.874760', '_unique_id': '25c0933b1bb1420583145896ae86dd2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.875 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '89af889e-66bc-4b5d-8550-3b0325883607', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.876959', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '387860d2-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': '3dc4dbd56859afcb267a38dae79a6971b9bca4e1920d7b85161b64a3231aa587'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.876959', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '387870fe-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': '030d0b40687c5fd38eab705a02f520a24124be9c57d0b051ccf45adf30aa04ca'}]}, 'timestamp': '2025-11-23 09:53:10.877852', '_unique_id': '7fdd844d51cb4ec48e2c0548b4374674'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.879 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b8975f1a-a5c4-4da6-8569-45066d91c3ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.880104', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '3878dbd4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': 'c9d65427ceb7ead57f433076989073ad31ac31a153aa987c2b2f2568ade5c4e4'}]}, 'timestamp': '2025-11-23 09:53:10.880588', '_unique_id': '39fe9c702c394858adc95a281a3f088c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.881 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4208e150-fa33-4668-a835-e915dcd52b8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.882973', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '38794d1c-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': '9022ab766ab6839248f677060feabcbbbab11396a02d347de317f6d3fe416951'}]}, 'timestamp': '2025-11-23 09:53:10.883515', '_unique_id': 'e7c79dba9b1e47aca0121700f05ee50b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.884 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.885 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52176427-62b2-4abf-ae33-875053584f88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.885675', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '3879b57c-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': 'a86b9f69a46739d523a37178a5dc9a25abd2205b3bbea2e71115b7fb317183fc'}]}, 'timestamp': '2025-11-23 09:53:10.886158', '_unique_id': '460154ce56a2465092033192f157c684'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.887 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.888 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.888 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c3931f7-5f0a-41cc-9034-b6d4e061a65f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.888296', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '387a1bfc-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': '6f63ee8b083afc942a2a141a33d0135b6b2263b787d7920f9cf2da7571f0a654'}]}, 'timestamp': '2025-11-23 09:53:10.888781', '_unique_id': '28312eb738f943b5909f82ca902fbecd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 
04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.889 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.890 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.890 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3b226306-9d03-4eef-a61b-a6941107c0b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.890936', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '387a8236-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': '84309b21b360080aab6adffa05ceaf7b2049dc0ba533f46cf18248a73c4bca76'}]}, 'timestamp': '2025-11-23 09:53:10.891395', '_unique_id': '4c9dee7978744d19b70e585550edece1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:53:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.892 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.893 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ead07db-8d23-4ead-845c-db5e118f49fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:53:10.893475', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '387d4cb4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.086825964, 
'message_signature': '941cc42abe6d75bcc2fd9c13dbf8277a71f1892fd51a4ad4d864338b8b7d33de'}]}, 'timestamp': '2025-11-23 09:53:10.909599', '_unique_id': 'dfbbeb39d83d49bb91d5b1a0eeb8447f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a280f66-1321-4f4a-9775-f9c4517b61f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:53:10.911099', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '387d9232-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.986071776, 'message_signature': 'cd0b542f3df869fee767a005836d67dbaa038ee43b60dad3990a8c8a7fd3cec4'}]}, 'timestamp': '2025-11-23 09:53:10.911381', '_unique_id': '3463bded98834fa5a56cb87ba34b369b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.911 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec3ecdbe-dac5-4537-a45d-519cc4c7fc3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.912673', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '387dd008-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': 'b950bcd03f8ce95fabe1ef444242d8892ab03af0582cd435622e0f58ec350cf5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.912673', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '387ddc2e-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': 'c5d9dabd5b36c933bea6c6d9bf58a174c6eb267df893c326cf1bda1fd423b1b4'}]}, 'timestamp': '2025-11-23 09:53:10.913259', '_unique_id': 'e716de5561de4b77b974b48ef82ee461'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.913 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.914 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.914 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97ed12af-b875-4388-9ec5-878488df66a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.914600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '387e1afe-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': 'badd910b44b6e7349a0d24c6b1d6dc925906407f1cb068a38d93337b1c3a68a0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.914600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '387e259e-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': '1f88be1e0a0461a2edfc240a645870a05d3b106d73ba20a8011d327568c12dc4'}]}, 'timestamp': '2025-11-23 09:53:10.915138', '_unique_id': '93de40822e1541ebb93c787f88bde087'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.915 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.916 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.916 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 14100000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 
ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dad0ca47-560b-4bba-a964-c792ea69964a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14100000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:53:10.916714', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '387e6d7e-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.086825964, 'message_signature': '0d810f7dddd362c9fd89e83c1483d41f816c08a04513bb1096f6eb4e8dce92d7'}]}, 'timestamp': '2025-11-23 09:53:10.917010', '_unique_id': 'f076422e836e4d76b2a5eb45f40ab351'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:53:10.917 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.917 12 ERROR oslo_messaging.notify.messaging Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.918 12 INFO 
ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6df4f655-e0b2-4903-ae58-e8c45cd810ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.918308', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '387eabd6-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': '4331a7b69115f97aab11b3c2e8068def4441360e356eeccc24d622512e3dda3f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.918308', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '387eb6da-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11533.993387112, 'message_signature': '29e3ace2baaaa34888bfc9f4637128749c1288f69becd8d51cf91fb671525248'}]}, 'timestamp': '2025-11-23 09:53:10.918857', '_unique_id': '094f4724c7974efb94217bbccda9909a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most 
recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.919 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.920 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.920 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51444e12-bf01-4639-8a34-2051cb59f3c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:53:10.920316', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '387efa3c-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.032509299, 'message_signature': '699380721f6945cdcd48803a1929dde43f49bd3eddf1b184ea6bc739d7e68d4a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:53:10.920316', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '387f0428-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11534.032509299, 'message_signature': 'bd67b959a5ef882798780e3de686fedd929301a03cfd66dabadb4559761a0def'}]}, 'timestamp': '2025-11-23 09:53:10.920833', '_unique_id': 'd7e208350c1243c285a48714ace55e1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:53:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:53:10.921 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:53:11 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 04:53:11 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 04:53:11 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 04:53:11 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 04:53:11 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54137 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005532583.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 04:53:11 localhost ceph-mon[300199]: Added label _no_schedule to host np0005532583.localdomain
Nov 23 04:53:11 localhost ceph-mon[300199]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005532583.localdomain
Nov 23 04:53:11 localhost ceph-mon[300199]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 04:53:11 localhost ceph-mon[300199]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 04:53:11 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:11 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:11 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 04:53:11 localhost nova_compute[281952]: 2025-11-23 09:53:11.866 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:53:11 localhost nova_compute[281952]: 2025-11-23 09:53:11.867 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:53:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 04:53:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 04:53:11 localhost podman[240668]: time="2025-11-23T09:53:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 04:53:11 localhost podman[240668]: @ - - [23/Nov/2025:09:53:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 04:53:11 localhost podman[240668]: @ - - [23/Nov/2025:09:53:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18721 "" "Go-http-client/1.1"
Nov 23 04:53:11 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 04:53:11 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 04:53:11 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 04:53:11 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 04:53:12 localhost podman[303481]: 2025-11-23 09:53:12.016294217 +0000 UTC m=+0.124502113 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 23 04:53:12 localhost podman[303481]: 2025-11-23 09:53:12.027387729 +0000 UTC m=+0.135595665 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Nov 23 04:53:12 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 04:53:12 localhost podman[303482]: 2025-11-23 09:53:11.9813994 +0000 UTC m=+0.089595525 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 04:53:12 localhost podman[303482]: 2025-11-23 09:53:12.111194654 +0000 UTC m=+0.219390789 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 04:53:12 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 04:53:12 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54143 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005532583.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 04:53:12 localhost ceph-mgr[288287]: [cephadm INFO root] Removed host np0005532583.localdomain
Nov 23 04:53:12 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removed host np0005532583.localdomain
Nov 23 04:53:12 localhost ceph-mon[300199]: Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 04:53:12 localhost ceph-mon[300199]: Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 04:53:12 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:12 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:12 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 04:53:12 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:12 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain"} : dispatch
Nov 23 04:53:12 localhost podman[303573]:
Nov 23 04:53:12 localhost podman[303573]: 2025-11-23 09:53:12.556300276 +0000 UTC m=+0.070103014 container create c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_fermi, version=7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , distribution-scope=public, release=553, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main)
Nov 23 04:53:12 localhost systemd[1]: Started libpod-conmon-c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925.scope.
Nov 23 04:53:12 localhost systemd[1]: Started libcrun container.
Nov 23 04:53:12 localhost podman[303573]: 2025-11-23 09:53:12.53049302 +0000 UTC m=+0.044295788 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:53:12 localhost podman[303573]: 2025-11-23 09:53:12.633394234 +0000 UTC m=+0.147196982 container init c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_fermi, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.buildah.version=1.33.12) Nov 23 04:53:12 localhost podman[303573]: 2025-11-23 09:53:12.644509837 +0000 UTC m=+0.158312575 container start c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_fermi, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, version=7, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
com.redhat.component=rhceph-container, ceph=True, GIT_BRANCH=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , release=553, RELEASE=main, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, architecture=x86_64) Nov 23 04:53:12 localhost podman[303573]: 2025-11-23 09:53:12.64492295 +0000 UTC m=+0.158725718 container attach c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_fermi, name=rhceph, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, release=553, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_BRANCH=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux , RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, version=7) Nov 23 
04:53:12 localhost systemd[1]: libpod-c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925.scope: Deactivated successfully. Nov 23 04:53:12 localhost agitated_fermi[303588]: 167 167 Nov 23 04:53:12 localhost podman[303573]: 2025-11-23 09:53:12.650504092 +0000 UTC m=+0.164306860 container died c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_fermi, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, architecture=x86_64, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., ceph=True, RELEASE=main, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , release=553, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, description=Red Hat Ceph Storage 7) Nov 23 04:53:12 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:12 localhost ceph-mgr[288287]: [balancer INFO root] Optimize plan auto_2025-11-23_09:53:12 Nov 23 04:53:12 localhost ceph-mgr[288287]: [balancer INFO root] Mode upmap, max misplaced 0.050000 Nov 23 04:53:12 localhost ceph-mgr[288287]: [balancer INFO root] do_upmap Nov 23 04:53:12 localhost ceph-mgr[288287]: [balancer INFO root] pools ['backups', 'vms', 
'volumes', 'images', 'manila_metadata', '.mgr', 'manila_data'] Nov 23 04:53:12 localhost ceph-mgr[288287]: [balancer INFO root] prepared 0/10 changes Nov 23 04:53:12 localhost podman[303593]: 2025-11-23 09:53:12.754935384 +0000 UTC m=+0.091559185 container remove c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_fermi, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , vcs-type=git, ceph=True, io.openshift.tags=rhceph ceph, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 23 04:53:12 localhost systemd[1]: libpod-conmon-c6720eba0b80bc2cb48622b1c55b4aea18b7a44a74fdfa7182e4388efa140925.scope: Deactivated successfully. Nov 23 04:53:12 localhost ceph-mon[300199]: mon.np0005532585@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054729 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:53:12 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)... Nov 23 04:53:12 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)... 
Nov 23 04:53:12 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:53:12 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] _maybe_adjust Nov 23 04:53:12 localhost ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections.. Nov 23 04:53:12 localhost ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: [] Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1) Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0033244564838079286 of space, bias 1.0, pg target 0.6648912967615858 quantized to 32 (current 32) Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32) Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32) Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, 
bias 1.0, pg target 0.0 quantized to 32 (current 32) Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32) Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 23 04:53:12 localhost ceph-mgr[288287]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019596681323283084 quantized to 16 (current 16) Nov 23 04:53:12 localhost ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections.. Nov 23 04:53:12 localhost ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: [] Nov 23 04:53:12 localhost ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections.. Nov 23 04:53:12 localhost ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: [] Nov 23 04:53:12 localhost ceph-mgr[288287]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules Nov 23 04:53:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: vms, start_after= Nov 23 04:53:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: volumes, start_after= Nov 23 04:53:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: images, start_after= Nov 23 04:53:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: backups, start_after= Nov 23 04:53:12 localhost ceph-mgr[288287]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules Nov 23 04:53:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: vms, start_after= Nov 23 04:53:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: volumes, start_after= Nov 23 04:53:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: 
images, start_after= Nov 23 04:53:12 localhost ceph-mgr[288287]: [rbd_support INFO root] load_schedules: backups, start_after= Nov 23 04:53:12 localhost systemd[1]: var-lib-containers-storage-overlay-b937137decf7b11ca88ddfa440c5cceb6d2d6afe5a265f025f5d493d2b31e3e4-merged.mount: Deactivated successfully. Nov 23 04:53:13 localhost nova_compute[281952]: 2025-11-23 09:53:13.210 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:53:13 localhost nova_compute[281952]: 2025-11-23 09:53:13.231 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:53:13 localhost nova_compute[281952]: 2025-11-23 09:53:13.231 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 23 04:53:13 localhost podman[303664]: Nov 23 04:53:13 localhost ceph-mon[300199]: Reconfiguring crash.np0005532585 (monmap changed)... 
Nov 23 04:53:13 localhost ceph-mon[300199]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain Nov 23 04:53:13 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain"}]': finished Nov 23 04:53:13 localhost ceph-mon[300199]: Removed host np0005532583.localdomain Nov 23 04:53:13 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:13 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:13 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 23 04:53:13 localhost podman[303664]: 2025-11-23 09:53:13.462164933 +0000 UTC m=+0.080995040 container create 89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_napier, ceph=True, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux , GIT_BRANCH=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, release=553, vendor=Red Hat, Inc., description=Red Hat Ceph 
Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 04:53:13 localhost systemd[1]: Started libpod-conmon-89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05.scope. Nov 23 04:53:13 localhost systemd[1]: Started libcrun container. Nov 23 04:53:13 localhost podman[303664]: 2025-11-23 09:53:13.429370201 +0000 UTC m=+0.048200388 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:53:13 localhost podman[303664]: 2025-11-23 09:53:13.534939898 +0000 UTC m=+0.153770005 container init 89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_napier, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.buildah.version=1.33.12, distribution-scope=public, name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:53:13 localhost podman[303664]: 2025-11-23 09:53:13.54085376 +0000 UTC m=+0.159683837 container start 89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_napier, 
build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, release=553, RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:53:13 localhost podman[303664]: 2025-11-23 09:53:13.541147409 +0000 UTC m=+0.159977556 container attach 89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_napier, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, version=7, release=553, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:53:13 localhost heuristic_napier[303679]: 167 167 Nov 23 04:53:13 localhost systemd[1]: libpod-89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05.scope: Deactivated successfully. Nov 23 04:53:13 localhost podman[303664]: 2025-11-23 09:53:13.543815522 +0000 UTC m=+0.162645639 container died 89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_napier, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=Guillaume Abrioux , name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 23 04:53:13 localhost podman[303684]: 2025-11-23 09:53:13.637852392 +0000 UTC m=+0.081366851 container remove 89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_napier, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, version=7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, maintainer=Guillaume Abrioux , io.openshift.expose-services=, build-date=2025-09-24T08:57:55, architecture=x86_64, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 04:53:13 localhost systemd[1]: libpod-conmon-89af5e268791bee141ff4fd7bba3db20d4469b29294a25bbe8220a368bf40f05.scope: Deactivated successfully. Nov 23 04:53:13 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)... Nov 23 04:53:13 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)... 
Nov 23 04:53:13 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005532585.localdomain Nov 23 04:53:13 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005532585.localdomain Nov 23 04:53:13 localhost systemd[1]: var-lib-containers-storage-overlay-5ceeb530356ef132c2f1ea8a59e40c7004053bced38a643a03cd26874370f264-merged.mount: Deactivated successfully. Nov 23 04:53:14 localhost podman[303761]: Nov 23 04:53:14 localhost podman[303761]: 2025-11-23 09:53:14.46304007 +0000 UTC m=+0.077620366 container create 6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_villani, description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, com.redhat.component=rhceph-container, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 23 04:53:14 localhost ceph-mon[300199]: Reconfiguring osd.0 (monmap changed)... 
Nov 23 04:53:14 localhost ceph-mon[300199]: Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:53:14 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:14 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:14 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 23 04:53:14 localhost ceph-mon[300199]: Reconfiguring osd.3 (monmap changed)... Nov 23 04:53:14 localhost ceph-mon[300199]: Reconfiguring daemon osd.3 on np0005532585.localdomain Nov 23 04:53:14 localhost systemd[1]: Started libpod-conmon-6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947.scope. Nov 23 04:53:14 localhost systemd[1]: Started libcrun container. Nov 23 04:53:14 localhost podman[303761]: 2025-11-23 09:53:14.428309149 +0000 UTC m=+0.042889465 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:53:14 localhost podman[303761]: 2025-11-23 09:53:14.532222265 +0000 UTC m=+0.146802581 container init 6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_villani, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, distribution-scope=public, io.openshift.expose-services=, ceph=True, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , 
com.redhat.component=rhceph-container, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, description=Red Hat Ceph Storage 7, architecture=x86_64) Nov 23 04:53:14 localhost podman[303761]: 2025-11-23 09:53:14.542047318 +0000 UTC m=+0.156627614 container start 6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_villani, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, vcs-type=git, distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.openshift.expose-services=, version=7, architecture=x86_64, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 23 04:53:14 localhost podman[303761]: 2025-11-23 09:53:14.542388598 +0000 UTC m=+0.156968914 container attach 6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_villani, version=7, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph 
Storage 7 on RHEL 9, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux , vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, release=553, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:53:14 localhost inspiring_villani[303777]: 167 167 Nov 23 04:53:14 localhost systemd[1]: libpod-6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947.scope: Deactivated successfully. 
Nov 23 04:53:14 localhost podman[303761]: 2025-11-23 09:53:14.546498835 +0000 UTC m=+0.161079161 container died 6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_villani, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.buildah.version=1.33.12, vcs-type=git, name=rhceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=553, com.redhat.component=rhceph-container) Nov 23 04:53:14 localhost podman[303782]: 2025-11-23 09:53:14.637954536 +0000 UTC m=+0.082962590 container remove 6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_villani, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, name=rhceph, description=Red Hat Ceph Storage 7, release=553, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., 
build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, maintainer=Guillaume Abrioux , distribution-scope=public, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.k8s.description=Red Hat Ceph Storage 7) Nov 23 04:53:14 localhost systemd[1]: libpod-conmon-6cf1cc422aaa1f50398c98e7261e2676cfc100f00e2f46197adab3524c32b947.scope: Deactivated successfully. Nov 23 04:53:14 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:14 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)... Nov 23 04:53:14 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)... Nov 23 04:53:14 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain Nov 23 04:53:14 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain Nov 23 04:53:14 localhost systemd[1]: var-lib-containers-storage-overlay-7f6c70242f52446dac1dcd3eb1e2c564747320354fb2a8253c3ac502d1630808-merged.mount: Deactivated successfully. 
Nov 23 04:53:15 localhost nova_compute[281952]: 2025-11-23 09:53:15.394 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:53:15 localhost podman[303858]: Nov 23 04:53:15 localhost podman[303858]: 2025-11-23 09:53:15.479555131 +0000 UTC m=+0.068443453 container create 43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_volhard, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, ceph=True, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:53:15 localhost systemd[1]: Started libpod-conmon-43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe.scope. Nov 23 04:53:15 localhost systemd[1]: Started libcrun container. 
Nov 23 04:53:15 localhost podman[303858]: 2025-11-23 09:53:15.546562358 +0000 UTC m=+0.135450690 container init 43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_volhard, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, name=rhceph, vendor=Red Hat, Inc., release=553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, GIT_BRANCH=main, com.redhat.component=rhceph-container, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64) Nov 23 04:53:15 localhost podman[303858]: 2025-11-23 09:53:15.455579871 +0000 UTC m=+0.044468173 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:53:15 localhost podman[303858]: 2025-11-23 09:53:15.557736193 +0000 UTC m=+0.146624525 container start 43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_volhard, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, release=553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., GIT_BRANCH=main, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True) Nov 23 04:53:15 localhost podman[303858]: 2025-11-23 09:53:15.55796845 +0000 UTC m=+0.146856772 container attach 43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_volhard, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, RELEASE=main, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, release=553, name=rhceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, ceph=True, vcs-type=git, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:53:15 localhost 
jovial_volhard[303874]: 167 167 Nov 23 04:53:15 localhost systemd[1]: libpod-43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe.scope: Deactivated successfully. Nov 23 04:53:15 localhost podman[303858]: 2025-11-23 09:53:15.560634442 +0000 UTC m=+0.149522764 container died 43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_volhard, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vcs-type=git, RELEASE=main, architecture=x86_64, distribution-scope=public, GIT_BRANCH=main, io.buildah.version=1.33.12, name=rhceph) Nov 23 04:53:15 localhost podman[303879]: 2025-11-23 09:53:15.65101952 +0000 UTC m=+0.079301207 container remove 43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_volhard, release=553, ceph=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , 
build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:53:15 localhost systemd[1]: libpod-conmon-43daed45e364990c228199f819e4ba0e98228bc9e596fe6e7807be230ee9e1fe.scope: Deactivated successfully. Nov 23 04:53:15 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532585.gzafiw (monmap changed)... Nov 23 04:53:15 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532585.gzafiw (monmap changed)... 
Nov 23 04:53:15 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain Nov 23 04:53:15 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain Nov 23 04:53:15 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:15 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:15 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:53:15 localhost ceph-mon[300199]: Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)... Nov 23 04:53:15 localhost ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain Nov 23 04:53:15 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:15 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:15 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:53:15 localhost systemd[1]: tmp-crun.qnQnne.mount: Deactivated successfully. Nov 23 04:53:15 localhost systemd[1]: var-lib-containers-storage-overlay-413c556e0e98758aae564a1a7fa9c1190a15dd9491b65f66503e920a26dd779a-merged.mount: Deactivated successfully. 
Nov 23 04:53:16 localhost podman[303947]: Nov 23 04:53:16 localhost podman[303947]: 2025-11-23 09:53:16.356774914 +0000 UTC m=+0.078932827 container create 7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_blackwell, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vendor=Red Hat, Inc.) Nov 23 04:53:16 localhost systemd[1]: Started libpod-conmon-7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94.scope. Nov 23 04:53:16 localhost systemd[1]: Started libcrun container. 
Nov 23 04:53:16 localhost podman[303947]: 2025-11-23 09:53:16.418525559 +0000 UTC m=+0.140683472 container init 7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_blackwell, RELEASE=main, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , ceph=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55) Nov 23 04:53:16 localhost podman[303947]: 2025-11-23 09:53:16.323369813 +0000 UTC m=+0.045527786 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:53:16 localhost podman[303947]: 2025-11-23 09:53:16.428328331 +0000 UTC m=+0.150486244 container start 7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_blackwell, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, release=553, CEPH_POINT_RELEASE=, ceph=True) Nov 23 04:53:16 localhost podman[303947]: 2025-11-23 09:53:16.428554188 +0000 UTC m=+0.150712131 container attach 7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_blackwell, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, maintainer=Guillaume Abrioux , version=7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, release=553, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_CLEAN=True, io.buildah.version=1.33.12, 
com.redhat.component=rhceph-container) Nov 23 04:53:16 localhost goofy_blackwell[303963]: 167 167 Nov 23 04:53:16 localhost systemd[1]: libpod-7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94.scope: Deactivated successfully. Nov 23 04:53:16 localhost podman[303947]: 2025-11-23 09:53:16.431132588 +0000 UTC m=+0.153290531 container died 7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_blackwell, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, description=Red Hat Ceph Storage 7, release=553, ceph=True, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container) Nov 23 04:53:16 localhost podman[303968]: 2025-11-23 09:53:16.520416072 +0000 UTC m=+0.076532002 container remove 7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_blackwell, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, 
GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, ceph=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 04:53:16 localhost systemd[1]: libpod-conmon-7d6c945144499f81f36cc25939fb9532c0c100d7ebc2b681b6821488256f9a94.scope: Deactivated successfully. Nov 23 04:53:16 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005532585 (monmap changed)... Nov 23 04:53:16 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005532585 (monmap changed)... Nov 23 04:53:16 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005532585 on np0005532585.localdomain Nov 23 04:53:16 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005532585 on np0005532585.localdomain Nov 23 04:53:16 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:16 localhost ceph-mon[300199]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)... 
Nov 23 04:53:16 localhost ceph-mon[300199]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain Nov 23 04:53:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:16 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:53:16 localhost systemd[1]: var-lib-containers-storage-overlay-8840d7c18e69bc4b3f17c85ae92b3622af07e263943383135ddeed63f20ffc20-merged.mount: Deactivated successfully. Nov 23 04:53:17 localhost podman[304037]: Nov 23 04:53:17 localhost podman[304037]: 2025-11-23 09:53:17.373262153 +0000 UTC m=+0.074475629 container create 879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_golick, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, io.buildah.version=1.33.12, distribution-scope=public, release=553, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, architecture=x86_64, 
io.k8s.description=Red Hat Ceph Storage 7) Nov 23 04:53:17 localhost systemd[1]: Started libpod-conmon-879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12.scope. Nov 23 04:53:17 localhost systemd[1]: Started libcrun container. Nov 23 04:53:17 localhost podman[304037]: 2025-11-23 09:53:17.342569626 +0000 UTC m=+0.043783112 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:53:17 localhost podman[304037]: 2025-11-23 09:53:17.446476952 +0000 UTC m=+0.147690428 container init 879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_golick, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , version=7, name=rhceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main) Nov 23 04:53:17 localhost podman[304037]: 2025-11-23 09:53:17.454623703 +0000 UTC m=+0.155837169 container start 879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_golick, io.k8s.description=Red Hat Ceph Storage 7, 
com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , architecture=x86_64, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, release=553, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, name=rhceph) Nov 23 04:53:17 localhost podman[304037]: 2025-11-23 09:53:17.454916042 +0000 UTC m=+0.156129548 container attach 879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_golick, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, release=553, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.buildah.version=1.33.12, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, 
architecture=x86_64, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 23 04:53:17 localhost practical_golick[304052]: 167 167 Nov 23 04:53:17 localhost systemd[1]: libpod-879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12.scope: Deactivated successfully. Nov 23 04:53:17 localhost podman[304037]: 2025-11-23 09:53:17.456549503 +0000 UTC m=+0.157762999 container died 879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_golick, architecture=x86_64, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=) Nov 23 04:53:17 localhost podman[304057]: 2025-11-23 09:53:17.551083589 +0000 UTC m=+0.081292989 container remove 879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_golick, 
com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, release=553, RELEASE=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, version=7, distribution-scope=public, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:53:17 localhost systemd[1]: libpod-conmon-879194f7e43749a7e2744ccf67bfb085afa50d026ff89b6849c286453aa35f12.scope: Deactivated successfully. Nov 23 04:53:17 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532586 (monmap changed)... Nov 23 04:53:17 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532586 (monmap changed)... Nov 23 04:53:17 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain Nov 23 04:53:17 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain Nov 23 04:53:17 localhost ceph-mon[300199]: mon.np0005532585@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:53:17 localhost ceph-mon[300199]: Reconfiguring mon.np0005532585 (monmap changed)... 
Nov 23 04:53:17 localhost ceph-mon[300199]: Reconfiguring daemon mon.np0005532585 on np0005532585.localdomain Nov 23 04:53:17 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:17 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:17 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:53:17 localhost systemd[1]: var-lib-containers-storage-overlay-8de9a3bc21e15e1fba062036739d1c9a95aa30d6ec253814481023ba07294928-merged.mount: Deactivated successfully. Nov 23 04:53:18 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)... Nov 23 04:53:18 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)... Nov 23 04:53:18 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005532586.localdomain Nov 23 04:53:18 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005532586.localdomain Nov 23 04:53:18 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:18 localhost ceph-mon[300199]: Reconfiguring crash.np0005532586 (monmap changed)... 
Nov 23 04:53:18 localhost ceph-mon[300199]: Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain
Nov 23 04:53:18 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:18 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:18 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 04:53:19 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Nov 23 04:53:19 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Nov 23 04:53:19 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 04:53:19 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 04:53:19 localhost ceph-mon[300199]: Reconfiguring osd.1 (monmap changed)...
Nov 23 04:53:19 localhost ceph-mon[300199]: Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 04:53:19 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:19 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:19 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 23 04:53:20 localhost nova_compute[281952]: 2025-11-23 09:53:20.396 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:53:20 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)...
Nov 23 04:53:20 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)...
Nov 23 04:53:20 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain
Nov 23 04:53:20 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain
Nov 23 04:53:20 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:53:20 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54146 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 23 04:53:20 localhost ceph-mgr[288287]: [cephadm INFO root] Saving service mon spec with placement label:mon
Nov 23 04:53:20 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Nov 23 04:53:21 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 04:53:21 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 04:53:21 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 04:53:21 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 04:53:21 localhost ceph-mon[300199]: Reconfiguring osd.4 (monmap changed)...
Nov 23 04:53:21 localhost ceph-mon[300199]: Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 04:53:21 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:21 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:21 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 04:53:21 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:21 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:21 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:21 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 04:53:22 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54149 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005532586", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 23 04:53:22 localhost ceph-mon[300199]: Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)...
Nov 23 04:53:22 localhost ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain
Nov 23 04:53:22 localhost ceph-mon[300199]: Saving service mon spec with placement label:mon
Nov 23 04:53:22 localhost ceph-mon[300199]: Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 04:53:22 localhost ceph-mon[300199]: Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 04:53:22 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:22 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:22 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:53:22 localhost ceph-mon[300199]: mon.np0005532585@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 04:53:23 localhost ceph-mgr[288287]: [progress INFO root] update: starting ev e59b1355-8ccd-4c28-832b-9038b911cfdf (Updating node-proxy deployment (+3 -> 3))
Nov 23 04:53:23 localhost ceph-mgr[288287]: [progress INFO root] complete: finished ev e59b1355-8ccd-4c28-832b-9038b911cfdf (Updating node-proxy deployment (+3 -> 3))
Nov 23 04:53:23 localhost ceph-mgr[288287]: [progress INFO root] Completed event e59b1355-8ccd-4c28-832b-9038b911cfdf (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 23 04:53:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 04:53:23 localhost podman[304158]: 2025-11-23 09:53:23.447051493 +0000 UTC m=+0.093128813 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 23 04:53:23 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 04:53:23 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:23 localhost podman[304158]: 2025-11-23 09:53:23.486442759 +0000 UTC m=+0.132520039 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 04:53:23 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 04:53:23 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54155 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005532586"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 23 04:53:23 localhost ceph-mgr[288287]: [cephadm INFO root] Remove daemons mon.np0005532586
Nov 23 04:53:23 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005532586
Nov 23 04:53:23 localhost ceph-mgr[288287]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005532586: new quorum should be ['np0005532584', 'np0005532585'] (from ['np0005532584', 'np0005532585'])
Nov 23 04:53:23 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005532586: new quorum should be ['np0005532584', 'np0005532585'] (from ['np0005532584', 'np0005532585'])
Nov 23 04:53:23 localhost ceph-mgr[288287]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005532586 from monmap...
Nov 23 04:53:23 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing monitor np0005532586 from monmap...
Nov 23 04:53:23 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005532586 from np0005532586.localdomain -- ports []
Nov 23 04:53:23 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005532586 from np0005532586.localdomain -- ports []
Nov 23 04:53:23 localhost ceph-mon[300199]: mon.np0005532585@2(peon) e16 my rank is now 1 (was 2)
Nov 23 04:53:23 localhost ceph-mgr[288287]: client.27018 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 23 04:53:23 localhost ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 23 04:53:23 localhost ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 23 04:53:23 localhost ceph-mgr[288287]: client.27015 ms_handle_reset on v2:172.18.0.104:3300/0
Nov 23 04:53:23 localhost ceph-mon[300199]: mon.np0005532585@1(probing) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532584"} v 0)
Nov 23 04:53:23 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch
Nov 23 04:53:23 localhost ceph-mon[300199]: mon.np0005532585@1(probing) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532585"} v 0)
Nov 23 04:53:23 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch
Nov 23 04:53:23 localhost ceph-mon[300199]: mon.np0005532585@1(probing) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 04:53:23 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 04:53:24 localhost ceph-mgr[288287]: --2- 172.18.0.107:0/1151827140 >> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] conn(0x55eed51f6c00 0x55eed51fcb00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 23 04:53:24 localhost ceph-mgr[288287]: --2- 172.18.0.107:0/1518513680 >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x55eed629b400 0x55eed4fc1700 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Nov 23 04:53:24 localhost ceph-mon[300199]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election
Nov 23 04:53:24 localhost ceph-mon[300199]: paxos.1).electionLogic(60) init, last seen epoch 60
Nov 23 04:53:24 localhost ceph-mon[300199]: mon.np0005532585@1(electing) e16 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:53:24 localhost ceph-mon[300199]: mon.np0005532585@1(electing) e16 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:53:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:53:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 23 04:53:24 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 04:53:24 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:24 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:24 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:24 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:24 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:24 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:24 localhost ceph-mon[300199]: Remove daemons mon.np0005532586
Nov 23 04:53:24 localhost ceph-mon[300199]: Safe to remove mon.np0005532586: new quorum should be ['np0005532584', 'np0005532585'] (from ['np0005532584', 'np0005532585'])
Nov 23 04:53:24 localhost ceph-mon[300199]: Removing monitor np0005532586 from monmap...
Nov 23 04:53:24 localhost ceph-mon[300199]: Removing daemon mon.np0005532586 from np0005532586.localdomain -- ports []
Nov 23 04:53:24 localhost ceph-mon[300199]: mon.np0005532584 calling monitor election
Nov 23 04:53:24 localhost ceph-mon[300199]: mon.np0005532585 calling monitor election
Nov 23 04:53:24 localhost ceph-mon[300199]: mon.np0005532584 is new leader, mons np0005532584,np0005532585 in quorum (ranks 0,1)
Nov 23 04:53:24 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 04:53:24 localhost ceph-mon[300199]: overall HEALTH_OK
Nov 23 04:53:24 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:24 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:24 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:53:24 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:24 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:24 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:24 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:25 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:25 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:25 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 04:53:25 localhost nova_compute[281952]: 2025-11-23 09:53:25.399 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:53:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 04:53:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 04:53:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 04:53:25 localhost ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events
Nov 23 04:53:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 23 04:53:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 04:53:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 04:53:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 23 04:53:25 localhost ceph-mgr[288287]: [progress INFO root] update: starting ev a56fc888-20b8-44aa-a4ef-7fb77787df56 (Updating node-proxy deployment (+3 -> 3))
Nov 23 04:53:25 localhost ceph-mgr[288287]: [progress INFO root] complete: finished ev a56fc888-20b8-44aa-a4ef-7fb77787df56 (Updating node-proxy deployment (+3 -> 3))
Nov 23 04:53:25 localhost ceph-mgr[288287]: [progress INFO root] Completed event a56fc888-20b8-44aa-a4ef-7fb77787df56 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 23 04:53:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 23 04:53:25 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 04:53:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 04:53:26 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 04:53:26 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 04:53:26 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 23 04:53:26 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 04:53:26 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 04:53:26 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 04:53:26 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 04:53:26 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 04:53:26 localhost podman[304515]: 2025-11-23 09:53:26.08452531 +0000 UTC m=+0.087841390 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 23 04:53:26 localhost podman[304515]: 2025-11-23 09:53:26.100334258 +0000 UTC m=+0.103650338 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 23 04:53:26 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 04:53:26 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:26 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:26 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:26 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:26 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:26 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:26 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:26 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:26 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:26 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:26 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:26 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 04:53:26 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 04:53:26 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:53:26 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 04:53:26 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 04:53:26 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Nov 23 04:53:26 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Nov 23 04:53:26 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Nov 23 04:53:26 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 04:53:26 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 04:53:26 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 04:53:26 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 04:53:26 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 04:53:27 localhost ceph-mon[300199]: Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 04:53:27 localhost ceph-mon[300199]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 04:53:27 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:27 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:27 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 04:53:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 04:53:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 04:53:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 04:53:28 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Nov 23 04:53:28 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Nov 23 04:53:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Nov 23 04:53:28 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 04:53:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 04:53:28 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 04:53:28 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 04:53:28 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 04:53:28 localhost ceph-mon[300199]: Reconfiguring osd.2 (monmap changed)...
Nov 23 04:53:28 localhost ceph-mon[300199]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 04:53:28 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:28 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:28 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 04:53:28 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:53:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 04:53:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 04:53:28 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 04:53:28 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 04:53:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 23 04:53:28 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 04:53:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 04:53:28 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 04:53:28 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 04:53:28 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 04:53:29 localhost ceph-mon[300199]: Reconfiguring osd.5 (monmap changed)...
Nov 23 04:53:29 localhost ceph-mon[300199]: Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 04:53:29 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:29 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:29 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 04:53:29 localhost ceph-mon[300199]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)...
Nov 23 04:53:29 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 04:53:29 localhost ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain
Nov 23 04:53:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 04:53:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 04:53:29 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 04:53:29 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 04:53:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 23 04:53:29 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 04:53:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 23 04:53:29 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 04:53:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 04:53:29 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 04:53:29 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 04:53:29 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 04:53:29 localhost nova_compute[281952]: 2025-11-23 09:53:29.934 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:53:29 localhost nova_compute[281952]: 2025-11-23 09:53:29.965 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Triggering sync for uuid 355032bc-9946-4f6d-817c-2bfc8694d41d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 23 04:53:29 localhost nova_compute[281952]: 2025-11-23 09:53:29.967 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 04:53:29 localhost nova_compute[281952]: 2025-11-23 09:53:29.967 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 04:53:29 localhost openstack_network_exporter[242668]: ERROR 09:53:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 04:53:29 localhost openstack_network_exporter[242668]: ERROR 09:53:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:53:29 localhost openstack_network_exporter[242668]: ERROR 09:53:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:53:29 localhost openstack_network_exporter[242668]: ERROR 09:53:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 04:53:29 localhost openstack_network_exporter[242668]:
Nov 23 04:53:29 localhost openstack_network_exporter[242668]: ERROR 09:53:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 04:53:29 localhost openstack_network_exporter[242668]:
Nov 23 04:53:30 localhost nova_compute[281952]: 2025-11-23 09:53:30.012 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 04:53:30 localhost nova_compute[281952]: 2025-11-23 09:53:30.401 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:53:30 localhost nova_compute[281952]: 2025-11-23 09:53:30.404 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:53:30 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 04:53:30 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 04:53:30 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 04:53:30 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 04:53:30 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 23 04:53:30 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 04:53:30 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 04:53:30 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 04:53:30 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 04:53:30 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 04:53:30 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:53:30 localhost ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events
Nov 23 04:53:30 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 23 04:53:30 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:30 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:30 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 04:53:30 localhost ceph-mon[300199]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)...
Nov 23 04:53:30 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 04:53:30 localhost ceph-mon[300199]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain
Nov 23 04:53:30 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:30 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:30 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 04:53:30 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 04:53:31 localhost podman[304589]:
Nov 23 04:53:31 localhost podman[304589]: 2025-11-23 09:53:31.214935727 +0000 UTC m=+0.074529780 container create b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_euclid, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , name=rhceph, release=553, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main)
Nov 23 04:53:31 localhost systemd[1]: Started libpod-conmon-b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198.scope.
Nov 23 04:53:31 localhost systemd[1]: Started libcrun container.
Nov 23 04:53:31 localhost podman[304589]: 2025-11-23 09:53:31.183082194 +0000 UTC m=+0.042676317 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 04:53:31 localhost podman[304589]: 2025-11-23 09:53:31.285603037 +0000 UTC m=+0.145197100 container init b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_euclid, architecture=x86_64, version=7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, release=553, name=rhceph, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 04:53:31 localhost podman[304589]: 2025-11-23 09:53:31.293877402 +0000 UTC m=+0.153471475 container start b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_euclid, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, GIT_BRANCH=main, RELEASE=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, ceph=True, release=553, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git)
Nov 23 04:53:31 localhost podman[304589]: 2025-11-23 09:53:31.294229533 +0000 UTC m=+0.153823596 container attach b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_euclid, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, GIT_CLEAN=True, version=7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-type=git, RELEASE=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux )
Nov 23 04:53:31 localhost pensive_euclid[304604]: 167 167
Nov 23 04:53:31 localhost systemd[1]: libpod-b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198.scope: Deactivated successfully.
Nov 23 04:53:31 localhost podman[304589]: 2025-11-23 09:53:31.296159663 +0000 UTC m=+0.155753716 container died b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_euclid, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, ceph=True, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , name=rhceph, RELEASE=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 04:53:31 localhost podman[304609]: 2025-11-23 09:53:31.380226277 +0000 UTC m=+0.071827278 container remove b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_euclid, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, RELEASE=main, architecture=x86_64, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat Ceph Storage 7, release=553, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.33.12, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 23 04:53:31 localhost systemd[1]: libpod-conmon-b6a197b98120aa0e0f6602d91fe9d99fe16e733c1d7889f76d50fc5bf5d8f198.scope: Deactivated successfully.
Nov 23 04:53:31 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 04:53:31 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 04:53:31 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Nov 23 04:53:31 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Nov 23 04:53:31 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Nov 23 04:53:31 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 23 04:53:31 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 04:53:31 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 04:53:31 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 04:53:31 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005532585.localdomain
Nov 23 04:53:31 localhost ceph-mon[300199]: Reconfiguring crash.np0005532585 (monmap changed)...
Nov 23 04:53:31 localhost ceph-mon[300199]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain
Nov 23 04:53:31 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:31 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:31 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:31 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 23 04:53:32 localhost podman[304680]:
Nov 23 04:53:32 localhost podman[304680]: 2025-11-23 09:53:32.078412056 +0000 UTC m=+0.081747963 container create 9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_noyce, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, RELEASE=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, release=553)
Nov 23 04:53:32 localhost systemd[1]: Started libpod-conmon-9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b.scope.
Nov 23 04:53:32 localhost systemd[1]: Started libcrun container.
Nov 23 04:53:32 localhost podman[304680]: 2025-11-23 09:53:32.140057768 +0000 UTC m=+0.143393675 container init 9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_noyce, distribution-scope=public, build-date=2025-09-24T08:57:55, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, RELEASE=main)
Nov 23 04:53:32 localhost podman[304680]: 2025-11-23 09:53:32.042963352 +0000 UTC m=+0.046299259 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 04:53:32 localhost podman[304680]: 2025-11-23 09:53:32.150452829 +0000 UTC m=+0.153788736 container start 9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_noyce, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, distribution-scope=public, name=rhceph, vcs-type=git, version=7, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main)
Nov 23 04:53:32 localhost podman[304680]: 2025-11-23 09:53:32.15084068 +0000 UTC m=+0.154176587 container attach 9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_noyce, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhceph, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, release=553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, RELEASE=main, ceph=True)
Nov 23 04:53:32 localhost competent_noyce[304695]: 167 167
Nov 23 04:53:32 localhost systemd[1]: libpod-9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b.scope: Deactivated successfully.
Nov 23 04:53:32 localhost podman[304680]: 2025-11-23 09:53:32.153611326 +0000 UTC m=+0.156947263 container died 9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_noyce, description=Red Hat Ceph Storage 7, release=553, ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, distribution-scope=public, vendor=Red Hat, Inc., GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 23 04:53:32 localhost systemd[1]: var-lib-containers-storage-overlay-2bac5ef8c68c2242ae1212afde9f25aede8b2dfc4d4bf412b25b4a9e1b82837f-merged.mount: Deactivated successfully.
Nov 23 04:53:32 localhost systemd[1]: var-lib-containers-storage-overlay-552b7356ded64562fa112310d8ffa65bce3d7c0257ead807842473869640e976-merged.mount: Deactivated successfully.
Nov 23 04:53:32 localhost podman[304700]: 2025-11-23 09:53:32.250330049 +0000 UTC m=+0.087190380 container remove 9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_noyce, distribution-scope=public, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, release=553, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 23 04:53:32 localhost systemd[1]: libpod-conmon-9f05c5492424b575521cc156c515c9dd6a95f0cf4847cf0244a6c83b9ab9bf3b.scope: Deactivated successfully.
Nov 23 04:53:32 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 04:53:32 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 04:53:32 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Nov 23 04:53:32 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Nov 23 04:53:32 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Nov 23 04:53:32 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 23 04:53:32 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 04:53:32 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 04:53:32 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 04:53:32 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005532585.localdomain
Nov 23 04:53:32 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Nov 23 04:53:32 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 04:53:33 localhost podman[304777]:
Nov 23 04:53:33 localhost podman[304777]: 2025-11-23 09:53:33.048201304 +0000 UTC m=+0.080004988 container create 30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, vcs-type=git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, RELEASE=main, maintainer=Guillaume Abrioux , release=553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7)
Nov 23 04:53:33 localhost systemd[1]: Started libpod-conmon-30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660.scope.
Nov 23 04:53:33 localhost systemd[1]: Started libcrun container.
Nov 23 04:53:33 localhost podman[304777]: 2025-11-23 09:53:33.114293094 +0000 UTC m=+0.146096778 container init 30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, release=553, architecture=x86_64, GIT_CLEAN=True, version=7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc.)
Nov 23 04:53:33 localhost podman[304777]: 2025-11-23 09:53:33.017372613 +0000 UTC m=+0.049176337 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 04:53:33 localhost podman[304777]: 2025-11-23 09:53:33.123500018 +0000 UTC m=+0.155303702 container start 30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git)
Nov 23 04:53:33 localhost podman[304777]: 2025-11-23 09:53:33.123801677 +0000 UTC m=+0.155605361 container attach 30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7
on RHEL 9, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, version=7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., release=553, name=rhceph, distribution-scope=public, vcs-type=git, ceph=True, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.expose-services=) Nov 23 04:53:33 localhost stoic_villani[304793]: 167 167 Nov 23 04:53:33 localhost systemd[1]: libpod-30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660.scope: Deactivated successfully. 
Nov 23 04:53:33 localhost podman[304777]: 2025-11-23 09:53:33.126247692 +0000 UTC m=+0.158051406 container died 30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, distribution-scope=public, build-date=2025-09-24T08:57:55, ceph=True, version=7, io.openshift.expose-services=, GIT_CLEAN=True, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main) Nov 23 04:53:33 localhost systemd[1]: var-lib-containers-storage-overlay-ae27c3a1e2a4b872da410046d936a4c60bed3350896f3d7ac5e09e7da49b3cb5-merged.mount: Deactivated successfully. 
Nov 23 04:53:33 localhost podman[304798]: 2025-11-23 09:53:33.238009 +0000 UTC m=+0.099953505 container remove 30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.component=rhceph-container, release=553, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, name=rhceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64) Nov 23 04:53:33 localhost systemd[1]: libpod-conmon-30e03bbdc8763ba821134734ec0ae60c10a435ee699fb41b18f3bd0e78168660.scope: Deactivated successfully. Nov 23 04:53:33 localhost ceph-mon[300199]: Reconfiguring osd.0 (monmap changed)... Nov 23 04:53:33 localhost ceph-mon[300199]: Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:53:33 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:33 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:33 localhost ceph-mon[300199]: Reconfiguring osd.3 (monmap changed)... 
Nov 23 04:53:33 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 23 04:53:33 localhost ceph-mon[300199]: Reconfiguring daemon osd.3 on np0005532585.localdomain Nov 23 04:53:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0) Nov 23 04:53:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0) Nov 23 04:53:33 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)... Nov 23 04:53:33 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)... Nov 23 04:53:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Nov 23 04:53:33 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:53:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:33 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:33 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532585.jcltnl on 
np0005532585.localdomain Nov 23 04:53:33 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain Nov 23 04:53:34 localhost podman[304874]: Nov 23 04:53:34 localhost podman[304874]: 2025-11-23 09:53:34.067777969 +0000 UTC m=+0.078562574 container create cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pascal, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, maintainer=Guillaume Abrioux , RELEASE=main, distribution-scope=public, GIT_BRANCH=main, io.openshift.expose-services=, ceph=True, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, vendor=Red Hat, Inc., release=553, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:53:34 localhost systemd[1]: Started libpod-conmon-cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305.scope. Nov 23 04:53:34 localhost systemd[1]: Started libcrun container. 
Nov 23 04:53:34 localhost podman[304874]: 2025-11-23 09:53:34.12905646 +0000 UTC m=+0.139841045 container init cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pascal, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, distribution-scope=public, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, release=553, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7) Nov 23 04:53:34 localhost podman[304874]: 2025-11-23 09:53:34.037452083 +0000 UTC m=+0.048236718 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:53:34 localhost podman[304874]: 2025-11-23 09:53:34.139733559 +0000 UTC m=+0.150518144 container start cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pascal, GIT_CLEAN=True, io.buildah.version=1.33.12, name=rhceph, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, version=7, maintainer=Guillaume Abrioux , ceph=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, release=553, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:53:34 localhost podman[304874]: 2025-11-23 09:53:34.139912565 +0000 UTC m=+0.150697150 container attach cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pascal, ceph=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=553, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
vcs-type=git) Nov 23 04:53:34 localhost naughty_pascal[304889]: 167 167 Nov 23 04:53:34 localhost systemd[1]: libpod-cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305.scope: Deactivated successfully. Nov 23 04:53:34 localhost podman[304874]: 2025-11-23 09:53:34.144305131 +0000 UTC m=+0.155089716 container died cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pascal, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, name=rhceph, RELEASE=main, GIT_CLEAN=True) Nov 23 04:53:34 localhost systemd[1]: tmp-crun.8BhCJP.mount: Deactivated successfully. Nov 23 04:53:34 localhost systemd[1]: var-lib-containers-storage-overlay-c37fb1671fd0adf14e59310f87a9848b621d966f12b8c33e0aa6ce15e89f110b-merged.mount: Deactivated successfully. 
Nov 23 04:53:34 localhost podman[304894]: 2025-11-23 09:53:34.240497658 +0000 UTC m=+0.082255258 container remove cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pascal, release=553, name=rhceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, version=7, GIT_CLEAN=True, io.openshift.expose-services=) Nov 23 04:53:34 localhost systemd[1]: libpod-conmon-cee8de3fd65a8f518416cf3d08be4e4331f07a915cd38ed359be3163522d7305.scope: Deactivated successfully. Nov 23 04:53:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0) Nov 23 04:53:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0) Nov 23 04:53:34 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532585.gzafiw (monmap changed)... 
Nov 23 04:53:34 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532585.gzafiw (monmap changed)... Nov 23 04:53:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Nov 23 04:53:34 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:53:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "mgr services"} v 0) Nov 23 04:53:34 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch Nov 23 04:53:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:34 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:34 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain Nov 23 04:53:34 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain Nov 23 04:53:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:53:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 04:53:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:53:34 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:34 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:34 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:53:34 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:53:34 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:34 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:34 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:53:34 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:53:34 localhost podman[304929]: 2025-11-23 09:53:34.529634508 +0000 UTC m=+0.082215737 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true) Nov 23 04:53:34 localhost podman[304929]: 2025-11-23 09:53:34.563284536 +0000 UTC m=+0.115865795 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 04:53:34 localhost podman[304927]: 2025-11-23 09:53:34.592842688 +0000 UTC m=+0.143666174 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:53:34 localhost podman[304927]: 2025-11-23 09:53:34.654415708 +0000 UTC m=+0.205239184 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_controller) Nov 23 04:53:34 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:53:34 localhost podman[304930]: 2025-11-23 09:53:34.668663857 +0000 UTC m=+0.218396998 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal) Nov 23 04:53:34 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:53:34 localhost podman[304930]: 2025-11-23 09:53:34.709172007 +0000 UTC m=+0.258905148 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, version=9.6, config_id=edpm, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 23 04:53:34 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:53:34 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:34 localhost podman[305027]: Nov 23 04:53:34 localhost podman[305027]: 2025-11-23 09:53:34.961553543 +0000 UTC m=+0.082824676 container create 732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_euler, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, version=7, RELEASE=main, io.buildah.version=1.33.12, io.openshift.expose-services=, architecture=x86_64, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 04:53:35 localhost systemd[1]: Started libpod-conmon-732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057.scope. Nov 23 04:53:35 localhost systemd[1]: Started libcrun container. 
Nov 23 04:53:35 localhost podman[305027]: 2025-11-23 09:53:34.925306624 +0000 UTC m=+0.046577787 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:53:35 localhost podman[305027]: 2025-11-23 09:53:35.025836866 +0000 UTC m=+0.147107989 container init 732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_euler, version=7, maintainer=Guillaume Abrioux , RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-type=git, ceph=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:53:35 localhost podman[305027]: 2025-11-23 09:53:35.036228977 +0000 UTC m=+0.157500110 container start 732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_euler, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, 
vcs-type=git, release=553, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, ceph=True, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, distribution-scope=public) Nov 23 04:53:35 localhost podman[305027]: 2025-11-23 09:53:35.036533116 +0000 UTC m=+0.157804279 container attach 732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_euler, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vendor=Red Hat, Inc., release=553, ceph=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux , RELEASE=main, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 23 04:53:35 localhost 
gallant_euler[305042]: 167 167 Nov 23 04:53:35 localhost systemd[1]: libpod-732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057.scope: Deactivated successfully. Nov 23 04:53:35 localhost podman[305027]: 2025-11-23 09:53:35.039858969 +0000 UTC m=+0.161130132 container died 732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_euler, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, io.buildah.version=1.33.12, release=553, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , architecture=x86_64, RELEASE=main) Nov 23 04:53:35 localhost podman[305047]: 2025-11-23 09:53:35.137705107 +0000 UTC m=+0.085605791 container remove 732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_euler, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red 
Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, release=553, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:53:35 localhost systemd[1]: libpod-conmon-732993ff829dab2df6beac49e3d4fcc1436149ba0496fa0b52c3b44f75c76057.scope: Deactivated successfully. Nov 23 04:53:35 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0) Nov 23 04:53:35 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0) Nov 23 04:53:35 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532586 (monmap changed)... Nov 23 04:53:35 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532586 (monmap changed)... 
Nov 23 04:53:35 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Nov 23 04:53:35 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:53:35 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:35 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:35 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain Nov 23 04:53:35 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain Nov 23 04:53:35 localhost nova_compute[281952]: 2025-11-23 09:53:35.403 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:53:35 localhost nova_compute[281952]: 2025-11-23 09:53:35.405 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:53:35 localhost ceph-mon[300199]: Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)... Nov 23 04:53:35 localhost ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain Nov 23 04:53:35 localhost ceph-mon[300199]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)... 
Nov 23 04:53:35 localhost ceph-mon[300199]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain Nov 23 04:53:35 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:35 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:35 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:53:35 localhost ceph-mon[300199]: Reconfiguring crash.np0005532586 (monmap changed)... Nov 23 04:53:35 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:53:35 localhost ceph-mon[300199]: Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain Nov 23 04:53:36 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0) Nov 23 04:53:36 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0) Nov 23 04:53:36 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)... Nov 23 04:53:36 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)... 
Nov 23 04:53:36 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0) Nov 23 04:53:36 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 23 04:53:36 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:36 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:36 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005532586.localdomain Nov 23 04:53:36 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005532586.localdomain Nov 23 04:53:36 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:37 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54157 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005532586.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch Nov 23 04:53:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Nov 23 04:53:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Nov 23 04:53:37 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:53:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 
handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:37 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:37 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005532586 on np0005532586.localdomain Nov 23 04:53:37 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005532586 on np0005532586.localdomain Nov 23 04:53:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0) Nov 23 04:53:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0) Nov 23 04:53:37 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)... Nov 23 04:53:37 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)... 
Nov 23 04:53:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0) Nov 23 04:53:37 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 23 04:53:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:37 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:37 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005532586.localdomain Nov 23 04:53:37 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005532586.localdomain Nov 23 04:53:37 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:37 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:37 localhost ceph-mon[300199]: Reconfiguring osd.1 (monmap changed)... 
Nov 23 04:53:37 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 23 04:53:37 localhost ceph-mon[300199]: Reconfiguring daemon osd.1 on np0005532586.localdomain Nov 23 04:53:37 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:37 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:53:37 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:37 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:37 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 23 04:53:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:53:38 localhost ceph-mon[300199]: Deploying daemon mon.np0005532586 on np0005532586.localdomain Nov 23 04:53:38 localhost ceph-mon[300199]: Reconfiguring osd.4 (monmap changed)... 
Nov 23 04:53:38 localhost ceph-mon[300199]: Reconfiguring daemon osd.4 on np0005532586.localdomain Nov 23 04:53:38 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0) Nov 23 04:53:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0) Nov 23 04:53:40 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:40 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:40 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0) Nov 23 04:53:40 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0) Nov 23 04:53:40 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)... Nov 23 04:53:40 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)... 
Nov 23 04:53:40 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Nov 23 04:53:40 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:53:40 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:40 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:40 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain Nov 23 04:53:40 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain Nov 23 04:53:40 localhost nova_compute[281952]: 2025-11-23 09:53:40.406 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:53:40 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532586 172.18.0.108:0/2160972868; not ready for session (expect reconnect) Nov 23 04:53:40 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0) Nov 23 04:53:40 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch Nov 
23 04:53:40 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532586: (2) No such file or directory Nov 23 04:53:40 localhost ceph-mon[300199]: mon.np0005532585@1(probing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532584"} v 0) Nov 23 04:53:40 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532584"} : dispatch Nov 23 04:53:40 localhost ceph-mon[300199]: mon.np0005532585@1(probing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532585"} v 0) Nov 23 04:53:40 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532585"} : dispatch Nov 23 04:53:40 localhost ceph-mon[300199]: log_channel(cluster) log [INF] : mon.np0005532585 calling monitor election Nov 23 04:53:40 localhost ceph-mon[300199]: paxos.1).electionLogic(62) init, last seen epoch 62 Nov 23 04:53:40 localhost ceph-mon[300199]: mon.np0005532585@1(electing) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 23 04:53:40 localhost ceph-mon[300199]: mon.np0005532585@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0) Nov 23 04:53:40 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch Nov 23 04:53:40 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532586: (22) Invalid argument Nov 23 04:53:40 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:41 localhost ceph-mon[300199]: mon.np0005532585@1(electing) 
e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0) Nov 23 04:53:41 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532586 172.18.0.108:0/2160972868; not ready for session (expect reconnect) Nov 23 04:53:41 localhost ceph-mon[300199]: mon.np0005532585@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0) Nov 23 04:53:41 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch Nov 23 04:53:41 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532586: (22) Invalid argument Nov 23 04:53:41 localhost podman[240668]: time="2025-11-23T09:53:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:53:41 localhost podman[240668]: @ - - [23/Nov/2025:09:53:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 04:53:41 localhost podman[240668]: @ - - [23/Nov/2025:09:53:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18709 "" "Go-http-client/1.1" Nov 23 04:53:42 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532586 172.18.0.108:0/2160972868; not ready for session (expect reconnect) Nov 23 04:53:42 localhost ceph-mon[300199]: mon.np0005532585@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0) Nov 23 04:53:42 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch Nov 23 04:53:42 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for 
mon.np0005532586: (22) Invalid argument Nov 23 04:53:42 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v48: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:42 localhost ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections.. Nov 23 04:53:42 localhost ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: [] Nov 23 04:53:42 localhost ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections.. Nov 23 04:53:42 localhost ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', )] Nov 23 04:53:42 localhost ceph-mgr[288287]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs' Nov 23 04:53:42 localhost ceph-mgr[288287]: [volumes INFO mgr_util] scanning for idle connections.. Nov 23 04:53:42 localhost ceph-mgr[288287]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', )] Nov 23 04:53:42 localhost ceph-mgr[288287]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs' Nov 23 04:53:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:53:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:53:43 localhost podman[305065]: 2025-11-23 09:53:43.069296943 +0000 UTC m=+0.082482935 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:53:43 localhost systemd[1]: tmp-crun.vZp3B3.mount: Deactivated successfully. 
Nov 23 04:53:43 localhost podman[305064]: 2025-11-23 09:53:43.127709255 +0000 UTC m=+0.142588850 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 04:53:43 localhost podman[305065]: 2025-11-23 09:53:43.136943691 +0000 UTC m=+0.150129673 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Nov 23 04:53:43 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 04:53:43 localhost podman[305064]: 2025-11-23 09:53:43.164818791 +0000 UTC m=+0.179698326 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 04:53:43 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 04:53:43 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532586 172.18.0.108:0/2160972868; not ready for session (expect reconnect)
Nov 23 04:53:43 localhost ceph-mon[300199]: mon.np0005532585@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0)
Nov 23 04:53:43 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 04:53:43 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532586: (22) Invalid argument
Nov 23 04:53:44 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532586 172.18.0.108:0/2160972868; not ready for session (expect reconnect)
Nov 23 04:53:44 localhost ceph-mon[300199]: mon.np0005532585@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0)
Nov 23 04:53:44 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 04:53:44 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532586: (22) Invalid argument
Nov 23 04:53:44 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v49: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 04:53:45 localhost nova_compute[281952]: 2025-11-23 09:53:45.407 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:53:45 localhost nova_compute[281952]: 2025-11-23 09:53:45.409 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 04:53:45 localhost nova_compute[281952]: 2025-11-23 09:53:45.409 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 23 04:53:45 localhost nova_compute[281952]: 2025-11-23 09:53:45.409 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:53:45 localhost nova_compute[281952]: 2025-11-23 09:53:45.410 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 04:53:45 localhost nova_compute[281952]: 2025-11-23 09:53:45.411 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:53:45 localhost ceph-mds[287052]: mds.beacon.mds.np0005532585.jcltnl missed beacon ack from the monitors
Nov 23 04:53:45 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532586 172.18.0.108:0/2160972868; not ready for session (expect reconnect)
Nov 23 04:53:45 localhost ceph-mon[300199]: mon.np0005532585@1(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0)
Nov 23 04:53:45 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 04:53:45 localhost ceph-mgr[288287]: mgr finish mon failed to return metadata for mon.np0005532586: (22) Invalid argument
Nov 23 04:53:45 localhost ceph-mon[300199]: paxos.1).electionLogic(63) init, last seen epoch 63, mid-election, bumping
Nov 23 04:53:45 localhost ceph-mon[300199]: mon.np0005532585@1(electing) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:53:45 localhost ceph-mon[300199]: mon.np0005532585@1(electing) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:53:45 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 23 04:53:45 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 04:53:45 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 04:53:45 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 04:53:45 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 23 04:53:45 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 04:53:45 localhost ceph-mon[300199]: mon.np0005532584 calling monitor election
Nov 23 04:53:45 localhost ceph-mon[300199]: mon.np0005532585 calling monitor election
Nov 23 04:53:45 localhost ceph-mon[300199]: mon.np0005532586 calling monitor election
Nov 23 04:53:45 localhost ceph-mon[300199]: mon.np0005532584 is new leader, mons np0005532584,np0005532585,np0005532586 in quorum (ranks 0,1,2)
Nov 23 04:53:45 localhost ceph-mon[300199]: overall HEALTH_OK
Nov 23 04:53:45 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:45 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:45 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 23 04:53:45 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch
Nov 23 04:53:45 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 04:53:45 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 04:53:45 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 04:53:45 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 04:53:46 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 04:53:46 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 04:53:46 localhost ceph-mgr[288287]: mgr.server handle_open ignoring open from mon.np0005532586 172.18.0.108:0/2160972868; not ready for session (expect reconnect)
Nov 23 04:53:46 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005532586"} v 0)
Nov 23 04:53:46 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mon metadata", "id": "np0005532586"} : dispatch
Nov 23 04:53:46 localhost ceph-mon[300199]: Reconfiguring mgr.np0005532586.thmvqb (monmap changed)...
Nov 23 04:53:46 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 04:53:46 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532586.thmvqb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 23 04:53:46 localhost ceph-mon[300199]: Reconfiguring daemon mgr.np0005532586.thmvqb on np0005532586.localdomain
Nov 23 04:53:46 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:46 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:46 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v50: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 04:53:47 localhost ceph-mgr[288287]: mgr.server handle_report got status from non-daemon mon.np0005532586
Nov 23 04:53:47 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:53:47.518+0000 7f5ec5b52640 -1 mgr.server handle_report got status from non-daemon mon.np0005532586
Nov 23 04:53:47 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 04:53:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 04:53:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 04:53:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 04:53:48 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 04:53:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 23 04:53:48 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 04:53:48 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:48 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:48 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:48 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:48 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:48 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:48 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v51: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 04:53:49 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:49 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:49 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:49 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:49 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:49 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:49 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:49 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:49 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 04:53:49 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:49 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:49 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:53:49 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0)
Nov 23 04:53:49 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0)
Nov 23 04:53:49 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 04:53:49 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0)
Nov 23 04:53:49 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 04:53:49 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0)
Nov 23 04:53:49 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 23 04:53:49 localhost ceph-mgr[288287]: [progress INFO root] update: starting ev b7355ba6-1464-472e-87d9-7de50c82b6b7 (Updating node-proxy deployment (+3 -> 3))
Nov 23 04:53:49 localhost ceph-mgr[288287]: [progress INFO root] complete: finished ev b7355ba6-1464-472e-87d9-7de50c82b6b7 (Updating node-proxy deployment (+3 -> 3))
Nov 23 04:53:49 localhost ceph-mgr[288287]: [progress INFO root] Completed event b7355ba6-1464-472e-87d9-7de50c82b6b7 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 23 04:53:49 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 23 04:53:49 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 23 04:53:50 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 04:53:50 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 04:53:50 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 23 04:53:50 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 04:53:50 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 04:53:50 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 04:53:50 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 04:53:50 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 04:53:50 localhost nova_compute[281952]: 2025-11-23 09:53:50.411 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:53:50 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:50 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:50 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:53:50 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:50 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:50 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:50 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:50 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:50 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:50 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:50 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:50 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 04:53:50 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532584.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 23 04:53:50 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v52: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Nov 23 04:53:50 localhost ceph-mgr[288287]: [progress INFO root] Writing back 50 completed events
Nov 23 04:53:50 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 23 04:53:50 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 04:53:50 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 04:53:51 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Nov 23 04:53:51 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Nov 23 04:53:51 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Nov 23 04:53:51 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 04:53:51 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 04:53:51 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 04:53:51 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 04:53:51 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 04:53:51 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Nov 23 04:53:51 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/2073028298' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Nov 23 04:53:51 localhost ceph-mon[300199]: Reconfiguring crash.np0005532584 (monmap changed)...
Nov 23 04:53:51 localhost ceph-mon[300199]: Reconfiguring daemon crash.np0005532584 on np0005532584.localdomain
Nov 23 04:53:51 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:51 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:51 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:51 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 23 04:53:51 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0)
Nov 23 04:53:51 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0)
Nov 23 04:53:52 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Nov 23 04:53:52 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Nov 23 04:53:52 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Nov 23 04:53:52 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 04:53:52 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 04:53:52 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 23 04:53:52 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 04:53:52 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 04:53:52 localhost ceph-mon[300199]: Reconfiguring osd.2 (monmap changed)...
Nov 23 04:53:52 localhost ceph-mon[300199]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 04:53:52 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:52 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw'
Nov 23 04:53:52 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.471623) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891632471708, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2808, "num_deletes": 259, "total_data_size": 5572811, "memory_usage": 5643928, "flush_reason": "Manual Compaction"}
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891632491071, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 3233862, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12506, "largest_seqno": 15309, "table_properties": {"data_size": 3222584, "index_size": 6759, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3461, "raw_key_size": 31164, "raw_average_key_size": 22, "raw_value_size": 3196957, "raw_average_value_size": 2348, "num_data_blocks": 296, "num_entries": 1361, "num_filter_entries": 1361, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891560, "oldest_key_time": 1763891560, "file_creation_time": 1763891632, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 19511 microseconds, and 8603 cpu microseconds.
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.491140) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 3233862 bytes OK
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.491167) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.493538) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.493564) EVENT_LOG_v1 {"time_micros": 1763891632493556, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.493587) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 5558860, prev total WAL file size 5558860, number of live WAL files 2.
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.495147) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. '7061786F73003131353437' seq:0, type:0; will stop at (end)
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(3158KB)], [18(16MB)]
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891632495246, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 20139804, "oldest_snapshot_seqno": -1}
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 11902 keys, 17224579 bytes, temperature: kUnknown
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891632580427, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 17224579, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17154272, "index_size": 39486, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29765, "raw_key_size": 318266, "raw_average_key_size": 26, "raw_value_size": 16948901, "raw_average_value_size": 1424, "num_data_blocks": 1513, "num_entries": 11902, "num_filter_entries": 11902, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891632, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.580812) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 17224579 bytes
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.582764) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 236.1 rd, 201.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 16.1 +0.0 blob) out(16.4 +0.0 blob), read-write-amplify(11.6) write-amplify(5.3) OK, records in: 12447, records dropped: 545 output_compression: NoCompression
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.582796) EVENT_LOG_v1 {"time_micros": 1763891632582782, "job": 8, "event": "compaction_finished", "compaction_time_micros": 85306, "compaction_time_cpu_micros": 49640, "output_level": 6, "num_output_files": 1, "total_output_size": 17224579, "num_input_records": 12447, "num_output_records": 11902, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891632583366, "job": 8, "event": "table_file_deletion", "file_number": 20}
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891632585744, "job": 
8, "event": "table_file_deletion", "file_number": 18} Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.494941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.585779) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.585785) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.585788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.585791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:53:52 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:53:52.585793) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:53:52 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v53: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s Nov 23 04:53:52 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:53:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0) Nov 23 04:53:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0) Nov 23 04:53:53 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring 
mds.mds.np0005532584.aoxjmw (monmap changed)... Nov 23 04:53:53 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)... Nov 23 04:53:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Nov 23 04:53:53 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:53:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:53 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:53 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain Nov 23 04:53:53 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain Nov 23 04:53:53 localhost ceph-mon[300199]: Reconfiguring osd.5 (monmap changed)... 
Nov 23 04:53:53 localhost ceph-mon[300199]: Reconfiguring daemon osd.5 on np0005532584.localdomain Nov 23 04:53:53 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:53 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:53 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:53:53 localhost ceph-mon[300199]: Reconfiguring mds.mds.np0005532584.aoxjmw (monmap changed)... Nov 23 04:53:53 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532584.aoxjmw", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:53:53 localhost ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532584.aoxjmw on np0005532584.localdomain Nov 23 04:53:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0) Nov 23 04:53:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0) Nov 23 04:53:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:53:53 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532584.naxwxy (monmap changed)... Nov 23 04:53:53 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532584.naxwxy (monmap changed)... 
Nov 23 04:53:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Nov 23 04:53:53 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:53:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0) Nov 23 04:53:53 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch Nov 23 04:53:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:53 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:53 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain Nov 23 04:53:53 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain Nov 23 04:53:54 localhost podman[305444]: 2025-11-23 09:53:54.027923154 +0000 UTC m=+0.086937813 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118) Nov 23 04:53:54 localhost podman[305444]: 2025-11-23 09:53:54.042374979 +0000 UTC m=+0.101389598 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 23 04:53:54 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 04:53:54 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v54: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s Nov 23 04:53:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0) Nov 23 04:53:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0) Nov 23 04:53:54 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532585 (monmap changed)... Nov 23 04:53:54 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532585 (monmap changed)... Nov 23 04:53:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Nov 23 04:53:54 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:53:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:54 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:54 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain Nov 23 04:53:54 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532585 on 
np0005532585.localdomain Nov 23 04:53:54 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:54 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:54 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:53:54 localhost ceph-mon[300199]: Reconfiguring mgr.np0005532584.naxwxy (monmap changed)... Nov 23 04:53:54 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532584.naxwxy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:53:54 localhost ceph-mon[300199]: Reconfiguring daemon mgr.np0005532584.naxwxy on np0005532584.localdomain Nov 23 04:53:54 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:54 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:54 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:53:54 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532585.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:53:55 localhost nova_compute[281952]: 2025-11-23 09:53:55.413 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:53:55 localhost podman[305517]: Nov 23 04:53:55 localhost podman[305517]: 2025-11-23 09:53:55.483313384 
+0000 UTC m=+0.084760266 container create 7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_johnson, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, vendor=Red Hat, Inc., vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, architecture=x86_64, GIT_BRANCH=main, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux ) Nov 23 04:53:55 localhost systemd[1]: Started libpod-conmon-7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70.scope. Nov 23 04:53:55 localhost systemd[1]: Started libcrun container. 
Nov 23 04:53:55 localhost podman[305517]: 2025-11-23 09:53:55.446867339 +0000 UTC m=+0.048314261 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:53:55 localhost podman[305517]: 2025-11-23 09:53:55.559828954 +0000 UTC m=+0.161275846 container init 7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_johnson, distribution-scope=public, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, release=553, io.buildah.version=1.33.12, io.openshift.expose-services=) Nov 23 04:53:55 localhost podman[305517]: 2025-11-23 09:53:55.569882665 +0000 UTC m=+0.171329527 container start 7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_johnson, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
io.openshift.tags=rhceph ceph, io.openshift.expose-services=, version=7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, distribution-scope=public, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, release=553, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 04:53:55 localhost podman[305517]: 2025-11-23 09:53:55.570079191 +0000 UTC m=+0.171526123 container attach 7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_johnson, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_CLEAN=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64) Nov 23 
04:53:55 localhost silly_johnson[305532]: 167 167 Nov 23 04:53:55 localhost systemd[1]: libpod-7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70.scope: Deactivated successfully. Nov 23 04:53:55 localhost podman[305517]: 2025-11-23 09:53:55.574185897 +0000 UTC m=+0.175632799 container died 7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_johnson, maintainer=Guillaume Abrioux , RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 23 04:53:55 localhost podman[305537]: 2025-11-23 09:53:55.658774746 +0000 UTC m=+0.076318895 container remove 7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_johnson, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, release=553, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:53:55 localhost systemd[1]: libpod-conmon-7c1efcd96fc50f4f93ee51c2a39a597ee180dbe7639a20c97c3164433a0a9d70.scope: Deactivated successfully. Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0) Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0) Nov 23 04:53:55 localhost ceph-mgr[288287]: log_channel(audit) log [DBG] : from='client.54166 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch Nov 23 04:53:55 localhost ceph-mgr[288287]: [cephadm INFO root] Reconfig service osd.default_drive_group Nov 23 04:53:55 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfig service osd.default_drive_group Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0) Nov 23 04:53:55 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring 
osd.0 (monmap changed)... Nov 23 04:53:55 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)... Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0) Nov 23 04:53:55 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:55 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0) Nov 23 04:53:55 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:53:55 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain.devices.0}] v 0) Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532584.localdomain}] v 0) Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0) Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/host.np0005532585.localdomain}] v 0) Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0) Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0) Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0) Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0) Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0) Nov 23 04:53:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0) Nov 23 04:53:55 localhost ceph-mon[300199]: Reconfiguring crash.np0005532585 (monmap changed)... 
Nov 23 04:53:55 localhost ceph-mon[300199]: Reconfiguring daemon crash.np0005532585 on np0005532585.localdomain Nov 23 04:53:55 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:55 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:55 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 23 04:53:55 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:55 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:55 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:55 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:55 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:55 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:55 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:55 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:55 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:55 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 04:53:56 localhost podman[305605]: 2025-11-23 09:53:56.40050747 +0000 UTC m=+0.094874469 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:53:56 localhost podman[305605]: 2025-11-23 09:53:56.433493017 +0000 UTC m=+0.127859976 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:53:56 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:53:56 localhost podman[305612]: Nov 23 04:53:56 localhost podman[305612]: 2025-11-23 09:53:56.459089877 +0000 UTC m=+0.132053165 container create cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_merkle, GIT_BRANCH=main, com.redhat.component=rhceph-container, release=553, version=7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vcs-type=git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_CLEAN=True, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph) Nov 23 04:53:56 localhost systemd[1]: Started libpod-conmon-cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa.scope. Nov 23 04:53:56 localhost systemd[1]: var-lib-containers-storage-overlay-2e23d4e8adf6283b26f1de8b6f4e7188602d3d0a3991af1e5a58cd866a31fd1f-merged.mount: Deactivated successfully. Nov 23 04:53:56 localhost systemd[1]: Started libcrun container. 
Nov 23 04:53:56 localhost podman[305612]: 2025-11-23 09:53:56.515346872 +0000 UTC m=+0.188310160 container init cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_merkle, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, release=553, description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55) Nov 23 04:53:56 localhost podman[305612]: 2025-11-23 09:53:56.525915119 +0000 UTC m=+0.198878407 container start cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_merkle, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.33.12, vendor=Red Hat, Inc., RELEASE=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 23 04:53:56 localhost podman[305612]: 2025-11-23 09:53:56.526080844 +0000 UTC m=+0.199044132 container attach cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_merkle, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vcs-type=git, io.buildah.version=1.33.12, release=553, version=7, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64) Nov 23 04:53:56 localhost cranky_merkle[305646]: 167 167 Nov 23 04:53:56 localhost systemd[1]: 
libpod-cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa.scope: Deactivated successfully. Nov 23 04:53:56 localhost podman[305612]: 2025-11-23 09:53:56.530465119 +0000 UTC m=+0.203428427 container died cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_merkle, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, architecture=x86_64, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, version=7, GIT_CLEAN=True, io.openshift.expose-services=, ceph=True, name=rhceph) Nov 23 04:53:56 localhost podman[305612]: 2025-11-23 09:53:56.432050062 +0000 UTC m=+0.105013370 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:53:56 localhost podman[305651]: 2025-11-23 09:53:56.636530591 +0000 UTC m=+0.092007079 container remove cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_merkle, io.k8s.description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.openshift.tags=rhceph ceph, 
architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, CEPH_POINT_RELEASE=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux ) Nov 23 04:53:56 localhost systemd[1]: libpod-conmon-cff0d059b1a5a990b5e9f103c058de5057ff6fbf6770ab0ac951a540ae7c3daa.scope: Deactivated successfully. Nov 23 04:53:56 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v55: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:56 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0) Nov 23 04:53:56 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0) Nov 23 04:53:56 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0) Nov 23 04:53:56 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0) Nov 23 04:53:56 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)... 
Nov 23 04:53:56 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)... Nov 23 04:53:56 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0) Nov 23 04:53:56 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 23 04:53:56 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:56 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:56 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005532585.localdomain Nov 23 04:53:56 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005532585.localdomain Nov 23 04:53:56 localhost ceph-mon[300199]: Reconfig service osd.default_drive_group Nov 23 04:53:56 localhost ceph-mon[300199]: Reconfiguring osd.0 (monmap changed)... 
Nov 23 04:53:56 localhost ceph-mon[300199]: Reconfiguring daemon osd.0 on np0005532585.localdomain Nov 23 04:53:56 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:56 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:56 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:56 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:56 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:56 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:56 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 23 04:53:57 localhost podman[305728]: Nov 23 04:53:57 localhost systemd[1]: var-lib-containers-storage-overlay-627a3ecf9cef10f3d1f6014149f0a88b728aaff6dcf7bdbd90337aed590af978-merged.mount: Deactivated successfully. 
Nov 23 04:53:57 localhost podman[305728]: 2025-11-23 09:53:57.500217787 +0000 UTC m=+0.079272367 container create 40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mclaren, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , RELEASE=main, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, ceph=True, CEPH_POINT_RELEASE=) Nov 23 04:53:57 localhost systemd[1]: Started libpod-conmon-40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534.scope. Nov 23 04:53:57 localhost systemd[1]: Started libcrun container. 
Nov 23 04:53:57 localhost podman[305728]: 2025-11-23 09:53:57.465172215 +0000 UTC m=+0.044226815 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:53:57 localhost podman[305728]: 2025-11-23 09:53:57.56675973 +0000 UTC m=+0.145814310 container init 40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mclaren, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, ceph=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , release=553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7) Nov 23 04:53:57 localhost podman[305728]: 2025-11-23 09:53:57.57584111 +0000 UTC m=+0.154895690 container start 40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mclaren, GIT_BRANCH=main, release=553, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, ceph=True, 
com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, version=7, io.openshift.tags=rhceph ceph) Nov 23 04:53:57 localhost podman[305728]: 2025-11-23 09:53:57.576139589 +0000 UTC m=+0.155194179 container attach 40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mclaren, RELEASE=main, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=) Nov 23 
04:53:57 localhost heuristic_mclaren[305743]: 167 167 Nov 23 04:53:57 localhost systemd[1]: libpod-40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534.scope: Deactivated successfully. Nov 23 04:53:57 localhost podman[305728]: 2025-11-23 09:53:57.580852945 +0000 UTC m=+0.159907555 container died 40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mclaren, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, CEPH_POINT_RELEASE=, ceph=True, name=rhceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, version=7, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, RELEASE=main, io.openshift.expose-services=, vcs-type=git) Nov 23 04:53:57 localhost podman[305748]: 2025-11-23 09:53:57.682927323 +0000 UTC m=+0.089701708 container remove 40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_mclaren, name=rhceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7, CEPH_POINT_RELEASE=, RELEASE=main, release=553, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55) Nov 23 04:53:57 localhost systemd[1]: libpod-conmon-40ec13f698d2fe5d4ae74f6094ce16662e1ee8f0a551d555d83af31f323ad534.scope: Deactivated successfully. Nov 23 04:53:57 localhost sshd[305767]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:53:57 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:53:57 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0) Nov 23 04:53:57 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0) Nov 23 04:53:57 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0) Nov 23 04:53:57 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0) Nov 23 04:53:57 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring 
mds.mds.np0005532585.jcltnl (monmap changed)... Nov 23 04:53:57 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)... Nov 23 04:53:57 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Nov 23 04:53:57 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:53:57 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:57 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:57 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain Nov 23 04:53:57 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain Nov 23 04:53:58 localhost ceph-mon[300199]: Reconfiguring osd.3 (monmap changed)... 
Nov 23 04:53:58 localhost ceph-mon[300199]: Reconfiguring daemon osd.3 on np0005532585.localdomain Nov 23 04:53:58 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:58 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:58 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:58 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:58 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:53:58 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532585.jcltnl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 23 04:53:58 localhost systemd[1]: var-lib-containers-storage-overlay-b44c73eb5ff0b2833430782bf819cd408944db4dd8aad1159616aa45f110f56e-merged.mount: Deactivated successfully. 
Nov 23 04:53:58 localhost podman[305826]: Nov 23 04:53:58 localhost podman[305826]: 2025-11-23 09:53:58.565409539 +0000 UTC m=+0.078587946 container create d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_moser, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, release=553, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, distribution-scope=public, version=7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:53:58 localhost systemd[1]: Started libpod-conmon-d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e.scope. Nov 23 04:53:58 localhost systemd[1]: Started libcrun container. 
Nov 23 04:53:58 localhost podman[305826]: 2025-11-23 09:53:58.530717609 +0000 UTC m=+0.043896016 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:53:58 localhost podman[305826]: 2025-11-23 09:53:58.63580136 +0000 UTC m=+0.148979737 container init d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_moser, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7) Nov 23 04:53:58 localhost podman[305826]: 2025-11-23 09:53:58.646715967 +0000 UTC m=+0.159894344 container start d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_moser, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-type=git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux , release=553, io.buildah.version=1.33.12, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:53:58 localhost podman[305826]: 2025-11-23 09:53:58.647948005 +0000 UTC m=+0.161126442 container attach d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_moser, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , architecture=x86_64, io.buildah.version=1.33.12, vcs-type=git, io.openshift.tags=rhceph ceph) Nov 23 
04:53:58 localhost musing_moser[305841]: 167 167 Nov 23 04:53:58 localhost systemd[1]: libpod-d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e.scope: Deactivated successfully. Nov 23 04:53:58 localhost podman[305826]: 2025-11-23 09:53:58.650595566 +0000 UTC m=+0.163773963 container died d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_moser, ceph=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, version=7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, maintainer=Guillaume Abrioux , release=553, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 23 04:53:58 localhost podman[305846]: 2025-11-23 09:53:58.743146611 +0000 UTC m=+0.079733430 container remove d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_moser, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, 
vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, io.buildah.version=1.33.12, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=) Nov 23 04:53:58 localhost systemd[1]: libpod-conmon-d495830fdfec515b264a48dc7b5354956f85f03047524d8166e2d9ba98f0885e.scope: Deactivated successfully. Nov 23 04:53:58 localhost ceph-mgr[288287]: log_channel(cluster) log [DBG] : pgmap v56: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Nov 23 04:53:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0) Nov 23 04:53:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0) Nov 23 04:53:58 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005532585.gzafiw (monmap changed)... Nov 23 04:53:58 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005532585.gzafiw (monmap changed)... 
Nov 23 04:53:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Nov 23 04:53:58 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:53:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0) Nov 23 04:53:58 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "mgr services"} : dispatch Nov 23 04:53:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:58 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:58 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain Nov 23 04:53:58 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain Nov 23 04:53:59 localhost ceph-mon[300199]: Reconfiguring mds.mds.np0005532585.jcltnl (monmap changed)... 
Nov 23 04:53:59 localhost ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532585.jcltnl on np0005532585.localdomain Nov 23 04:53:59 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:59 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:53:59 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:53:59 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005532585.gzafiw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 23 04:53:59 localhost podman[305915]: Nov 23 04:53:59 localhost podman[305915]: 2025-11-23 09:53:59.408974413 +0000 UTC m=+0.068596117 container create fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_blackwell, io.openshift.expose-services=, RELEASE=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, 
GIT_BRANCH=main, build-date=2025-09-24T08:57:55, distribution-scope=public, release=553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 23 04:53:59 localhost systemd[1]: Started libpod-conmon-fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a.scope. Nov 23 04:53:59 localhost systemd[1]: Started libcrun container. Nov 23 04:53:59 localhost podman[305915]: 2025-11-23 09:53:59.458344737 +0000 UTC m=+0.117966441 container init fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_blackwell, io.openshift.expose-services=, release=553, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 23 04:53:59 localhost podman[305915]: 2025-11-23 09:53:59.465552928 +0000 UTC m=+0.125174632 container start fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_blackwell, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.openshift.expose-services=, version=7) Nov 23 04:53:59 localhost podman[305915]: 2025-11-23 09:53:59.465685022 +0000 UTC m=+0.125306736 container attach fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_blackwell, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vcs-type=git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, ceph=True, release=553, GIT_BRANCH=main, architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, 
vendor=Red Hat, Inc., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 23 04:53:59 localhost peaceful_blackwell[305930]: 167 167 Nov 23 04:53:59 localhost systemd[1]: libpod-fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a.scope: Deactivated successfully. Nov 23 04:53:59 localhost podman[305915]: 2025-11-23 09:53:59.469388007 +0000 UTC m=+0.129009791 container died fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_blackwell, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., release=553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, architecture=x86_64, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux , ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 23 04:53:59 localhost podman[305915]: 2025-11-23 09:53:59.387873123 +0000 UTC m=+0.047494917 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 23 04:53:59 localhost systemd[1]: var-lib-containers-storage-overlay-96e19531e99dc2ffad06b89eb3ebb6f5fc4b36721e44dc3a08d52edaf26b42bd-merged.mount: Deactivated 
successfully. Nov 23 04:53:59 localhost systemd[1]: var-lib-containers-storage-overlay-ca561776aa0e1f2ec164ea75e23726104b971a45983f6b547bbd3f80ea60c050-merged.mount: Deactivated successfully. Nov 23 04:53:59 localhost podman[305935]: 2025-11-23 09:53:59.55149126 +0000 UTC m=+0.071987002 container remove fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_blackwell, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , GIT_BRANCH=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-type=git, version=7, io.buildah.version=1.33.12, com.redhat.component=rhceph-container) Nov 23 04:53:59 localhost systemd[1]: libpod-conmon-fd3adbb492420b1e868996e2e7a0ea3b19185730beb6dd366b95bd1d1364d44a.scope: Deactivated successfully. 
Nov 23 04:53:59 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain.devices.0}] v 0) Nov 23 04:53:59 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532585.localdomain}] v 0) Nov 23 04:53:59 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005532586 (monmap changed)... Nov 23 04:53:59 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005532586 (monmap changed)... Nov 23 04:53:59 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Nov 23 04:53:59 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:53:59 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:53:59 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:53:59 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain Nov 23 04:53:59 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain Nov 23 04:53:59 localhost openstack_network_exporter[242668]: ERROR 09:53:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db 
server Nov 23 04:53:59 localhost openstack_network_exporter[242668]: ERROR 09:53:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:53:59 localhost openstack_network_exporter[242668]: Nov 23 04:53:59 localhost openstack_network_exporter[242668]: ERROR 09:53:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:53:59 localhost openstack_network_exporter[242668]: ERROR 09:53:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:53:59 localhost openstack_network_exporter[242668]: ERROR 09:53:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:53:59 localhost openstack_network_exporter[242668]: Nov 23 04:54:00 localhost ceph-mon[300199]: Reconfiguring mgr.np0005532585.gzafiw (monmap changed)... Nov 23 04:54:00 localhost ceph-mon[300199]: Reconfiguring daemon mgr.np0005532585.gzafiw on np0005532585.localdomain Nov 23 04:54:00 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:54:00 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:54:00 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:54:00 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005532586.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 23 04:54:00 localhost nova_compute[281952]: 2025-11-23 09:54:00.418 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:54:00 localhost ceph-mon[300199]: 
mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain.devices.0}] v 0) Nov 23 04:54:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005532586.localdomain}] v 0) Nov 23 04:54:00 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)... Nov 23 04:54:00 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)... Nov 23 04:54:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0) Nov 23 04:54:00 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 23 04:54:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 23 04:54:00 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 23 04:54:00 localhost ceph-mgr[288287]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005532586.localdomain Nov 23 04:54:00 localhost ceph-mgr[288287]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005532586.localdomain Nov 23 04:54:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr fail"} v 0) Nov 23 04:54:00 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='client.? 
172.18.0.200:0/3957521171' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 23 04:54:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 e86: 6 total, 6 up, 6 in Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr handle_mgr_map I was active but no longer am Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr respawn e: '/usr/bin/ceph-mgr' Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr respawn 0: '/usr/bin/ceph-mgr' Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr respawn 1: '-n' Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr respawn 2: 'mgr.np0005532585.gzafiw' Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr respawn 3: '-f' Nov 23 04:54:00 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:00.607+0000 7f5f21ba1640 -1 mgr handle_mgr_map I was active but no longer am Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr respawn 4: '--setuser' Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr respawn 5: 'ceph' Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr respawn 6: '--setgroup' Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr respawn 7: 'ceph' Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr respawn 8: '--default-log-to-file=false' Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr respawn 9: '--default-log-to-journald=true' Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr respawn 10: '--default-log-to-stderr=false' Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr respawn respawning with exe /usr/bin/ceph-mgr Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr respawn exe_path /proc/self/exe Nov 23 04:54:00 localhost systemd[1]: session-69.scope: Deactivated successfully. Nov 23 04:54:00 localhost systemd[1]: session-69.scope: Consumed 27.610s CPU time. Nov 23 04:54:00 localhost systemd-logind[761]: Session 69 logged out. Waiting for processes to exit. Nov 23 04:54:00 localhost systemd-logind[761]: Removed session 69. 
Nov 23 04:54:00 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: ignoring --setuser ceph since I am not root Nov 23 04:54:00 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: ignoring --setgroup ceph since I am not root Nov 23 04:54:00 localhost ceph-mgr[288287]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2 Nov 23 04:54:00 localhost ceph-mgr[288287]: pidfile_write: ignore empty --pid-file Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr[py] Loading python module 'alerts' Nov 23 04:54:00 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:00.807+0000 7f5982561140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr[py] Module alerts has missing NOTIFY_TYPES member Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr[py] Loading python module 'balancer' Nov 23 04:54:00 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:00.873+0000 7f5982561140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr[py] Module balancer has missing NOTIFY_TYPES member Nov 23 04:54:00 localhost ceph-mgr[288287]: mgr[py] Loading python module 'cephadm' Nov 23 04:54:00 localhost sshd[305976]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:54:01 localhost systemd-logind[761]: New session 70 of user ceph-admin. Nov 23 04:54:01 localhost systemd[1]: Started Session 70 of User ceph-admin. Nov 23 04:54:01 localhost ceph-mon[300199]: Reconfiguring crash.np0005532586 (monmap changed)... 
Nov 23 04:54:01 localhost ceph-mon[300199]: Reconfiguring daemon crash.np0005532586 on np0005532586.localdomain Nov 23 04:54:01 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:54:01 localhost ceph-mon[300199]: from='mgr.17319 ' entity='mgr.np0005532585.gzafiw' Nov 23 04:54:01 localhost ceph-mon[300199]: from='mgr.17319 172.18.0.107:0/1708578753' entity='mgr.np0005532585.gzafiw' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 23 04:54:01 localhost ceph-mon[300199]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 23 04:54:01 localhost ceph-mon[300199]: Activating manager daemon np0005532586.thmvqb Nov 23 04:54:01 localhost ceph-mon[300199]: from='client.? 172.18.0.200:0/3957521171' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 23 04:54:01 localhost ceph-mon[300199]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 23 04:54:01 localhost ceph-mon[300199]: Manager daemon np0005532586.thmvqb is now available Nov 23 04:54:01 localhost ceph-mon[300199]: removing stray HostCache host record np0005532583.localdomain.devices.0 Nov 23 04:54:01 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain.devices.0"} : dispatch Nov 23 04:54:01 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain.devices.0"} : dispatch Nov 23 04:54:01 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain.devices.0"}]': finished Nov 23 04:54:01 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config-key 
del","key":"mgr/cephadm/host.np0005532583.localdomain.devices.0"} : dispatch Nov 23 04:54:01 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain.devices.0"} : dispatch Nov 23 04:54:01 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005532583.localdomain.devices.0"}]': finished Nov 23 04:54:01 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532586.thmvqb/mirror_snapshot_schedule"} : dispatch Nov 23 04:54:01 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532586.thmvqb/mirror_snapshot_schedule"} : dispatch Nov 23 04:54:01 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532586.thmvqb/trash_purge_schedule"} : dispatch Nov 23 04:54:01 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532586.thmvqb/trash_purge_schedule"} : dispatch Nov 23 04:54:01 localhost ceph-mgr[288287]: mgr[py] Loading python module 'crash' Nov 23 04:54:01 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:01.549+0000 7f5982561140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member Nov 23 04:54:01 localhost ceph-mgr[288287]: mgr[py] Module crash has missing NOTIFY_TYPES member Nov 23 04:54:01 localhost ceph-mgr[288287]: mgr[py] Loading python module 'dashboard' Nov 23 04:54:02 localhost ceph-mgr[288287]: mgr[py] Loading python module 'devicehealth' Nov 23 04:54:02 localhost podman[306093]: 2025-11-23 
09:54:02.069292176 +0000 UTC m=+0.079219966 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, GIT_BRANCH=main) Nov 23 04:54:02 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:02.096+0000 7f5982561140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member Nov 23 04:54:02 localhost ceph-mgr[288287]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member Nov 23 04:54:02 localhost ceph-mgr[288287]: mgr[py] Loading python module 'diskprediction_local' Nov 23 04:54:02 localhost podman[306093]: 2025-11-23 09:54:02.179376912 +0000 UTC m=+0.189304732 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, version=7, ceph=True, GIT_BRANCH=main, release=553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., RELEASE=main, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, name=rhceph) Nov 23 04:54:02 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. Nov 23 04:54:02 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. 
Nov 23 04:54:02 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: from numpy import show_config as show_numpy_config Nov 23 04:54:02 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:02.245+0000 7f5982561140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Nov 23 04:54:02 localhost ceph-mgr[288287]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Nov 23 04:54:02 localhost ceph-mgr[288287]: mgr[py] Loading python module 'influx' Nov 23 04:54:02 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:02.312+0000 7f5982561140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member Nov 23 04:54:02 localhost ceph-mgr[288287]: mgr[py] Module influx has missing NOTIFY_TYPES member Nov 23 04:54:02 localhost ceph-mgr[288287]: mgr[py] Loading python module 'insights' Nov 23 04:54:02 localhost ceph-mgr[288287]: mgr[py] Loading python module 'iostat' Nov 23 04:54:02 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:02.433+0000 7f5982561140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member Nov 23 04:54:02 localhost ceph-mgr[288287]: mgr[py] Module iostat has missing NOTIFY_TYPES member Nov 23 04:54:02 localhost ceph-mgr[288287]: mgr[py] Loading python module 'k8sevents' Nov 23 04:54:02 localhost ceph-mgr[288287]: mgr[py] Loading python module 'localpool' Nov 23 04:54:02 localhost ceph-mgr[288287]: mgr[py] Loading python module 'mds_autoscaler' Nov 23 04:54:02 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:54:02 localhost ceph-mgr[288287]: mgr[py] Loading python module 'mirroring' Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Loading python module 'nfs' Nov 23 04:54:03 localhost 
ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.159+0000 7f5982561140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Module nfs has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Loading python module 'orchestrator' Nov 23 04:54:03 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.301+0000 7f5982561140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Loading python module 'osd_perf_query' Nov 23 04:54:03 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.367+0000 7f5982561140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Loading python module 'osd_support' Nov 23 04:54:03 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.421+0000 7f5982561140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Module osd_support has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Loading python module 'pg_autoscaler' Nov 23 04:54:03 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.487+0000 7f5982561140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Loading python module 'progress' Nov 23 04:54:03 localhost 
ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.546+0000 7f5982561140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Module progress has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Loading python module 'prometheus' Nov 23 04:54:03 localhost ceph-mon[300199]: [23/Nov/2025:09:54:01] ENGINE Bus STARTING Nov 23 04:54:03 localhost ceph-mon[300199]: [23/Nov/2025:09:54:02] ENGINE Serving on http://172.18.0.108:8765 Nov 23 04:54:03 localhost ceph-mon[300199]: [23/Nov/2025:09:54:02] ENGINE Serving on https://172.18.0.108:7150 Nov 23 04:54:03 localhost ceph-mon[300199]: [23/Nov/2025:09:54:02] ENGINE Bus STARTED Nov 23 04:54:03 localhost ceph-mon[300199]: [23/Nov/2025:09:54:02] ENGINE Client ('172.18.0.108', 35224) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Nov 23 04:54:03 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:03 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:03 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:03 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:03 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:03 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:03 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.846+0000 7f5982561140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Module prometheus has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Loading python module 'rbd_support' Nov 23 04:54:03 localhost 
ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:03.927+0000 7f5982561140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member Nov 23 04:54:03 localhost ceph-mgr[288287]: mgr[py] Loading python module 'restful' Nov 23 04:54:04 localhost ceph-mgr[288287]: mgr[py] Loading python module 'rgw' Nov 23 04:54:04 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:04.251+0000 7f5982561140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member Nov 23 04:54:04 localhost ceph-mgr[288287]: mgr[py] Module rgw has missing NOTIFY_TYPES member Nov 23 04:54:04 localhost ceph-mgr[288287]: mgr[py] Loading python module 'rook' Nov 23 04:54:04 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:04.671+0000 7f5982561140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member Nov 23 04:54:04 localhost ceph-mgr[288287]: mgr[py] Module rook has missing NOTIFY_TYPES member Nov 23 04:54:04 localhost ceph-mgr[288287]: mgr[py] Loading python module 'selftest' Nov 23 04:54:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:54:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:54:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 04:54:04 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:04.731+0000 7f5982561140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member Nov 23 04:54:04 localhost ceph-mgr[288287]: mgr[py] Module selftest has missing NOTIFY_TYPES member Nov 23 04:54:04 localhost ceph-mgr[288287]: mgr[py] Loading python module 'snap_schedule' Nov 23 04:54:04 localhost ceph-mgr[288287]: mgr[py] Loading python module 'stats' Nov 23 04:54:04 localhost podman[306387]: 2025-11-23 09:54:04.828455548 +0000 UTC m=+0.097071646 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 23 04:54:04 localhost ceph-mgr[288287]: 
mgr[py] Loading python module 'status' Nov 23 04:54:04 localhost systemd[1]: tmp-crun.iPaacG.mount: Deactivated successfully. Nov 23 04:54:04 localhost podman[306389]: 2025-11-23 09:54:04.905182885 +0000 UTC m=+0.166996363 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 23 04:54:04 localhost podman[306389]: 2025-11-23 09:54:04.911991136 +0000 UTC m=+0.173804604 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, 
io.openshift.expose-services=) Nov 23 04:54:04 localhost podman[306388]: 2025-11-23 09:54:04.920427006 +0000 UTC m=+0.189076955 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Nov 23 04:54:04 localhost systemd[1]: 
ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 23 04:54:04 localhost ceph-mon[300199]: 
from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 23 04:54:04 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:54:04 localhost podman[306388]: 2025-11-23 09:54:04.92737886 +0000 UTC m=+0.196028769 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118) Nov 23 04:54:04 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:54:04 localhost podman[306387]: 2025-11-23 09:54:04.962326078 +0000 UTC m=+0.230942156 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 04:54:04 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:04.967+0000 7f5982561140 -1 mgr[py] Module status has missing NOTIFY_TYPES member Nov 23 04:54:04 localhost ceph-mgr[288287]: mgr[py] Module status has missing NOTIFY_TYPES member Nov 23 04:54:04 localhost ceph-mgr[288287]: mgr[py] Loading python module 'telegraf' Nov 23 04:54:04 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:54:05 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:05.027+0000 7f5982561140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member Nov 23 04:54:05 localhost ceph-mgr[288287]: mgr[py] Module telegraf has missing NOTIFY_TYPES member Nov 23 04:54:05 localhost ceph-mgr[288287]: mgr[py] Loading python module 'telemetry' Nov 23 04:54:05 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:05.166+0000 7f5982561140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member Nov 23 04:54:05 localhost ceph-mgr[288287]: mgr[py] Module telemetry has missing NOTIFY_TYPES member Nov 23 04:54:05 localhost ceph-mgr[288287]: mgr[py] Loading python module 'test_orchestrator' Nov 23 04:54:05 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:05.314+0000 7f5982561140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Nov 23 04:54:05 localhost ceph-mgr[288287]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Nov 23 04:54:05 localhost ceph-mgr[288287]: mgr[py] Loading python module 'volumes' Nov 23 04:54:05 localhost nova_compute[281952]: 2025-11-23 09:54:05.418 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:54:05 localhost nova_compute[281952]: 2025-11-23 09:54:05.423 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:54:05 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:05.503+0000 7f5982561140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member Nov 23 04:54:05 localhost ceph-mgr[288287]: mgr[py] Module volumes has missing NOTIFY_TYPES member Nov 23 04:54:05 localhost ceph-mgr[288287]: mgr[py] Loading python module 'zabbix' Nov 
23 04:54:05 localhost ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-mgr-np0005532585-gzafiw[288283]: 2025-11-23T09:54:05.561+0000 7f5982561140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member Nov 23 04:54:05 localhost ceph-mgr[288287]: mgr[py] Module zabbix has missing NOTIFY_TYPES member Nov 23 04:54:05 localhost ceph-mgr[288287]: ms_deliver_dispatch: unhandled message 0x558ea9c4d600 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0 Nov 23 04:54:05 localhost ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.108:6810/335107178 Nov 23 04:54:05 localhost ceph-mon[300199]: Adjusting osd_memory_target on np0005532585.localdomain to 836.6M Nov 23 04:54:05 localhost ceph-mon[300199]: Adjusting osd_memory_target on np0005532584.localdomain to 836.6M Nov 23 04:54:05 localhost ceph-mon[300199]: Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 23 04:54:05 localhost ceph-mon[300199]: Adjusting osd_memory_target on np0005532586.localdomain to 836.6M Nov 23 04:54:05 localhost ceph-mon[300199]: Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 23 04:54:05 localhost ceph-mon[300199]: Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Nov 23 04:54:05 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf Nov 23 04:54:05 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf Nov 23 04:54:05 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf Nov 23 04:54:06 localhost nova_compute[281952]: 2025-11-23 09:54:06.247 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:54:06 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:54:06 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:54:06 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf Nov 23 04:54:06 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:54:06 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:54:07 localhost nova_compute[281952]: 2025-11-23 09:54:07.208 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:54:07 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:54:07 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 23 04:54:07 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:54:07 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:54:07 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring Nov 23 04:54:07 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:07 localhost ceph-mon[300199]: 
from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:07 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:07 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:07 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:07 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:07 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:07 localhost ceph-mon[300199]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Nov 23 04:54:07 localhost ceph-mon[300199]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Nov 23 04:54:07 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 23 04:54:08 localhost nova_compute[281952]: 2025-11-23 09:54:08.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:54:08 localhost nova_compute[281952]: 2025-11-23 09:54:08.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:54:08 localhost nova_compute[281952]: 2025-11-23 09:54:08.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:54:08 localhost nova_compute[281952]: 2025-11-23 09:54:08.521 281956 DEBUG 
oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 04:54:08 localhost nova_compute[281952]: 2025-11-23 09:54:08.522 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 04:54:08 localhost nova_compute[281952]: 2025-11-23 09:54:08.522 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 23 04:54:08 localhost nova_compute[281952]: 2025-11-23 09:54:08.523 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 04:54:08 localhost ceph-mon[300199]: Reconfiguring daemon osd.2 on np0005532584.localdomain
Nov 23 04:54:08 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:08 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:08 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:08 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 23 04:54:08 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:09 localhost nova_compute[281952]: 2025-11-23 09:54:09.020 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 04:54:09 localhost nova_compute[281952]: 2025-11-23 09:54:09.045 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 04:54:09 localhost nova_compute[281952]: 2025-11-23 09:54:09.046 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 23 04:54:09 localhost nova_compute[281952]: 2025-11-23 09:54:09.046 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:54:09 localhost nova_compute[281952]: 2025-11-23 09:54:09.047 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:54:09 localhost nova_compute[281952]: 2025-11-23 09:54:09.047 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 04:54:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:54:09.292 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 04:54:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:54:09.292 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 04:54:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:54:09.293 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 04:54:09 localhost ceph-mon[300199]: Reconfiguring daemon osd.5 on np0005532584.localdomain
Nov 23 04:54:09 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:09 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:09 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:09 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:09 localhost ceph-mon[300199]: Reconfiguring osd.1 (monmap changed)...
Nov 23 04:54:09 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 23 04:54:09 localhost ceph-mon[300199]: Reconfiguring daemon osd.1 on np0005532586.localdomain
Nov 23 04:54:10 localhost nova_compute[281952]: 2025-11-23 09:54:10.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:54:10 localhost nova_compute[281952]: 2025-11-23 09:54:10.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:54:10 localhost nova_compute[281952]: 2025-11-23 09:54:10.421 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:54:10 localhost nova_compute[281952]: 2025-11-23 09:54:10.425 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:54:11 localhost nova_compute[281952]: 2025-11-23 09:54:11.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:54:11 localhost nova_compute[281952]: 2025-11-23 09:54:11.229 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 04:54:11 localhost nova_compute[281952]: 2025-11-23 09:54:11.230 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 04:54:11 localhost nova_compute[281952]: 2025-11-23 09:54:11.230 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 04:54:11 localhost nova_compute[281952]: 2025-11-23 09:54:11.231 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 04:54:11 localhost nova_compute[281952]: 2025-11-23 09:54:11.231 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 04:54:11 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 04:54:11 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1079439071' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 04:54:11 localhost nova_compute[281952]: 2025-11-23 09:54:11.687 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 04:54:11 localhost nova_compute[281952]: 2025-11-23 09:54:11.756 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 23 04:54:11 localhost nova_compute[281952]: 2025-11-23 09:54:11.756 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 23 04:54:11 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:11 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:11 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:11 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:11 localhost ceph-mon[300199]: Reconfiguring osd.4 (monmap changed)...
Nov 23 04:54:11 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 23 04:54:11 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:11 localhost ceph-mon[300199]: Reconfiguring daemon osd.4 on np0005532586.localdomain
Nov 23 04:54:11 localhost podman[240668]: time="2025-11-23T09:54:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 04:54:11 localhost podman[240668]: @ - - [23/Nov/2025:09:54:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 04:54:11 localhost podman[240668]: @ - - [23/Nov/2025:09:54:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18716 "" "Go-http-client/1.1"
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.007 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.008 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11713MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.008 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.008 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.076 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.077 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.077 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.100 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.133 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.134 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.155 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.197 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_SSSE3,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.246 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 04:54:12 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 04:54:12 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/285259482' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.668 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.674 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.694 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.697 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 04:54:12 localhost nova_compute[281952]: 2025-11-23 09:54:12.697 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.689s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 04:54:12 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 04:54:12 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:12 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:12 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:12 localhost ceph-mon[300199]: Reconfiguring mds.mds.np0005532586.mfohsb (monmap changed)...
Nov 23 04:54:12 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 04:54:12 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:12 localhost ceph-mon[300199]: Reconfiguring daemon mds.mds.np0005532586.mfohsb on np0005532586.localdomain
Nov 23 04:54:12 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005532586.mfohsb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 23 04:54:12 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:12 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:12 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 04:54:12 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:13 localhost nova_compute[281952]: 2025-11-23 09:54:13.699 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 04:54:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 04:54:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 04:54:14 localhost podman[307117]: 2025-11-23 09:54:14.020505339 +0000 UTC m=+0.076373837 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 04:54:14 localhost podman[307117]: 2025-11-23 09:54:14.034581103 +0000 UTC m=+0.090449601 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 04:54:14 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 04:54:14 localhost podman[307116]: 2025-11-23 09:54:14.114049975 +0000 UTC m=+0.169916603 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 23 04:54:14 localhost podman[307116]: 2025-11-23 09:54:14.12816019 +0000 UTC m=+0.184026828 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true)
Nov 23 04:54:14 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 04:54:15 localhost nova_compute[281952]: 2025-11-23 09:54:15.424 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:54:16 localhost ceph-mon[300199]: Saving service mon spec with placement label:mon
Nov 23 04:54:16 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:16 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 04:54:16 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:16 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:16 localhost ceph-mon[300199]: Reconfiguring mon.np0005532584 (monmap changed)...
Nov 23 04:54:16 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 23 04:54:16 localhost ceph-mon[300199]: Reconfiguring daemon mon.np0005532584 on np0005532584.localdomain
Nov 23 04:54:16 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb'
Nov 23 04:54:16 localhost podman[307225]:
Nov 23 04:54:16 localhost podman[307225]: 2025-11-23 09:54:16.987959427 +0000 UTC m=+0.075652785 container create cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_williamson, name=rhceph, GIT_CLEAN=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container)
Nov 23 04:54:17 localhost systemd[1]: Started libpod-conmon-cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860.scope.
Nov 23 04:54:17 localhost systemd[1]: Started libcrun container.
Nov 23 04:54:17 localhost podman[307225]: 2025-11-23 09:54:16.957551749 +0000 UTC m=+0.045245167 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 23 04:54:17 localhost podman[307225]: 2025-11-23 09:54:17.056902794 +0000 UTC m=+0.144596142 container init cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_williamson, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.openshift.expose-services=, release=553, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 23 04:54:17 localhost podman[307225]: 2025-11-23 09:54:17.067766939 +0000 UTC m=+0.155460287 container start cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_williamson, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, maintainer=Guillaume Abrioux , vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., architecture=x86_64, ceph=True, name=rhceph, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7)
Nov 23 04:54:17 localhost podman[307225]: 2025-11-23 09:54:17.068195893 +0000 UTC m=+0.155889281 container attach cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_williamson, name=rhceph, GIT_BRANCH=main, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, distribution-scope=public, com.redhat.component=rhceph-container, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.openshift.expose-services=)
Nov 23 04:54:17 localhost determined_williamson[307240]: 167 167
Nov 23 04:54:17 localhost systemd[1]: libpod-cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860.scope: Deactivated successfully.
Nov 23 04:54:17 localhost podman[307225]: 2025-11-23 09:54:17.071778603 +0000 UTC m=+0.159472051 container died cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_williamson, RELEASE=main, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , description=Red Hat Ceph 
Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, distribution-scope=public) Nov 23 04:54:17 localhost podman[307245]: 2025-11-23 09:54:17.169306332 +0000 UTC m=+0.086213401 container remove cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_williamson, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , ceph=True, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, architecture=x86_64, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 23 04:54:17 localhost systemd[1]: libpod-conmon-cbc9b881cc36a097741558f7d1ee2bdcd5d5c56a2889e91e608c84addec96860.scope: Deactivated successfully. Nov 23 04:54:17 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:17 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:17 localhost ceph-mon[300199]: Reconfiguring mon.np0005532585 (monmap changed)... 
Nov 23 04:54:17 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:54:17 localhost ceph-mon[300199]: Reconfiguring daemon mon.np0005532585 on np0005532585.localdomain Nov 23 04:54:17 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:17 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:17 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 23 04:54:17 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:54:17 localhost systemd[1]: var-lib-containers-storage-overlay-6ef8a841a0ee9a462d743fb83008423335eef41a1bcc46f3912cc9f5eac26fdb-merged.mount: Deactivated successfully. Nov 23 04:54:18 localhost ceph-mon[300199]: Reconfiguring mon.np0005532586 (monmap changed)... 
Nov 23 04:54:18 localhost ceph-mon[300199]: Reconfiguring daemon mon.np0005532586 on np0005532586.localdomain Nov 23 04:54:18 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:18 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:54:19 localhost sshd[307262]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:54:20 localhost nova_compute[281952]: 2025-11-23 09:54:20.428 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:54:22 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:54:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:54:25 localhost podman[307264]: 2025-11-23 09:54:25.040572013 +0000 UTC m=+0.092792934 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 04:54:25 localhost podman[307264]: 2025-11-23 09:54:25.08132797 +0000 UTC m=+0.133548951 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:54:25 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:54:25 localhost nova_compute[281952]: 2025-11-23 09:54:25.430 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:54:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:54:27 localhost systemd[1]: tmp-crun.UfsKap.mount: Deactivated successfully. 
Nov 23 04:54:27 localhost podman[307282]: 2025-11-23 09:54:27.03036707 +0000 UTC m=+0.086327354 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:54:27 localhost podman[307282]: 2025-11-23 09:54:27.045343162 +0000 UTC m=+0.101303456 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:54:27 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0. Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.211359) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22 Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891667211401, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1666, "num_deletes": 255, "total_data_size": 5704233, "memory_usage": 5979112, "flush_reason": "Manual Compaction"} Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891667228512, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 3362442, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15314, "largest_seqno": 16975, "table_properties": {"data_size": 3355375, "index_size": 3892, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 18742, "raw_average_key_size": 22, "raw_value_size": 3339894, "raw_average_value_size": 4004, "num_data_blocks": 170, "num_entries": 834, "num_filter_entries": 
834, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891633, "oldest_key_time": 1763891633, "file_creation_time": 1763891667, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}} Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 17188 microseconds, and 7932 cpu microseconds. Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.228549) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 3362442 bytes OK Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.228570) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.232642) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.232670) EVENT_LOG_v1 {"time_micros": 1763891667232663, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.232693) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 5695784, prev total WAL file size 5696533, number of live WAL files 2. Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.234020) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373536' seq:72057594037927935, type:22 .. 
'6D6772737461740034303038' seq:0, type:0; will stop at (end) Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(3283KB)], [21(16MB)] Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891667234084, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 20587021, "oldest_snapshot_seqno": -1} Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 12208 keys, 18323804 bytes, temperature: kUnknown Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891667319029, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 18323804, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18254647, "index_size": 37568, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30533, "raw_key_size": 325777, "raw_average_key_size": 26, "raw_value_size": 18047217, "raw_average_value_size": 1478, "num_data_blocks": 1437, "num_entries": 12208, "num_filter_entries": 12208, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891667, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.319402) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 18323804 bytes Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.321362) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 241.9 rd, 215.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 16.4 +0.0 blob) out(17.5 +0.0 blob), read-write-amplify(11.6) write-amplify(5.4) OK, records in: 12736, records dropped: 528 output_compression: NoCompression Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.321391) EVENT_LOG_v1 {"time_micros": 1763891667321378, "job": 10, "event": "compaction_finished", "compaction_time_micros": 85092, "compaction_time_cpu_micros": 52007, "output_level": 6, "num_output_files": 1, "total_output_size": 18323804, "num_input_records": 12736, "num_output_records": 12208, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005532585/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891667322103, "job": 10, "event": "table_file_deletion", "file_number": 23} Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891667324725, "job": 10, "event": "table_file_deletion", "file_number": 21} Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.233955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.324828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.324835) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.324837) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.324839) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:54:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:54:27.324841) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:54:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:54:29 
localhost openstack_network_exporter[242668]: ERROR 09:54:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:54:29 localhost openstack_network_exporter[242668]: ERROR 09:54:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:54:29 localhost openstack_network_exporter[242668]: ERROR 09:54:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:54:29 localhost openstack_network_exporter[242668]: ERROR 09:54:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:54:29 localhost openstack_network_exporter[242668]: Nov 23 04:54:29 localhost openstack_network_exporter[242668]: ERROR 09:54:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:54:29 localhost openstack_network_exporter[242668]: Nov 23 04:54:30 localhost nova_compute[281952]: 2025-11-23 09:54:30.432 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:54:32 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:54:35 localhost nova_compute[281952]: 2025-11-23 09:54:35.435 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:54:35 localhost nova_compute[281952]: 2025-11-23 09:54:35.436 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:54:35 localhost nova_compute[281952]: 2025-11-23 09:54:35.437 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:54:35 localhost nova_compute[281952]: 2025-11-23 09:54:35.437 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:54:35 localhost nova_compute[281952]: 2025-11-23 09:54:35.437 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:54:35 localhost nova_compute[281952]: 2025-11-23 09:54:35.439 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:54:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:54:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:54:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:54:36 localhost systemd[1]: tmp-crun.2CCjJg.mount: Deactivated successfully. 
Nov 23 04:54:36 localhost podman[307307]: 2025-11-23 09:54:36.039997892 +0000 UTC m=+0.092923047 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 23 04:54:36 localhost podman[307307]: 2025-11-23 09:54:36.078193581 +0000 UTC m=+0.131118666 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 23 04:54:36 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:54:36 localhost podman[307305]: 2025-11-23 09:54:36.1310051 +0000 UTC m=+0.187081832 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:54:36 localhost podman[307306]: 2025-11-23 09:54:36.181715685 +0000 UTC m=+0.234505786 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 04:54:36 localhost podman[307306]: 2025-11-23 09:54:36.19129472 +0000 UTC m=+0.244084821 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, 
org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 04:54:36 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 04:54:36 localhost podman[307305]: 2025-11-23 09:54:36.249396413 +0000 UTC m=+0.305473165 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:54:36 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:54:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:54:40 localhost nova_compute[281952]: 2025-11-23 09:54:40.439 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:54:41 localhost podman[240668]: time="2025-11-23T09:54:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:54:41 localhost podman[240668]: @ - - [23/Nov/2025:09:54:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 04:54:41 localhost podman[240668]: @ - - [23/Nov/2025:09:54:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18714 "" "Go-http-client/1.1" Nov 23 04:54:42 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:54:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:54:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:54:45 localhost podman[307368]: 2025-11-23 09:54:45.034060815 +0000 UTC m=+0.077317357 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:54:45 localhost podman[307367]: 2025-11-23 09:54:45.090358771 +0000 UTC m=+0.134427358 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 23 04:54:45 localhost podman[307368]: 2025-11-23 09:54:45.099161273 +0000 UTC m=+0.142417765 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': 
['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:54:45 localhost podman[307367]: 2025-11-23 09:54:45.106536231 +0000 UTC m=+0.150604868 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd) Nov 23 04:54:45 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 04:54:45 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:54:45 localhost nova_compute[281952]: 2025-11-23 09:54:45.442 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:54:47 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:54:50 localhost nova_compute[281952]: 2025-11-23 09:54:50.444 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:54:52 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:54:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "status", "format": "json"} v 0) Nov 23 04:54:54 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : 
from='client.? 172.18.0.200:0/1282514036' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch Nov 23 04:54:55 localhost nova_compute[281952]: 2025-11-23 09:54:55.448 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:54:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:54:56 localhost podman[307409]: 2025-11-23 09:54:56.039157975 +0000 UTC m=+0.097111206 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute) Nov 23 04:54:56 localhost podman[307409]: 2025-11-23 09:54:56.050175242 +0000 UTC m=+0.108128493 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0) Nov 23 04:54:56 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:54:57 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:54:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:54:58 localhost podman[307428]: 2025-11-23 09:54:58.03345967 +0000 UTC m=+0.085166631 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 04:54:58 localhost podman[307428]: 2025-11-23 09:54:58.047325004 +0000 UTC m=+0.099031955 container exec_died 
1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:54:58 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. 
Nov 23 04:54:58 localhost sshd[307451]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:54:59 localhost openstack_network_exporter[242668]: ERROR 09:54:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:54:59 localhost openstack_network_exporter[242668]: ERROR 09:54:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:54:59 localhost openstack_network_exporter[242668]: ERROR 09:54:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:54:59 localhost openstack_network_exporter[242668]: ERROR 09:54:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:54:59 localhost openstack_network_exporter[242668]: Nov 23 04:54:59 localhost openstack_network_exporter[242668]: ERROR 09:54:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:54:59 localhost openstack_network_exporter[242668]: Nov 23 04:55:00 localhost nova_compute[281952]: 2025-11-23 09:55:00.449 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0. 
Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.248088) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25 Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891702248128, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 637, "num_deletes": 256, "total_data_size": 552338, "memory_usage": 564184, "flush_reason": "Manual Compaction"} Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891702254044, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 353856, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16980, "largest_seqno": 17612, "table_properties": {"data_size": 350917, "index_size": 922, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 6845, "raw_average_key_size": 18, "raw_value_size": 344937, "raw_average_value_size": 927, "num_data_blocks": 41, "num_entries": 372, "num_filter_entries": 372, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; 
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891667, "oldest_key_time": 1763891667, "file_creation_time": 1763891702, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 6009 microseconds, and 2283 cpu microseconds. Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.254096) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 353856 bytes OK Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.254117) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.257195) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.257218) EVENT_LOG_v1 {"time_micros": 1763891702257211, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.257237) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 548787, prev total WAL file size 549111, number of 
live WAL files 2. Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.258040) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373733' seq:72057594037927935, type:22 .. '6C6F676D0034303235' seq:0, type:0; will stop at (end) Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(345KB)], [24(17MB)] Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891702258141, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 18677660, "oldest_snapshot_seqno": -1} Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 12055 keys, 18578998 bytes, temperature: kUnknown Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891702347251, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 18578998, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18509741, "index_size": 38052, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30149, "raw_key_size": 323539, "raw_average_key_size": 26, "raw_value_size": 18303804, "raw_average_value_size": 
1518, "num_data_blocks": 1455, "num_entries": 12055, "num_filter_entries": 12055, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891702, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}} Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.347579) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 18578998 bytes Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.349749) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 209.4 rd, 208.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 17.5 +0.0 blob) out(17.7 +0.0 blob), read-write-amplify(105.3) write-amplify(52.5) OK, records in: 12580, records dropped: 525 output_compression: NoCompression Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.349779) EVENT_LOG_v1 {"time_micros": 1763891702349766, "job": 12, "event": "compaction_finished", "compaction_time_micros": 89184, "compaction_time_cpu_micros": 49321, "output_level": 6, "num_output_files": 1, "total_output_size": 18578998, "num_input_records": 12580, "num_output_records": 12055, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891702349987, "job": 12, "event": "table_file_deletion", "file_number": 26} Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891702352404, 
"job": 12, "event": "table_file_deletion", "file_number": 24} Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.257879) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.352446) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.352453) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.352457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.352461) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:55:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:02.352465) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:55:02 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:55:05 localhost nova_compute[281952]: 2025-11-23 09:55:05.452 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:55:06 localhost nova_compute[281952]: 2025-11-23 09:55:06.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:55:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:55:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:55:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:55:07 localhost podman[307453]: 2025-11-23 09:55:07.031214495 +0000 UTC m=+0.083379976 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Nov 23 04:55:07 localhost systemd[1]: tmp-crun.Wzehwb.mount: Deactivated successfully. 
Nov 23 04:55:07 localhost podman[307455]: 2025-11-23 09:55:07.098878661 +0000 UTC m=+0.144382658 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm) Nov 23 04:55:07 localhost podman[307455]: 2025-11-23 09:55:07.111132985 +0000 UTC m=+0.156636972 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Nov 23 04:55:07 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:55:07 localhost podman[307453]: 2025-11-23 09:55:07.128078862 +0000 UTC m=+0.180244303 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_controller) Nov 23 04:55:07 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:55:07 localhost podman[307454]: 2025-11-23 09:55:07.181982348 +0000 UTC m=+0.230690744 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 23 04:55:07 localhost nova_compute[281952]: 2025-11-23 09:55:07.209 281956 
DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:55:07 localhost podman[307454]: 2025-11-23 09:55:07.210779847 +0000 UTC m=+0.259488243 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible) Nov 23 04:55:07 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:55:07 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) Nov 23 04:55:07 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/3373261952' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch Nov 23 04:55:07 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:55:08 localhost nova_compute[281952]: 2025-11-23 09:55:08.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:55:08 localhost nova_compute[281952]: 2025-11-23 09:55:08.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:55:09 localhost nova_compute[281952]: 2025-11-23 09:55:09.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:55:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:55:09.293 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:55:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:55:09.294 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:55:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:55:09.294 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:55:10 localhost nova_compute[281952]: 2025-11-23 09:55:10.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:55:10 localhost nova_compute[281952]: 2025-11-23 09:55:10.216 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:55:10 localhost nova_compute[281952]: 2025-11-23 09:55:10.216 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:55:10 localhost nova_compute[281952]: 2025-11-23 09:55:10.454 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:55:10 localhost nova_compute[281952]: 2025-11-23 09:55:10.551 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:55:10 localhost nova_compute[281952]: 2025-11-23 09:55:10.552 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:55:10 localhost nova_compute[281952]: 2025-11-23 09:55:10.552 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:55:10 localhost nova_compute[281952]: 2025-11-23 09:55:10.553 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.810 
12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.811 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.826 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.827 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '461bb781-f9b6-434c-bdf2-2c827cf6f236', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.812103', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7ff75bac-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11653.989758826, 'message_signature': 'f84887a25236d38fb7de0f981be81343c8558a1303a98ef082786858d20eab62'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.812103', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7ff777ae-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11653.989758826, 'message_signature': '8e9611ef5810158105432a388a73212eda1bfa78508e457cb89afec1c05e5ad5'}]}, 'timestamp': '2025-11-23 09:55:10.828590', '_unique_id': '3cc11a1d5d4d49638e049331a9cdde1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:55:10.830 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:55:10.830 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.830 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.832 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.837 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '34a252d6-355d-48d8-ac2f-ae68693362ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.832844', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '7ff8fe80-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': '598b50ca2630bd815a021ae52fba1ebc367a08e76ec46ea5a30522ff0e2a4b40'}]}, 'timestamp': '2025-11-23 09:55:10.838631', '_unique_id': '8c2e1d1fa30f445ca8a61a23385b574f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.839 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.841 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.872 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3899128-c954-4aa3-99c5-ad1c37a495c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.841880', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7ffe4606-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': '4d6f467d8c67d1bea9819ae4b9481a519b56cf41a716da1612bfb3f61d11f2ed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.841880', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7ffe58ee-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': 'a16d72daa859c35470316b0f443f410614435cd2258b794c6100e7cc9d6638ba'}]}, 'timestamp': '2025-11-23 09:55:10.873664', '_unique_id': '40189cabbf1940daadee49af75842e58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.875 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.876 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.876 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9c4038e0-79f8-4a29-b2c7-0693a1321e43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.876680', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '7ffee30e-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': '395f3356e5c33b155a5d5a5f1497582977d9d49fcfbe818d50ab0ede2ac132aa'}]}, 'timestamp': '2025-11-23 09:55:10.877200', '_unique_id': '9531847849064bdf9802a69deca004a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.879 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e04464d-9419-4246-80f4-fdfdeeeb7730', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.879538', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7fff50f0-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': 'b69935fd959733e0246db9c4914b8f73d0369b1651a722433558d3fa470a98ab'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.879538', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7fff6234-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': '899acac412ca443269aa5c213106b7008f58633f6e5d7103738ae73eea3ad532'}]}, 'timestamp': '2025-11-23 09:55:10.880413', '_unique_id': 'd4d596de2e7a45c6afff00b569719dbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:55:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.882 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4c82e02d-b15b-41ec-a9b3-28fa7bbff000', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.882781', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7fffd034-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': 'b92cb3120b1ca125b0418081139e938afe3b13b200711748a5849912a5be2c84'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.882781', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7fffdfc0-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': '80cb80ea9675d074bfbb4277197af0fdc72823d3af87e8d96cab6b02812c08dc'}]}, 'timestamp': '2025-11-23 09:55:10.883623', '_unique_id': '34808c7f1c3a4554864de3b3ed581a66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.884 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.885 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1b18c5d-64f4-4f22-b10e-203839bcff08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.885800', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '80004622-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11653.989758826, 'message_signature': '7ff59900dda9a6de6ab19d3057cfebe612762f446e743efcb65fe1411f1e312e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': 
'2025-11-23T09:55:10.885800', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8000559a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11653.989758826, 'message_signature': '1eb11d59957b81bc6831bd0e276aa628fe656f5003cda6b2ae45a1ec04fa9917'}]}, 'timestamp': '2025-11-23 09:55:10.886635', '_unique_id': '8b02285fb2a641698168cb1ca8bc19e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.887 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.888 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.888 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.888 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '719860d8-13dd-4c1f-8e12-c7d91a1497ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.888958', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8000c08e-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': 'eabfc646825daa9caad977557b7892cb49a7c8d296214cc37e17f07c38be8049'}]}, 'timestamp': '2025-11-23 09:55:10.889408', '_unique_id': 'c8170143589943a49e8a58c68b7b91de'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.890 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.891 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.891 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.891 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e42eb5e-c21b-4638-92bf-4620c0b5d1c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.891485', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '800122a4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11653.989758826, 'message_signature': '385c7a99f94c8e713637f12eb17a38048755d78444501dd38fd98e01827e012c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.891485', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8001333e-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11653.989758826, 'message_signature': '57e7b57841082ed6c842b33d31ba38df9af571e888ff14406a9c968372ac3dcb'}]}, 'timestamp': '2025-11-23 09:55:10.892311', '_unique_id': 'a8cbcc9cb8cc43478114a7d8486f7a0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.893 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.894 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.894 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '184f8b34-67eb-4f73-8e99-eb18b96a8d8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.894636', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8001a35a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': '7abfc725a105fbdabb920837b700fccc3f9843f2257b06f85ac2d68b1322ae6e'}]}, 'timestamp': '2025-11-23 09:55:10.895288', '_unique_id': '311195af1dea41b68d4cb9e0f45d6c0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.914 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 14710000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1455f2f3-8980-4a0f-8dba-d1defc9c4480', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14710000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:55:10.897729', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8004a7ee-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.091759399, 'message_signature': '24273712293c572f6644f815065cb9cb2a1b981779f78cd277c8599baa56dd85'}]}, 'timestamp': '2025-11-23 09:55:10.915126', '_unique_id': '009b091f9a4d4e40ae9577fcaccb2da0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py",
line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.916 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f040e79e-75b7-426b-8d56-88b1ee6d8ae8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.918112', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '80053af6-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': 'e8b11e2bd789c8285d2fc6dbd842111c3202814d1552a4e7cac20631447040ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.918112', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '80054ef6-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': '18a7b913037ab2c1e36fdcace057519e6b3a2d0f28e0f7d8adf8c25cd9ff8c5b'}]}, 'timestamp': '2025-11-23 09:55:10.919245', '_unique_id': '1b9cab1985714b58b1cf15b25e4a19f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.920 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.921 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd3c381f-7e7c-4b2c-92dd-16cb3925a96e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.921593', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8005bbd4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': 'a38f5baa4b1ece17e7ea45a34a1c9db82113219c21b47b4d34732405c13e87b3'}]}, 'timestamp': '2025-11-23 09:55:10.922087', '_unique_id': 'e91e1bd4b54240f1a179a65c60e01e4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 
04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.924 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c40d189f-985a-40e0-83d1-c047cb28f92a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.924223', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '80062254-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': '1c92ec8f6c11eeddbe13aef1198d1a926f44f026c2d9ec1023fa94927180a1a0'}]}, 'timestamp': '2025-11-23 09:55:10.924676', '_unique_id': 'ff63f47aabf24923a56a3124f4a338d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.925 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.926 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4c80c572-4a41-4588-9846-1d155790a092', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.926770', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8006873a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': '79e492b1420ae37f46b311c98370e8b6ad7ffd63e9d3b637a19fa28a530633f8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.926770', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '80069900-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': '722e3627b67b49896fe35a925cdb0e1513d501f7439899f9c633ed349c88c6c6'}]}, 'timestamp': '2025-11-23 09:55:10.927694', '_unique_id': 'cba96551f4ce459a8c9f1441ffc1d5ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.928 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.930 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd8f2e85a-323b-4cfd-aff6-a047d550abfb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.930065', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '800709c6-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': 'b7d42dac3d68fd4daaf247634466a145af0b18f6902767c09646ee0d2ccf993b'}]}, 'timestamp': '2025-11-23 09:55:10.930651', '_unique_id': 'c5ddd3d033d74ab58171cec8ad834260'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.931 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.932 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.932 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fe48602-7a45-4110-98c0-0102690e1c14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:55:10.932823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8007767c-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.091759399, 
'message_signature': '780232b365cb150951fc3b9c6f802b7aaa54a73dcea8ffb2f9567acc5dcf6f55'}]}, 'timestamp': '2025-11-23 09:55:10.933448', '_unique_id': '6f6a0fab7b154176b33935662afd3b69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.934 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.935 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.935 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.935 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b54372d9-3976-489c-8cfc-05a03dbea9fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.935876', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8007ee4a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': '998687858efaf0a125b9f253706ec4313fc913817cb69424b6cfd5f2301233bd'}]}, 'timestamp': '2025-11-23 09:55:10.936459', '_unique_id': '5967c00af50143b194e9e39451a3f9e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:55:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.937 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.938 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.939 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1aad9812-4cf8-4a4d-9a84-a9320043aa53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.938969', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '800865b4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': '06ef85120bfadd16b1dc4f84f67a1326b0aed6cdeabff34e8abc99edbeae73f6'}]}, 'timestamp': '2025-11-23 09:55:10.939558', '_unique_id': 'c6745011888e4ed6adde6aa64ddc4289'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.940 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.941 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.941 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5890a598-fbaa-4333-8bdd-67bad04f4388', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:55:10.941821', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '8008d67a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.010616202, 'message_signature': '30ecf30e965ad68ce593f9e458e14fe8859664fe8ba6e2e04e02834049b85fb8'}]}, 'timestamp': '2025-11-23 09:55:10.942479', '_unique_id': 'f3237ba01a2646c28903efefc6c656a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.943 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.944 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.944 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.945 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '891913eb-9aa2-4340-8bf3-8f669c72ab53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:55:10.944649', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '800943ee-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': '11b15efb534a4aa162fb2b516b6d3414bd598984c9d93cec42123bc92285ef09'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:55:10.944649', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '80095848-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11654.019589676, 'message_signature': 'f00ec1cf9a88335fcc99692fc4f4d8129975a4a92f28148c81edcb3891985c5e'}]}, 'timestamp': '2025-11-23 09:55:10.945707', '_unique_id': '57523b6a4bef47d5bc34af1e5eb110eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:55:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:55:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:55:10.946 12 ERROR oslo_messaging.notify.messaging Nov 23 04:55:11 localhost nova_compute[281952]: 2025-11-23 09:55:11.049 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:55:11 localhost nova_compute[281952]: 2025-11-23 09:55:11.086 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:55:11 localhost nova_compute[281952]: 2025-11-23 09:55:11.086 281956 DEBUG nova.compute.manager [None 
req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:55:11 localhost podman[240668]: time="2025-11-23T09:55:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:55:11 localhost podman[240668]: @ - - [23/Nov/2025:09:55:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 04:55:11 localhost podman[240668]: @ - - [23/Nov/2025:09:55:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18715 "" "Go-http-client/1.1" Nov 23 04:55:12 localhost nova_compute[281952]: 2025-11-23 09:55:12.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:55:12 localhost nova_compute[281952]: 2025-11-23 09:55:12.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:55:12 localhost nova_compute[281952]: 2025-11-23 09:55:12.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:55:12 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:55:13 
localhost nova_compute[281952]: 2025-11-23 09:55:13.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:55:13 localhost nova_compute[281952]: 2025-11-23 09:55:13.231 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:55:13 localhost nova_compute[281952]: 2025-11-23 09:55:13.231 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:55:13 localhost nova_compute[281952]: 2025-11-23 09:55:13.232 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:55:13 localhost nova_compute[281952]: 2025-11-23 09:55:13.232 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:55:13 localhost nova_compute[281952]: 2025-11-23 09:55:13.232 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] 
Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:55:13 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:55:13 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/466749514' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:55:13 localhost nova_compute[281952]: 2025-11-23 09:55:13.688 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:55:13 localhost nova_compute[281952]: 2025-11-23 09:55:13.764 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:55:13 localhost nova_compute[281952]: 2025-11-23 09:55:13.764 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:55:13 localhost nova_compute[281952]: 2025-11-23 09:55:13.966 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:55:13 localhost nova_compute[281952]: 2025-11-23 09:55:13.968 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11701MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:55:13 localhost nova_compute[281952]: 2025-11-23 09:55:13.969 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:55:13 localhost nova_compute[281952]: 2025-11-23 09:55:13.969 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:55:14 localhost nova_compute[281952]: 2025-11-23 09:55:14.069 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:55:14 localhost nova_compute[281952]: 2025-11-23 09:55:14.070 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:55:14 localhost nova_compute[281952]: 2025-11-23 09:55:14.070 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:55:14 localhost nova_compute[281952]: 2025-11-23 09:55:14.122 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:55:14 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:55:14 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1799410222' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:55:14 localhost nova_compute[281952]: 2025-11-23 09:55:14.626 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:55:14 localhost nova_compute[281952]: 2025-11-23 09:55:14.631 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:55:14 localhost nova_compute[281952]: 2025-11-23 09:55:14.646 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:55:14 localhost nova_compute[281952]: 2025-11-23 09:55:14.647 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:55:14 localhost nova_compute[281952]: 2025-11-23 09:55:14.648 281956 DEBUG 
oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:55:15 localhost nova_compute[281952]: 2025-11-23 09:55:15.457 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:55:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:55:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:55:16 localhost systemd[1]: tmp-crun.55sCPn.mount: Deactivated successfully. Nov 23 04:55:16 localhost podman[307560]: 2025-11-23 09:55:16.034638103 +0000 UTC m=+0.085453671 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', 
'--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 04:55:16 localhost podman[307559]: 2025-11-23 09:55:16.082578956 +0000 UTC m=+0.135780667 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator 
team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 04:55:16 localhost podman[307560]: 2025-11-23 09:55:16.099697169 +0000 UTC m=+0.150512787 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:55:16 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:55:16 localhost podman[307559]: 2025-11-23 09:55:16.122470164 +0000 UTC m=+0.175671825 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd) Nov 23 04:55:16 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:55:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:55:18 localhost nova_compute[281952]: 2025-11-23 09:55:18.644 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:55:19 localhost ceph-mon[300199]: from='mgr.26494 172.18.0.108:0/345283227' entity='mgr.np0005532586.thmvqb' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:55:19 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:55:20 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 23 04:55:20 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/1186317306' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 23 04:55:20 localhost nova_compute[281952]: 2025-11-23 09:55:20.461 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:55:21 localhost ceph-mon[300199]: from='mgr.26494 ' entity='mgr.np0005532586.thmvqb' Nov 23 04:55:22 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:55:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mgr fail"} v 0) Nov 23 04:55:23 localhost ceph-mon[300199]: log_channel(audit) log [INF] : from='client.? 
172.18.0.200:0/1136241170' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 23 04:55:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 e87: 6 total, 6 up, 6 in Nov 23 04:55:23 localhost systemd[1]: session-70.scope: Deactivated successfully. Nov 23 04:55:23 localhost systemd[1]: session-70.scope: Consumed 7.453s CPU time. Nov 23 04:55:23 localhost systemd-logind[761]: Session 70 logged out. Waiting for processes to exit. Nov 23 04:55:23 localhost systemd-logind[761]: Removed session 70. Nov 23 04:55:23 localhost sshd[307688]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:55:23 localhost systemd-logind[761]: New session 71 of user ceph-admin. Nov 23 04:55:23 localhost systemd[1]: Started Session 71 of User ceph-admin. Nov 23 04:55:23 localhost ceph-mon[300199]: from='client.? 172.18.0.200:0/1136241170' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 23 04:55:23 localhost ceph-mon[300199]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 23 04:55:23 localhost ceph-mon[300199]: Activating manager daemon np0005532584.naxwxy Nov 23 04:55:23 localhost ceph-mon[300199]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 23 04:55:23 localhost ceph-mon[300199]: Manager daemon np0005532584.naxwxy is now available Nov 23 04:55:23 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532584.naxwxy/mirror_snapshot_schedule"} : dispatch Nov 23 04:55:23 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005532584.naxwxy/trash_purge_schedule"} : dispatch Nov 23 04:55:24 localhost podman[307800]: 2025-11-23 09:55:24.719879457 +0000 UTC m=+0.094282240 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, vcs-type=git, release=553, name=rhceph, CEPH_POINT_RELEASE=, ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, distribution-scope=public, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, RELEASE=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=) Nov 23 04:55:24 localhost podman[307800]: 
2025-11-23 09:55:24.827583765 +0000 UTC m=+0.201986488 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, name=rhceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 23 04:55:25 localhost nova_compute[281952]: 2025-11-23 09:55:25.464 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:55:25 localhost nova_compute[281952]: 2025-11-23 09:55:25.470 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:55:26 localhost ceph-mon[300199]: [23/Nov/2025:09:55:25] ENGINE Bus STARTING
Nov 23 04:55:26 localhost ceph-mon[300199]: [23/Nov/2025:09:55:25] ENGINE Serving on https://172.18.0.106:7150
Nov 23 04:55:26 localhost ceph-mon[300199]: [23/Nov/2025:09:55:25] ENGINE Client ('172.18.0.106', 60482) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 04:55:26 localhost ceph-mon[300199]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Nov 23 04:55:26 localhost ceph-mon[300199]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Nov 23 04:55:26 localhost ceph-mon[300199]: Cluster is now healthy
Nov 23 04:55:26 localhost ceph-mon[300199]: [23/Nov/2025:09:55:25] ENGINE Serving on http://172.18.0.106:8765
Nov 23 04:55:26 localhost ceph-mon[300199]: [23/Nov/2025:09:55:25] ENGINE Bus STARTED
Nov 23 04:55:26 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:26 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:26 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:26 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:26 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:26 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 04:55:26 localhost podman[308000]: 2025-11-23 09:55:26.507019837 +0000 UTC m=+0.102596314 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3,
org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm)
Nov 23 04:55:26 localhost podman[308000]: 2025-11-23 09:55:26.52416991 +0000 UTC m=+0.119746417 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 04:55:26 localhost
systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.283829) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891727283871, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 767, "num_deletes": 258, "total_data_size": 2031449, "memory_usage": 2157712, "flush_reason": "Manual Compaction"}
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891727292105, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 1315966, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17617, "largest_seqno": 18379, "table_properties": {"data_size": 1312190, "index_size": 1503, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 8763, "raw_average_key_size": 19, "raw_value_size": 1304214, "raw_average_value_size": 2829, "num_data_blocks": 62, "num_entries": 461, "num_filter_entries": 461, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter",
"column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891702, "oldest_key_time": 1763891702, "file_creation_time": 1763891727, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 8332 microseconds, and 3930 cpu microseconds.
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.292157) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 1315966 bytes OK
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.292180) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.293958) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.293983) EVENT_LOG_v1 {"time_micros": 1763891727293975, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.294005) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 2027190, prev total WAL file size 2027190, number of live WAL files 2.
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.294660) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031353237' seq:72057594037927935, type:22 ..
'6B760031373835' seq:0, type:0; will stop at (end)
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(1285KB)], [27(17MB)]
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891727294700, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 19894964, "oldest_snapshot_seqno": -1}
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 11969 keys, 18720388 bytes, temperature: kUnknown
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891727372765, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 18720388, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18652335, "index_size": 37040, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29957, "raw_key_size": 323417, "raw_average_key_size": 27, "raw_value_size": 18448355, "raw_average_value_size": 1541, "num_data_blocks": 1396, "num_entries": 11969, "num_filter_entries": 11969, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0;
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891727, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.373428) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 18720388 bytes
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.375734) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 254.3 rd, 239.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 17.7 +0.0 blob) out(17.9 +0.0 blob), read-write-amplify(29.3) write-amplify(14.2) OK, records in: 12516, records dropped: 547 output_compression: NoCompression
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.375820) EVENT_LOG_v1 {"time_micros": 1763891727375800, "job": 14, "event": "compaction_finished", "compaction_time_micros": 78246, "compaction_time_cpu_micros": 46871, "output_level": 6, "num_output_files": 1, "total_output_size": 18720388, "num_input_records": 12516, "num_output_records": 11969, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file
/var/lib/ceph/mon/ceph-np0005532585/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891727376434, "job": 14, "event": "table_file_deletion", "file_number": 29}
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891727379866, "job": 14, "event": "table_file_deletion", "file_number": 27}
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.294609) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.380018) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.380027) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.380030) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.380033) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 04:55:27 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:27.380036) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 04:55:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:27 localhost ceph-mon[300199]: from='mgr.44369
172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 23 04:55:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 23 04:55:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 23 04:55:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 23 04:55:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 23 04:55:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 23 04:55:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 04:55:27 localhost ceph-mon[300199]:
mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 04:55:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 04:55:28 localhost podman[308357]: 2025-11-23 09:55:28.244296615 +0000 UTC m=+0.082891281 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 04:55:28 localhost podman[308357]: 2025-11-23 09:55:28.283315496 +0000 UTC m=+0.121910192 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE':
'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Nov 23 04:55:28 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 04:55:28 localhost ceph-mon[300199]: Adjusting osd_memory_target on np0005532584.localdomain to 836.6M
Nov 23 04:55:28 localhost ceph-mon[300199]: Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 04:55:28 localhost ceph-mon[300199]: Adjusting osd_memory_target on np0005532586.localdomain to 836.6M
Nov 23 04:55:28 localhost ceph-mon[300199]: Adjusting osd_memory_target on np0005532585.localdomain to 836.6M
Nov 23 04:55:28 localhost ceph-mon[300199]: Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Nov 23 04:55:28 localhost ceph-mon[300199]: Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 23 04:55:28 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.conf
Nov 23 04:55:28 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.conf
Nov 23 04:55:28 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.conf
Nov 23 04:55:28 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:55:28 localhost ceph-mon[300199]: Updating
np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:55:28 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.conf
Nov 23 04:55:29 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 04:55:29 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 04:55:29 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 23 04:55:29 localhost ceph-mon[300199]: Updating np0005532584.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 04:55:29 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:29 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.687100) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891729687132, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 374, "num_deletes": 251, "total_data_size": 643977, "memory_usage": 651848, "flush_reason": "Manual Compaction"}
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891729690995, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 427045, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18384, "largest_seqno": 18753, "table_properties": {"data_size": 424623, "index_size": 533, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6885, "raw_average_key_size": 21, "raw_value_size": 419482, "raw_average_value_size": 1282, "num_data_blocks": 20, "num_entries": 327, "num_filter_entries": 327, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0;
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891727, "oldest_key_time": 1763891727, "file_creation_time": 1763891729, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 3945 microseconds, and 1628 cpu microseconds.
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.691042) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 427045 bytes OK
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.691064) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.694009) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.694033) EVENT_LOG_v1 {"time_micros": 1763891729694026, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.694054) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 641384, prev total WAL file size 657562, number of
live WAL files 2.
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.695610) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. '7061786F73003131373938' seq:0, type:0; will stop at (end)
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(417KB)], [30(17MB)]
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891729695679, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 19147433, "oldest_snapshot_seqno": -1}
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 11777 keys, 16438704 bytes, temperature: kUnknown
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891729771725, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 16438704, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16374064, "index_size": 34075, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29509, "raw_key_size": 320096, "raw_average_key_size": 27, "raw_value_size": 16175487,
"raw_average_value_size": 1373, "num_data_blocks": 1268, "num_entries": 11777, "num_filter_entries": 11777, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891729, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.772094) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 16438704 bytes
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.773869) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 251.5 rd, 215.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 17.9 +0.0 blob) out(15.7 +0.0 blob), read-write-amplify(83.3) write-amplify(38.5) OK, records in: 12296, records dropped: 519 output_compression: NoCompression
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.773952) EVENT_LOG_v1 {"time_micros": 1763891729773938, "job": 16, "event": "compaction_finished", "compaction_time_micros": 76136, "compaction_time_cpu_micros": 47638, "output_level": 6, "num_output_files": 1, "total_output_size": 16438704, "num_input_records": 12296, "num_output_records": 11777, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891729774155, "job": 16, "event": "table_file_deletion", "file_number": 32}
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891729776660,
"job": 16, "event": "table_file_deletion", "file_number": 30} Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.695464) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.776696) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.776702) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.776705) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.776707) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:55:29 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:55:29.776710) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:55:29 localhost openstack_network_exporter[242668]: ERROR 09:55:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:55:29 localhost openstack_network_exporter[242668]: ERROR 09:55:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:55:29 localhost openstack_network_exporter[242668]: ERROR 09:55:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:55:30 localhost openstack_network_exporter[242668]: ERROR 09:55:30 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:55:30 localhost openstack_network_exporter[242668]: Nov 23 04:55:30 localhost openstack_network_exporter[242668]: ERROR 09:55:30 
appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 04:55:30 localhost openstack_network_exporter[242668]:
Nov 23 04:55:30 localhost nova_compute[281952]: 2025-11-23 09:55:30.467 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:55:30 localhost nova_compute[281952]: 2025-11-23 09:55:30.471 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:55:30 localhost ceph-mon[300199]: Updating np0005532585.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 04:55:30 localhost ceph-mon[300199]: Updating np0005532586.localdomain:/var/lib/ceph/46550e70-79cb-5f55-bf6d-1204b97e083b/config/ceph.client.admin.keyring
Nov 23 04:55:30 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:30 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:30 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:30 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:30 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:30 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 04:55:30 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 04:55:32 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:55:34
localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:55:35 localhost nova_compute[281952]: 2025-11-23 09:55:35.470 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:55:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:55:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:55:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:55:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:55:38 localhost podman[308755]: 2025-11-23 09:55:38.037962479 +0000 UTC m=+0.088874635 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 04:55:38 localhost podman[308755]: 2025-11-23 09:55:38.045230001 +0000 UTC m=+0.096142127 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3) Nov 23 04:55:38 localhost podman[308756]: 2025-11-23 09:55:38.014041639 +0000 UTC m=+0.065915594 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9) Nov 23 04:55:38 localhost podman[308754]: 2025-11-23 09:55:38.081573491 +0000 UTC m=+0.133228639 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 04:55:38 localhost systemd[1]: 
9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:55:38 localhost podman[308756]: 2025-11-23 09:55:38.147706909 +0000 UTC m=+0.199580884 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git) Nov 23 04:55:38 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:55:38 localhost podman[308754]: 2025-11-23 09:55:38.187256027 +0000 UTC m=+0.238911195 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 04:55:38 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:55:40 localhost nova_compute[281952]: 2025-11-23 09:55:40.473 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:55:41 localhost podman[240668]: time="2025-11-23T09:55:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:55:41 localhost podman[240668]: @ - - [23/Nov/2025:09:55:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 04:55:41 localhost podman[240668]: @ - - [23/Nov/2025:09:55:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18725 "" "Go-http-client/1.1" Nov 23 04:55:41 localhost sshd[308817]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:55:42 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:55:45 localhost nova_compute[281952]: 2025-11-23 09:55:45.476 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:55:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. 
Nov 23 04:55:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:55:47 localhost podman[308819]: 2025-11-23 09:55:47.023854812 +0000 UTC m=+0.077167317 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 
04:55:47 localhost podman[308819]: 2025-11-23 09:55:47.034061234 +0000 UTC m=+0.087373679 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 23 04:55:47 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:55:47 localhost systemd[1]: tmp-crun.m9U4ra.mount: Deactivated successfully. Nov 23 04:55:47 localhost podman[308820]: 2025-11-23 09:55:47.083630407 +0000 UTC m=+0.132832016 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 04:55:47 localhost podman[308820]: 2025-11-23 09:55:47.09194699 +0000 UTC m=+0.141148649 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 04:55:47 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:55:47 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 04:55:50 localhost nova_compute[281952]: 2025-11-23 09:55:50.478 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:55:52 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 04:55:55 localhost nova_compute[281952]: 2025-11-23 09:55:55.481 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 04:55:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 04:55:57 localhost systemd[1]: tmp-crun.sxjkl3.mount: Deactivated successfully.
Nov 23 04:55:57 localhost podman[308861]: 2025-11-23 09:55:57.015858521 +0000 UTC m=+0.077326642 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 23 04:55:57 localhost podman[308861]: 2025-11-23 09:55:57.027507346 +0000 UTC m=+0.088975407 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 04:55:57 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 04:55:57 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:55:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:55:59 localhost podman[308880]: 2025-11-23 09:55:59.003865013 +0000 UTC m=+0.060914830 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:55:59 localhost podman[308880]: 2025-11-23 09:55:59.03618314 +0000 UTC m=+0.093232897 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:55:59 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:55:59 localhost openstack_network_exporter[242668]: ERROR 09:55:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:55:59 localhost openstack_network_exporter[242668]: ERROR 09:55:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:55:59 localhost openstack_network_exporter[242668]: ERROR 09:55:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:55:59 localhost openstack_network_exporter[242668]: ERROR 09:55:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:55:59 localhost openstack_network_exporter[242668]: Nov 23 04:55:59 localhost openstack_network_exporter[242668]: ERROR 09:55:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:55:59 localhost openstack_network_exporter[242668]: Nov 23 04:56:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 04:56:00 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1817100876' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 04:56:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 04:56:00 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1817100876' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 04:56:00 localhost nova_compute[281952]: 2025-11-23 09:56:00.484 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:56:02 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:56:04 localhost sshd[308904]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:56:05 localhost nova_compute[281952]: 2025-11-23 09:56:05.486 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:56:07 localhost nova_compute[281952]: 2025-11-23 09:56:07.229 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:56:07 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:56:08 localhost nova_compute[281952]: 2025-11-23 09:56:08.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:56:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:56:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:56:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:56:09 localhost systemd[1]: tmp-crun.3wPmKs.mount: Deactivated successfully. Nov 23 04:56:09 localhost podman[308906]: 2025-11-23 09:56:09.02106616 +0000 UTC m=+0.078843818 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 04:56:09 localhost podman[308907]: 2025-11-23 09:56:09.075827752 +0000 UTC m=+0.129944759 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 23 04:56:09 localhost podman[308907]: 2025-11-23 09:56:09.10916601 +0000 UTC m=+0.163283017 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 
23 04:56:09 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:56:09 localhost podman[308906]: 2025-11-23 09:56:09.159195486 +0000 UTC m=+0.216973224 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 04:56:09 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:56:09 localhost podman[308908]: 2025-11-23 09:56:09.235705483 +0000 UTC m=+0.286652763 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 23 04:56:09 localhost podman[308908]: 2025-11-23 09:56:09.251454083 +0000 UTC m=+0.302401413 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter) Nov 23 04:56:09 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:56:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:56:09.295 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:56:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:56:09.295 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:56:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:56:09.296 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:56:10 localhost systemd[1]: tmp-crun.tEtryB.mount: Deactivated successfully. 
Nov 23 04:56:10 localhost nova_compute[281952]: 2025-11-23 09:56:10.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:56:10 localhost nova_compute[281952]: 2025-11-23 09:56:10.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:56:10 localhost nova_compute[281952]: 2025-11-23 09:56:10.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:56:10 localhost nova_compute[281952]: 2025-11-23 09:56:10.487 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:56:10 localhost nova_compute[281952]: 2025-11-23 09:56:10.604 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:56:10 localhost nova_compute[281952]: 2025-11-23 09:56:10.604 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:56:10 localhost nova_compute[281952]: 2025-11-23 09:56:10.605 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] 
Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:56:10 localhost nova_compute[281952]: 2025-11-23 09:56:10.605 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:56:11 localhost nova_compute[281952]: 2025-11-23 09:56:11.007 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:56:11 localhost nova_compute[281952]: 2025-11-23 
09:56:11.021 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:56:11 localhost nova_compute[281952]: 2025-11-23 09:56:11.022 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:56:11 localhost nova_compute[281952]: 2025-11-23 09:56:11.023 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:56:11 localhost nova_compute[281952]: 2025-11-23 09:56:11.023 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:56:11 localhost nova_compute[281952]: 2025-11-23 09:56:11.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:56:11 localhost podman[240668]: time="2025-11-23T09:56:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:56:11 localhost podman[240668]: @ - - [23/Nov/2025:09:56:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 04:56:11 localhost podman[240668]: @ - - [23/Nov/2025:09:56:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18723 "" "Go-http-client/1.1" Nov 23 04:56:12 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - 
- - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.235 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.235 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.236 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.236 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:56:13 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:56:13 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1193741643' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.709 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.757 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.759 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.947 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.948 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11697MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.948 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.949 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.998 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.998 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:56:13 localhost nova_compute[281952]: 2025-11-23 09:56:13.998 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:56:14 localhost nova_compute[281952]: 2025-11-23 09:56:14.037 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:56:14 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:56:14 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/2435005534' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:56:14 localhost nova_compute[281952]: 2025-11-23 09:56:14.479 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:56:14 localhost nova_compute[281952]: 2025-11-23 09:56:14.486 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:56:14 localhost nova_compute[281952]: 2025-11-23 09:56:14.506 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:56:14 localhost nova_compute[281952]: 2025-11-23 09:56:14.508 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:56:14 localhost nova_compute[281952]: 2025-11-23 09:56:14.509 281956 DEBUG 
oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:56:15 localhost nova_compute[281952]: 2025-11-23 09:56:15.489 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:56:15 localhost nova_compute[281952]: 2025-11-23 09:56:15.508 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:56:17 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:56:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:56:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:56:18 localhost podman[309010]: 2025-11-23 09:56:18.021029182 +0000 UTC m=+0.073663240 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 04:56:18 localhost podman[309010]: 2025-11-23 09:56:18.037069692 +0000 UTC m=+0.089703750 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, container_name=multipathd, managed_by=edpm_ansible) Nov 23 04:56:18 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:56:18 localhost podman[309011]: 2025-11-23 09:56:18.131832315 +0000 UTC m=+0.182196824 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:56:18 localhost podman[309011]: 2025-11-23 09:56:18.144474171 +0000 UTC m=+0.194838750 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 04:56:18 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:56:20 localhost nova_compute[281952]: 2025-11-23 09:56:20.491 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:56:22 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:56:25 localhost nova_compute[281952]: 2025-11-23 09:56:25.493 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:56:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:56:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:56:28 localhost systemd[1]: tmp-crun.zRNKzg.mount: Deactivated successfully. 
Nov 23 04:56:28 localhost podman[309050]: 2025-11-23 09:56:28.021917583 +0000 UTC m=+0.083453769 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:56:28 localhost podman[309050]: 2025-11-23 09:56:28.030084962 +0000 UTC m=+0.091621158 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 23 04:56:28 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:56:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 04:56:30 localhost openstack_network_exporter[242668]: ERROR 09:56:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:56:30 localhost openstack_network_exporter[242668]: ERROR 09:56:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:56:30 localhost openstack_network_exporter[242668]: ERROR 09:56:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:56:30 localhost openstack_network_exporter[242668]: ERROR 09:56:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:56:30 localhost openstack_network_exporter[242668]: Nov 23 04:56:30 localhost openstack_network_exporter[242668]: ERROR 09:56:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:56:30 localhost openstack_network_exporter[242668]: Nov 23 04:56:30 localhost systemd[1]: tmp-crun.kq3e1B.mount: Deactivated successfully. 
Nov 23 04:56:30 localhost podman[309069]: 2025-11-23 09:56:30.060695176 +0000 UTC m=+0.117707045 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:56:30 localhost podman[309069]: 2025-11-23 09:56:30.097414177 +0000 UTC m=+0.154426026 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:56:30 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:56:30 localhost nova_compute[281952]: 2025-11-23 09:56:30.495 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:56:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:56:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:56:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:56:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:56:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:56:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:56:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:56:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:56:32 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:56:34 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:56:35 localhost nova_compute[281952]: 2025-11-23 09:56:35.497 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:56:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:56:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:56:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:56:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:56:40 localhost systemd[1]: tmp-crun.Bz0GOG.mount: Deactivated successfully. Nov 23 04:56:40 localhost podman[309233]: 2025-11-23 09:56:40.037307095 +0000 UTC m=+0.091459153 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 04:56:40 localhost podman[309234]: 2025-11-23 09:56:40.080367829 +0000 UTC m=+0.133492425 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:56:40 localhost podman[309235]: 2025-11-23 09:56:40.142494256 +0000 UTC m=+0.194004654 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, io.buildah.version=1.33.7, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9) Nov 23 04:56:40 localhost podman[309235]: 2025-11-23 09:56:40.159437003 +0000 UTC m=+0.210947401 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, config_id=edpm, name=ubi9-minimal) Nov 23 04:56:40 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:56:40 localhost podman[309234]: 2025-11-23 09:56:40.211100911 +0000 UTC m=+0.264225507 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent) Nov 23 04:56:40 localhost systemd[1]: 
9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:56:40 localhost podman[309233]: 2025-11-23 09:56:40.261608353 +0000 UTC m=+0.315760331 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 23 04:56:40 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:56:40 localhost nova_compute[281952]: 2025-11-23 09:56:40.499 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:56:41 localhost podman[240668]: time="2025-11-23T09:56:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:56:41 localhost podman[240668]: @ - - [23/Nov/2025:09:56:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 04:56:41 localhost podman[240668]: @ - - [23/Nov/2025:09:56:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18724 "" "Go-http-client/1.1" Nov 23 04:56:42 localhost ovn_metadata_agent[160434]: 2025-11-23 09:56:42.109 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:56:42 localhost nova_compute[281952]: 2025-11-23 09:56:42.110 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:56:42 localhost ovn_metadata_agent[160434]: 2025-11-23 09:56:42.111 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 04:56:42 localhost 
ovn_metadata_agent[160434]: 2025-11-23 09:56:42.112 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:56:42 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:56:45 localhost nova_compute[281952]: 2025-11-23 09:56:45.502 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:56:47 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e88 e88: 6 total, 6 up, 6 in Nov 23 04:56:47 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:56:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:56:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:56:49 localhost systemd[1]: tmp-crun.wLgEMb.mount: Deactivated successfully. 
Nov 23 04:56:49 localhost podman[309298]: 2025-11-23 09:56:49.08188603 +0000 UTC m=+0.136024753 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:56:49 localhost podman[309297]: 2025-11-23 09:56:49.041754725 +0000 UTC m=+0.099618772 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, 
config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 23 04:56:49 localhost podman[309297]: 2025-11-23 09:56:49.125290526 +0000 UTC m=+0.183154523 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:56:49 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:56:49 localhost podman[309298]: 2025-11-23 09:56:49.144645697 +0000 UTC m=+0.198784430 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 04:56:49 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:56:49 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 e89: 6 total, 6 up, 6 in Nov 23 04:56:50 localhost nova_compute[281952]: 2025-11-23 09:56:50.504 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:56:52 localhost sshd[309339]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:56:52 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:56:55 localhost nova_compute[281952]: 2025-11-23 09:56:55.509 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:56:55 localhost nova_compute[281952]: 2025-11-23 09:56:55.511 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:56:55 localhost nova_compute[281952]: 2025-11-23 09:56:55.511 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:56:55 localhost nova_compute[281952]: 2025-11-23 09:56:55.511 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:56:55 localhost nova_compute[281952]: 2025-11-23 09:56:55.531 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:56:55 localhost nova_compute[281952]: 2025-11-23 09:56:55.531 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:56:57 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:56:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:56:59 localhost podman[309341]: 2025-11-23 09:56:59.020632853 +0000 UTC m=+0.073171905 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:56:59 localhost podman[309341]: 2025-11-23 09:56:59.03428898 +0000 UTC m=+0.086828042 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute) 
Nov 23 04:56:59 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:56:59 localhost openstack_network_exporter[242668]: ERROR 09:56:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:56:59 localhost openstack_network_exporter[242668]: ERROR 09:56:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:56:59 localhost openstack_network_exporter[242668]: ERROR 09:56:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:56:59 localhost openstack_network_exporter[242668]: ERROR 09:56:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:56:59 localhost openstack_network_exporter[242668]: Nov 23 04:56:59 localhost openstack_network_exporter[242668]: ERROR 09:56:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:56:59 localhost openstack_network_exporter[242668]: Nov 23 04:57:00 localhost nova_compute[281952]: 2025-11-23 09:57:00.532 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:00 localhost nova_compute[281952]: 2025-11-23 09:57:00.535 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:00 localhost nova_compute[281952]: 2025-11-23 09:57:00.535 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:57:00 localhost nova_compute[281952]: 2025-11-23 09:57:00.536 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m 
Nov 23 04:57:00 localhost nova_compute[281952]: 2025-11-23 09:57:00.580 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:00 localhost nova_compute[281952]: 2025-11-23 09:57:00.581 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:57:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:57:01 localhost podman[309360]: 2025-11-23 09:57:01.019398024 +0000 UTC m=+0.077177798 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:57:01 localhost podman[309360]: 2025-11-23 09:57:01.027447169 +0000 UTC m=+0.085227023 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, 
container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:57:01 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:57:02 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:57:05 localhost nova_compute[281952]: 2025-11-23 09:57:05.582 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:05 localhost nova_compute[281952]: 2025-11-23 09:57:05.584 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:05 localhost nova_compute[281952]: 2025-11-23 09:57:05.584 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:57:05 localhost nova_compute[281952]: 2025-11-23 09:57:05.584 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:57:05 localhost nova_compute[281952]: 2025-11-23 
09:57:05.613 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:05 localhost nova_compute[281952]: 2025-11-23 09:57:05.614 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:57:07 localhost sshd[309383]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:57:07 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:57:08 localhost sshd[309385]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:57:09 localhost nova_compute[281952]: 2025-11-23 09:57:09.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:57:09 localhost nova_compute[281952]: 2025-11-23 09:57:09.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:57:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:09.297 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:57:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:09.297 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:57:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:09.298 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:57:10 localhost nova_compute[281952]: 2025-11-23 09:57:10.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:57:10 localhost nova_compute[281952]: 2025-11-23 09:57:10.213 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:57:10 localhost nova_compute[281952]: 2025-11-23 09:57:10.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:57:10 localhost nova_compute[281952]: 2025-11-23 09:57:10.614 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:10 localhost nova_compute[281952]: 2025-11-23 09:57:10.616 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:10 localhost nova_compute[281952]: 2025-11-23 09:57:10.617 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:57:10 localhost nova_compute[281952]: 2025-11-23 09:57:10.617 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:57:10 localhost nova_compute[281952]: 2025-11-23 09:57:10.626 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:57:10 localhost nova_compute[281952]: 2025-11-23 09:57:10.627 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:57:10 localhost nova_compute[281952]: 2025-11-23 09:57:10.627 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:57:10 localhost nova_compute[281952]: 2025-11-23 09:57:10.627 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:57:10 localhost nova_compute[281952]: 2025-11-23 09:57:10.647 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:10 localhost nova_compute[281952]: 2025-11-23 09:57:10.648 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE 
_transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.807 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.808 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.808 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.823 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.823 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78d63edc-1c58-4001-830c-f978fcd88d02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.808821', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c77d46a8-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11773.986474439, 'message_signature': '478447ddc229dde13972dc31cf545b502ca1742c4ef214562bc59ebab5209259'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 
'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.808821', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c77d54f4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11773.986474439, 'message_signature': '8a796f0b8c2490823cc24cbf2a5f9cbd6eac1380c1be37d20f15b2dabcfe8579'}]}, 'timestamp': '2025-11-23 09:57:10.824029', '_unique_id': 'd5bf6fa087a1454a94b107d09bc7531b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.825 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.852 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging 
[-] Could not send notification to notifications. Payload={'message_id': '6bf04b37-8c67-4263-9471-dd0f8267658c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:57:10.825988', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'c781caa2-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.030258566, 'message_signature': '5fc41ba593677bfa0a4f9300132e9c3ab566e460f240e4c67d6a7c637685d3de'}]}, 'timestamp': '2025-11-23 09:57:10.853411', '_unique_id': '629405b2cf724f2abb25f40888cae624'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:57:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.854 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.855 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.incoming.packets.error in the context of pollsters Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.859 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06b2c575-8499-4fe8-a143-f93255350343', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.855981', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 
'c782d9d8-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '9df1f0832b638b2d0818a84ec3aa386a735f8f9b572a822b1525a4ff5d3e0742'}]}, 'timestamp': '2025-11-23 09:57:10.860386', '_unique_id': 'c963bdceddaf47769c677e8253926aca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:57:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.861 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.863 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7b7a22a3-bb15-4987-83dc-11f582233ef3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.863341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c7884a1c-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '9fe84e68b0b3d056d64a5d9b366910cf3d1d17b6b03a7d830cc5dcfd20b26d9a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.863341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c7885ff2-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': 'e703e90dba7a22496910fbdfe2bdc94cd650ca548608e0dbdaca5f348879a57c'}]}, 'timestamp': '2025-11-23 09:57:10.896456', '_unique_id': 'e094cfeb5b014c069e44a806daf653ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.897 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.899 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.899 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '50d9880f-85d6-46af-a329-141b0915c56e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.899311', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c788e1f2-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '58a59fb6ea04ec031ae38657c2093f8667f46bf11d6302bce319a0534751093a'}]}, 'timestamp': '2025-11-23 09:57:10.899825', '_unique_id': 'fb3e77bb6f904d8694d10e1439c59437'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:57:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.900 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '465116ef-d445-4ece-95b5-b7e71c3d695e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.902997', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c78971d0-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '0b2f12f6ceda787aa02a51658d6118a282694d1dc8d1e2e5022218d2e2b5caf2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.902997', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c78982b0-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '4af8e016df5967d42b70dbcb3bcb5ec25dea58f58e90baa1cade008828db1198'}]}, 'timestamp': '2025-11-23 09:57:10.903874', '_unique_id': 'b2b35dc62ba34a37874ba177128d3cdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.906 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'cc46533b-b1c9-4390-842c-6601c0d1acc3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.906190', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c789ee12-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11773.986474439, 'message_signature': '7209c62dd00fee7384ed4294e4628a83b79cd150eda829e9c991d5ba3b237585'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.906190', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c789fe0c-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11773.986474439, 'message_signature': 'd5e542e81b0add35d04967d45e0e010ecffa5a72d270bb8e6ea6d40196c3f01a'}]}, 'timestamp': '2025-11-23 09:57:10.907064', '_unique_id': 'e76408ef1e4646cf9b4bfc82674f3f7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.909 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23
09:57:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3aca6e18-13da-4893-b385-76e5276051d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.909294', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c78a6af4-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '71d68b1f00d9f78bdc31144ef957cdd5156df74a85311e19821a84d3fc8cd365'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.909294', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c78a7c42-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': 'b91e091c3b0fe22aad96b34e4a56119079a21ae85f862bb21e76c35ef95bdac1'}]}, 'timestamp': '2025-11-23 09:57:10.910265', '_unique_id': '5faa8f88ea12415cbef0a14b966d8d06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.911 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfa3095f-02b4-4eae-a2e0-ea1594942321', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.912427', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c78ae164-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '716b0b29318cd6d3c155bd47a51d066a6738229b3b10627c83b1eb86c7d5c962'}]}, 'timestamp': '2025-11-23 09:57:10.912878', '_unique_id': '263bf88113544c61a6a5801e5abb5d6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.913 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2ef9b6a-bd7f-43e0-9b85-38f219580e47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.915152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c78b4bae-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '55c69475479b2b1f8157062a31bf12c96f06853e982b7f561c633ed915ec6dd2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.915152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c78b5cc0-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '5103aca7f11099788e3812a80e95826c71f9167fed48c101b05553c60b081fc0'}]}, 'timestamp': '2025-11-23 09:57:10.916038', '_unique_id': 'a7777385ea0a41f49d6bfc23d7c89dd2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.916 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc26168d-4dc0-4ef1-8598-a2107d28f5a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.918188', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c78bc2a0-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '380ff1fe396935985b3e2d9c310fff282c35185e7e3429791711c39974c6c9b1'}]}, 'timestamp': '2025-11-23 09:57:10.918644', '_unique_id': '8d8a1dba7d344296b159c5e8bfa74df5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877,
in _connection_factory Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:57:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.919 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a5b12f18-e3aa-4925-8020-7b32a1182a7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.921561', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c78c46da-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '352e0492c91fc0aa55ff15b50e55b51060445ec4fa11977d54a8f1ea55b1b873'}]}, 'timestamp': '2025-11-23 09:57:10.922070', '_unique_id': 'e1f2cd504dbd43a88f9f5e271e08d08e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:57:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.923 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd35e3634-b034-4603-a2e1-fd5bcb4e64fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.925061', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c78cd1c2-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11773.986474439, 'message_signature': '49df87751615c99b36ec08b5b045345c06a37eed20ec7e1aafe77dfe521707b1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.925061', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c78ce946-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11773.986474439, 'message_signature': '0ddb2e5cdca43862ed99c41645e30c463dd73dcd4c360029f2cd2bda91e13767'}]}, 'timestamp': '2025-11-23 09:57:10.926248', '_unique_id': '11d4bdaa20af4124b1e167cbf2e5fd1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.927 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.929 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.930 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3fde602e-894d-48d0-bc66-f14477135d97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.929404', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c78d7b22-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': 'febff36c543627473c9d9142647482e449005cb3fdfcfbfb4d39961e01b81094'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.929404', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c78d9472-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '7b8aa71da2b8e875865329d4f48f9145b0523500e7d7d3ede2674808a69ba7cd'}]}, 'timestamp': '2025-11-23 09:57:10.930621', '_unique_id': 'd852f3572b754bb7a78dded20966081e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.931 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.933 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.933 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 15320000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'afd9dc55-7579-48fa-97c5-5d668e0fa794', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15320000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:57:10.933653', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'c78e227a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.030258566, 'message_signature': '2faeb5c5b1dbc380e718da489722546adfca69de7e4dca56247db747e2262a48'}]}, 'timestamp': '2025-11-23 09:57:10.934284', '_unique_id': 'c6c15e0edf0440b784591de26760661c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 
04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.935 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.937 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 04:57:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.937 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6e3f2a8-0a1c-4c96-9316-89f0840523a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.937287', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 
'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c78eb140-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '83b7371bb87d4d35e538010f44e156244190a3591eddf40f9cd1ee9018dcb8d0'}]}, 'timestamp': '2025-11-23 09:57:10.938030', '_unique_id': '799ee58a731049c2a450220ceeb9ac0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging 
self._connection = self._establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.939 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.940 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.940 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.940 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f202f32e-74b8-4d8d-b4ae-b9aa32534262', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.940724', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c78f352a-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': 'd08e78602ed722288b53f40b7cd23e74a71f3dcf12ae395d5489e5c65dec165c'}]}, 'timestamp': '2025-11-23 09:57:10.941257', '_unique_id': 'edcad9c6d26a4b1aaeaf9c378c878926'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:57:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.942 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.943 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.943 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:57:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00cc8a95-27d4-4968-90e9-a77518574d99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.943512', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c78fa3e8-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '82feda838720645bf9419ffd7ac450cf6cbed16797b98d3c469fb80fa854cab8'}]}, 'timestamp': '2025-11-23 09:57:10.944150', '_unique_id': 'b1f06faf3f014047bdc10d445abef5ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.945 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.946 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.946 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5035b8bc-234b-442e-b005-7c67697f93e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.946881', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c79027c8-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': 'fd396743e3f7156405109fb499216b793852b8b2ed94d1e6779d6c0efddcb9c1'}]}, 'timestamp': '2025-11-23 09:57:10.947475', '_unique_id': '38b9643ca2834623921487fdde980363'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.948 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.949 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.949 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '572107a5-0cb5-4d8f-95ea-82c8d93d4e47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:57:10.949673', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'c790915e-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.033640209, 'message_signature': '3e4212aa9f11cee3c63b3b537aff35a535547be38b3f12d08935fc15c3a354ab'}]}, 'timestamp': '2025-11-23 09:57:10.950155', '_unique_id': '3380ed6a4f1649f783a80847ef9b03a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.951 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.952 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.952 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.952 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.952 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ad56411-1f6b-4a34-bea8-354e24daa917', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:57:10.952473', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c790fd60-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': 'd7b4d62f11b0731c573fc8d33cabd8a8e2371bb2d812a33d58f1d4885b57e4f9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:57:10.952473', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c7910e18-c852-11f0-bde4-fa163e72a351', 'monotonic_time': 11774.040994394, 'message_signature': '9ee4a3df95ede19cede8bdb042951ef4f76b7adebc0c3fdd9abe621a63da26cf'}]}, 'timestamp': '2025-11-23 09:57:10.953317', '_unique_id': '9cd8f808b56743faaf97eaaefb92ce23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:57:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:57:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:57:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:57:10.954 12 ERROR oslo_messaging.notify.messaging Nov 23 04:57:11 localhost podman[309388]: 2025-11-23 09:57:11.037854789 +0000 UTC m=+0.090402821 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 04:57:11 localhost podman[309387]: 2025-11-23 09:57:11.071263238 +0000 UTC m=+0.122916173 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 23 04:57:11 localhost nova_compute[281952]: 2025-11-23 09:57:11.098 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:57:11 localhost podman[309387]: 2025-11-23 09:57:11.107320839 +0000 UTC m=+0.158973844 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller) Nov 23 04:57:11 localhost nova_compute[281952]: 2025-11-23 09:57:11.113 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:57:11 localhost nova_compute[281952]: 2025-11-23 09:57:11.114 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - 
- - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:57:11 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:57:11 localhost podman[309388]: 2025-11-23 09:57:11.122294136 +0000 UTC m=+0.174842158 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:57:11 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:57:11 localhost podman[309389]: 2025-11-23 09:57:11.194017766 +0000 UTC m=+0.236252734 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350) Nov 23 04:57:11 localhost podman[309389]: 2025-11-23 09:57:11.235316466 +0000 UTC m=+0.277551434 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Nov 23 04:57:11 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:57:11 localhost podman[240668]: time="2025-11-23T09:57:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:57:11 localhost podman[240668]: @ - - [23/Nov/2025:09:57:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 04:57:11 localhost podman[240668]: @ - - [23/Nov/2025:09:57:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18718 "" "Go-http-client/1.1" Nov 23 04:57:12 localhost nova_compute[281952]: 2025-11-23 09:57:12.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:57:12 localhost nova_compute[281952]: 2025-11-23 09:57:12.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:57:12 localhost nova_compute[281952]: 2025-11-23 09:57:12.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:57:12 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:57:14 localhost nova_compute[281952]: 2025-11-23 09:57:14.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:57:14 localhost ovn_controller[154788]: 2025-11-23T09:57:14Z|00070|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.233 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.233 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.234 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.234 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.649 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.677 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.678 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5029 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:57:15 localhost nova_compute[281952]: 
2025-11-23 09:57:15.678 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.679 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.682 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:15 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:57:15 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2601443712' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.744 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.800 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:57:15 localhost nova_compute[281952]: 2025-11-23 09:57:15.801 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m 
Nov 23 04:57:16 localhost nova_compute[281952]: 2025-11-23 09:57:16.020 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:57:16 localhost nova_compute[281952]: 2025-11-23 09:57:16.022 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11681MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:57:16 localhost nova_compute[281952]: 2025-11-23 09:57:16.022 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:57:16 localhost nova_compute[281952]: 2025-11-23 09:57:16.023 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:57:16 localhost nova_compute[281952]: 2025-11-23 09:57:16.103 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:57:16 localhost nova_compute[281952]: 2025-11-23 09:57:16.104 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:57:16 localhost nova_compute[281952]: 2025-11-23 09:57:16.104 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:57:16 localhost nova_compute[281952]: 2025-11-23 09:57:16.152 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:57:16 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:57:16 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/349663793' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:57:16 localhost nova_compute[281952]: 2025-11-23 09:57:16.613 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:57:16 localhost nova_compute[281952]: 2025-11-23 09:57:16.620 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:57:16 localhost nova_compute[281952]: 2025-11-23 09:57:16.633 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:57:16 localhost nova_compute[281952]: 2025-11-23 09:57:16.636 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:57:16 localhost nova_compute[281952]: 2025-11-23 09:57:16.636 281956 DEBUG 
oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:57:17 localhost nova_compute[281952]: 2025-11-23 09:57:17.637 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:57:17 localhost nova_compute[281952]: 2025-11-23 09:57:17.657 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:57:17 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:57:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:57:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:57:20 localhost podman[309491]: 2025-11-23 09:57:20.030745915 +0000 UTC m=+0.083744188 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 04:57:20 localhost podman[309491]: 2025-11-23 09:57:20.046532967 +0000 UTC m=+0.099531270 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 23 04:57:20 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:57:20 localhost podman[309492]: 2025-11-23 09:57:20.134137832 +0000 UTC m=+0.185072171 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:57:20 localhost podman[309492]: 2025-11-23 09:57:20.146655563 +0000 UTC m=+0.197589932 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:57:20 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:57:20 localhost nova_compute[281952]: 2025-11-23 09:57:20.683 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:20 localhost nova_compute[281952]: 2025-11-23 09:57:20.685 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:20 localhost nova_compute[281952]: 2025-11-23 09:57:20.686 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:57:20 localhost nova_compute[281952]: 2025-11-23 09:57:20.686 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:57:20 localhost nova_compute[281952]: 2025-11-23 09:57:20.720 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:20 localhost nova_compute[281952]: 2025-11-23 09:57:20.720 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:57:22 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:57:25 localhost nova_compute[281952]: 2025-11-23 09:57:25.721 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:25 localhost nova_compute[281952]: 2025-11-23 09:57:25.723 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:25 localhost nova_compute[281952]: 2025-11-23 
09:57:25.723 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:57:25 localhost nova_compute[281952]: 2025-11-23 09:57:25.723 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:57:25 localhost nova_compute[281952]: 2025-11-23 09:57:25.741 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:25 localhost nova_compute[281952]: 2025-11-23 09:57:25.742 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:57:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:57:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 04:57:29 localhost openstack_network_exporter[242668]: ERROR 09:57:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:57:29 localhost openstack_network_exporter[242668]: ERROR 09:57:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:57:29 localhost openstack_network_exporter[242668]: ERROR 09:57:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:57:29 localhost openstack_network_exporter[242668]: ERROR 09:57:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:57:29 localhost openstack_network_exporter[242668]: Nov 23 04:57:29 localhost openstack_network_exporter[242668]: ERROR 09:57:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:57:29 localhost openstack_network_exporter[242668]: Nov 23 04:57:30 localhost podman[309533]: 2025-11-23 09:57:30.03900065 +0000 UTC m=+0.091157354 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 
'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 04:57:30 localhost podman[309533]: 2025-11-23 09:57:30.047777858 +0000 UTC m=+0.099934562 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:57:30 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:57:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:30.262 263258 INFO oslo.privsep.daemon [None req-beecc1a6-fe5b-4fe1-bf5d-5910019c6ec4 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmph68f1hsn/privsep.sock']#033[00m Nov 23 04:57:30 localhost nova_compute[281952]: 2025-11-23 09:57:30.743 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:30 localhost nova_compute[281952]: 2025-11-23 09:57:30.768 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:30 localhost nova_compute[281952]: 2025-11-23 09:57:30.769 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5026 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:57:30 localhost nova_compute[281952]: 
2025-11-23 09:57:30.769 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:57:30 localhost nova_compute[281952]: 2025-11-23 09:57:30.770 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:30 localhost nova_compute[281952]: 2025-11-23 09:57:30.771 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:57:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:30.929 263258 INFO oslo.privsep.daemon [None req-beecc1a6-fe5b-4fe1-bf5d-5910019c6ec4 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 23 04:57:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:30.814 309556 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 23 04:57:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:30.819 309556 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 23 04:57:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:30.822 309556 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Nov 23 04:57:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:30.823 309556 INFO oslo.privsep.daemon [-] privsep daemon running as pid 309556#033[00m Nov 23 04:57:31 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:31.483 263258 INFO oslo.privsep.daemon [None req-beecc1a6-fe5b-4fe1-bf5d-5910019c6ec4 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', 
'/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpymplsf7c/privsep.sock']#033[00m Nov 23 04:57:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:57:32 localhost systemd[1]: tmp-crun.4Z4zoy.mount: Deactivated successfully. Nov 23 04:57:32 localhost podman[309565]: 2025-11-23 09:57:32.098065432 +0000 UTC m=+0.153515848 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:57:32 localhost podman[309565]: 2025-11-23 09:57:32.13634492 +0000 UTC m=+0.191795346 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:57:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:32.145 263258 INFO oslo.privsep.daemon [None req-beecc1a6-fe5b-4fe1-bf5d-5910019c6ec4 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 23 04:57:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:32.034 309593 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 23 04:57:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:32.039 309593 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 23 04:57:32 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. 
Nov 23 04:57:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:32.044 309593 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Nov 23 04:57:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:32.044 309593 INFO oslo.privsep.daemon [-] privsep daemon running as pid 309593#033[00m Nov 23 04:57:32 localhost nova_compute[281952]: 2025-11-23 09:57:32.455 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:32 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:57:33 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:33.103 263258 INFO oslo.privsep.daemon [None req-beecc1a6-fe5b-4fe1-bf5d-5910019c6ec4 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpafrwc5cs/privsep.sock']#033[00m Nov 23 04:57:33 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:57:33 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:57:33 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:33.725 263258 INFO oslo.privsep.daemon [None req-beecc1a6-fe5b-4fe1-bf5d-5910019c6ec4 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 23 04:57:33 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:33.624 309685 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 23 04:57:33 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:33.629 309685 INFO 
oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 23 04:57:33 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:33.632 309685 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Nov 23 04:57:33 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:33.632 309685 INFO oslo.privsep.daemon [-] privsep daemon running as pid 309685#033[00m Nov 23 04:57:34 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:57:34 localhost nova_compute[281952]: 2025-11-23 09:57:34.687 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:35.044 263258 INFO neutron.agent.linux.ip_lib [None req-beecc1a6-fe5b-4fe1-bf5d-5910019c6ec4 - - - - - -] Device tap73b68089-2f cannot be used as it has no MAC address#033[00m Nov 23 04:57:35 localhost nova_compute[281952]: 2025-11-23 09:57:35.116 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:35 localhost kernel: device tap73b68089-2f entered promiscuous mode Nov 23 04:57:35 localhost NetworkManager[5975]: [1763891855.1292] manager: (tap73b68089-2f): new Generic device (/org/freedesktop/NetworkManager/Devices/18) Nov 23 04:57:35 localhost ovn_controller[154788]: 2025-11-23T09:57:35Z|00071|binding|INFO|Claiming lport 73b68089-2f56-49cd-9bd3-002979f43843 for this chassis. 
Nov 23 04:57:35 localhost ovn_controller[154788]: 2025-11-23T09:57:35Z|00072|binding|INFO|73b68089-2f56-49cd-9bd3-002979f43843: Claiming unknown Nov 23 04:57:35 localhost nova_compute[281952]: 2025-11-23 09:57:35.130 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:35 localhost systemd-udevd[309700]: Network interface NamePolicy= disabled on kernel command line. Nov 23 04:57:35 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:35.152 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-7c71c90d-a08f-42e1-bb2f-ef1175c4042b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7c71c90d-a08f-42e1-bb2f-ef1175c4042b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab89783c9d39468096f7d3a0c6bf4d3e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13bd6782-9124-462d-b4a7-d10c24537f9d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=73b68089-2f56-49cd-9bd3-002979f43843) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:57:35 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:35.155 160439 INFO 
neutron.agent.ovn.metadata.agent [-] Port 73b68089-2f56-49cd-9bd3-002979f43843 in datapath 7c71c90d-a08f-42e1-bb2f-ef1175c4042b bound to our chassis#033[00m Nov 23 04:57:35 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:35.159 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port c84e6e46-efbc-4de0-9598-13a85365118b IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 04:57:35 localhost journal[230249]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, ) Nov 23 04:57:35 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:35.159 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7c71c90d-a08f-42e1-bb2f-ef1175c4042b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 04:57:35 localhost journal[230249]: hostname: np0005532585.localdomain Nov 23 04:57:35 localhost journal[230249]: ethtool ioctl error on tap73b68089-2f: No such device Nov 23 04:57:35 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:35.162 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f07e157f-b07a-4e3f-981e-1c3c3f4c89ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:57:35 localhost journal[230249]: ethtool ioctl error on tap73b68089-2f: No such device Nov 23 04:57:35 localhost journal[230249]: ethtool ioctl error on tap73b68089-2f: No such device Nov 23 04:57:35 localhost ovn_controller[154788]: 2025-11-23T09:57:35Z|00073|binding|INFO|Setting lport 73b68089-2f56-49cd-9bd3-002979f43843 ovn-installed in OVS Nov 23 04:57:35 localhost ovn_controller[154788]: 2025-11-23T09:57:35Z|00074|binding|INFO|Setting lport 73b68089-2f56-49cd-9bd3-002979f43843 up in Southbound Nov 23 04:57:35 localhost nova_compute[281952]: 2025-11-23 09:57:35.170 
281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:35 localhost journal[230249]: ethtool ioctl error on tap73b68089-2f: No such device Nov 23 04:57:35 localhost journal[230249]: ethtool ioctl error on tap73b68089-2f: No such device Nov 23 04:57:35 localhost journal[230249]: ethtool ioctl error on tap73b68089-2f: No such device Nov 23 04:57:35 localhost journal[230249]: ethtool ioctl error on tap73b68089-2f: No such device Nov 23 04:57:35 localhost journal[230249]: ethtool ioctl error on tap73b68089-2f: No such device Nov 23 04:57:35 localhost nova_compute[281952]: 2025-11-23 09:57:35.237 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:35 localhost nova_compute[281952]: 2025-11-23 09:57:35.263 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:35 localhost nova_compute[281952]: 2025-11-23 09:57:35.770 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:35 localhost nova_compute[281952]: 2025-11-23 09:57:35.772 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:36 localhost podman[309772]: Nov 23 04:57:36 localhost podman[309772]: 2025-11-23 09:57:36.12725213 +0000 UTC m=+0.088870135 container create 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118) Nov 23 04:57:36 localhost systemd[1]: Started libpod-conmon-74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101.scope. Nov 23 04:57:36 localhost podman[309772]: 2025-11-23 09:57:36.084471454 +0000 UTC m=+0.046089479 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 04:57:36 localhost systemd[1]: Started libcrun container. Nov 23 04:57:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edacf18bbfc488c9c82ec3cf149606e2b511b3a1b82fab7a350e4145bfcc787/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:57:36 localhost podman[309772]: 2025-11-23 09:57:36.203153577 +0000 UTC m=+0.164771572 container init 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118) Nov 23 04:57:36 localhost podman[309772]: 2025-11-23 09:57:36.214229235 +0000 UTC m=+0.175847230 container start 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:57:36 localhost dnsmasq[309791]: started, version 2.85 cachesize 150 Nov 23 04:57:36 localhost dnsmasq[309791]: DNS service limited to local subnets Nov 23 04:57:36 localhost dnsmasq[309791]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 04:57:36 localhost dnsmasq[309791]: warning: no upstream servers configured Nov 23 04:57:36 localhost dnsmasq-dhcp[309791]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 04:57:36 localhost dnsmasq[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/addn_hosts - 0 addresses Nov 23 04:57:36 localhost dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/host Nov 23 04:57:36 localhost dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/opts Nov 23 04:57:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:36.275 263258 INFO neutron.agent.dhcp.agent [None req-b79e4a4d-e53c-4c59-a4c3-1778eb6e2957 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:57:33Z, description=, device_id=af12546c-762e-460a-92cf-0a6f5f6b8733, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f3dc5cf7-1676-4ccc-8bb7-dd7188c88bd5, ip_allocation=immediate, mac_address=fa:16:3e:45:fe:c5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:57:28Z, description=, dns_domain=, id=7c71c90d-a08f-42e1-bb2f-ef1175c4042b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, 
mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-1806334605-network, port_security_enabled=True, project_id=ab89783c9d39468096f7d3a0c6bf4d3e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59428, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=222, status=ACTIVE, subnets=['12d58085-977d-4709-b642-b2646c597567'], tags=[], tenant_id=ab89783c9d39468096f7d3a0c6bf4d3e, updated_at=2025-11-23T09:57:29Z, vlan_transparent=None, network_id=7c71c90d-a08f-42e1-bb2f-ef1175c4042b, port_security_enabled=False, project_id=ab89783c9d39468096f7d3a0c6bf4d3e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=269, status=DOWN, tags=[], tenant_id=ab89783c9d39468096f7d3a0c6bf4d3e, updated_at=2025-11-23T09:57:33Z on network 7c71c90d-a08f-42e1-bb2f-ef1175c4042b#033[00m Nov 23 04:57:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:36.380 263258 INFO neutron.agent.dhcp.agent [None req-dea51365-5277-467a-a0e4-8f2cff221a55 - - - - - -] DHCP configuration for ports {'351c24ed-9c6f-4f66-8dd6-3859a3d2d148'} is completed#033[00m Nov 23 04:57:36 localhost dnsmasq[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/addn_hosts - 1 addresses Nov 23 04:57:36 localhost dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/host Nov 23 04:57:36 localhost dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/opts Nov 23 04:57:36 localhost podman[309809]: 2025-11-23 09:57:36.55931361 +0000 UTC m=+0.060553169 container kill 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true) Nov 23 04:57:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:36.686 263258 INFO neutron.agent.dhcp.agent [None req-0d7905f9-ef19-4367-b6f0-737b6233be16 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:57:33Z, description=, device_id=af12546c-762e-460a-92cf-0a6f5f6b8733, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f3dc5cf7-1676-4ccc-8bb7-dd7188c88bd5, ip_allocation=immediate, mac_address=fa:16:3e:45:fe:c5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:57:28Z, description=, dns_domain=, id=7c71c90d-a08f-42e1-bb2f-ef1175c4042b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-1806334605-network, port_security_enabled=True, project_id=ab89783c9d39468096f7d3a0c6bf4d3e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59428, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=222, status=ACTIVE, subnets=['12d58085-977d-4709-b642-b2646c597567'], tags=[], tenant_id=ab89783c9d39468096f7d3a0c6bf4d3e, updated_at=2025-11-23T09:57:29Z, vlan_transparent=None, network_id=7c71c90d-a08f-42e1-bb2f-ef1175c4042b, port_security_enabled=False, project_id=ab89783c9d39468096f7d3a0c6bf4d3e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=269, status=DOWN, tags=[], tenant_id=ab89783c9d39468096f7d3a0c6bf4d3e, 
updated_at=2025-11-23T09:57:33Z on network 7c71c90d-a08f-42e1-bb2f-ef1175c4042b#033[00m Nov 23 04:57:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:36.831 263258 INFO neutron.agent.dhcp.agent [None req-181149f1-12c6-4e81-9849-e6d22dbcf9be - - - - - -] DHCP configuration for ports {'f3dc5cf7-1676-4ccc-8bb7-dd7188c88bd5'} is completed#033[00m Nov 23 04:57:36 localhost dnsmasq[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/addn_hosts - 1 addresses Nov 23 04:57:36 localhost dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/host Nov 23 04:57:36 localhost dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/opts Nov 23 04:57:36 localhost podman[309846]: 2025-11-23 09:57:36.899024181 +0000 UTC m=+0.058800726 container kill 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:57:37 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:37.354 263258 INFO neutron.agent.dhcp.agent [None req-5136d212-a09e-49a4-80b5-c64eee44d855 - - - - - -] DHCP configuration for ports {'f3dc5cf7-1676-4ccc-8bb7-dd7188c88bd5'} is completed#033[00m Nov 23 04:57:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:57:38 localhost nova_compute[281952]: 2025-11-23 09:57:38.993 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:40 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:40.470 263258 INFO neutron.agent.linux.ip_lib [None req-30fdac74-ec64-4a2d-8e04-cdf0bde538d3 - - - - - -] Device tap55b3fe2f-21 cannot be used as it has no MAC address#033[00m Nov 23 04:57:40 localhost nova_compute[281952]: 2025-11-23 09:57:40.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:40 localhost kernel: device tap55b3fe2f-21 entered promiscuous mode Nov 23 04:57:40 localhost nova_compute[281952]: 2025-11-23 09:57:40.501 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:40 localhost NetworkManager[5975]: [1763891860.5022] manager: (tap55b3fe2f-21): new Generic device (/org/freedesktop/NetworkManager/Devices/19) Nov 23 04:57:40 localhost systemd-udevd[309878]: Network interface NamePolicy= disabled on kernel command line. Nov 23 04:57:40 localhost ovn_controller[154788]: 2025-11-23T09:57:40Z|00075|binding|INFO|Claiming lport 55b3fe2f-21ef-4379-99f5-112f0dfef914 for this chassis. 
Nov 23 04:57:40 localhost ovn_controller[154788]: 2025-11-23T09:57:40Z|00076|binding|INFO|55b3fe2f-21ef-4379-99f5-112f0dfef914: Claiming unknown Nov 23 04:57:40 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:40.514 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a6dad5f4ea934cb2b33cba44987053be', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9d6fde6-b084-4385-8650-ae0140b56791, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=55b3fe2f-21ef-4379-99f5-112f0dfef914) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:57:40 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:40.516 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 55b3fe2f-21ef-4379-99f5-112f0dfef914 in datapath 81b87fa4-03b0-4cc9-9d5d-0d532d38f99d bound to our chassis#033[00m Nov 23 04:57:40 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:40.517 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
81b87fa4-03b0-4cc9-9d5d-0d532d38f99d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 04:57:40 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:40.519 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e27a7ceb-2cfc-4704-bc41-916ab5c4d36e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:57:40 localhost journal[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device Nov 23 04:57:40 localhost nova_compute[281952]: 2025-11-23 09:57:40.526 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:40 localhost journal[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device Nov 23 04:57:40 localhost ovn_controller[154788]: 2025-11-23T09:57:40Z|00077|binding|INFO|Setting lport 55b3fe2f-21ef-4379-99f5-112f0dfef914 ovn-installed in OVS Nov 23 04:57:40 localhost ovn_controller[154788]: 2025-11-23T09:57:40Z|00078|binding|INFO|Setting lport 55b3fe2f-21ef-4379-99f5-112f0dfef914 up in Southbound Nov 23 04:57:40 localhost nova_compute[281952]: 2025-11-23 09:57:40.533 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:40 localhost journal[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device Nov 23 04:57:40 localhost journal[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device Nov 23 04:57:40 localhost journal[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device Nov 23 04:57:40 localhost journal[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device Nov 23 04:57:40 localhost journal[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device Nov 23 04:57:40 localhost journal[230249]: ethtool ioctl error on tap55b3fe2f-21: No such device 
Nov 23 04:57:40 localhost nova_compute[281952]: 2025-11-23 09:57:40.562 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:40 localhost nova_compute[281952]: 2025-11-23 09:57:40.588 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:40 localhost nova_compute[281952]: 2025-11-23 09:57:40.773 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:40 localhost nova_compute[281952]: 2025-11-23 09:57:40.776 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:57:40 localhost systemd[1]: tmp-crun.d70Jkr.mount: Deactivated successfully. Nov 23 04:57:40 localhost dnsmasq[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/addn_hosts - 0 addresses Nov 23 04:57:40 localhost dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/host Nov 23 04:57:40 localhost dnsmasq-dhcp[309791]: read /var/lib/neutron/dhcp/7c71c90d-a08f-42e1-bb2f-ef1175c4042b/opts Nov 23 04:57:40 localhost podman[309930]: 2025-11-23 09:57:40.818658196 +0000 UTC m=+0.068877664 container kill 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 04:57:41 localhost 
ovn_controller[154788]: 2025-11-23T09:57:41Z|00079|binding|INFO|Releasing lport 73b68089-2f56-49cd-9bd3-002979f43843 from this chassis (sb_readonly=0) Nov 23 04:57:41 localhost ovn_controller[154788]: 2025-11-23T09:57:41Z|00080|binding|INFO|Setting lport 73b68089-2f56-49cd-9bd3-002979f43843 down in Southbound Nov 23 04:57:41 localhost nova_compute[281952]: 2025-11-23 09:57:41.031 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:41 localhost kernel: device tap73b68089-2f left promiscuous mode Nov 23 04:57:41 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:41.044 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-7c71c90d-a08f-42e1-bb2f-ef1175c4042b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7c71c90d-a08f-42e1-bb2f-ef1175c4042b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab89783c9d39468096f7d3a0c6bf4d3e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=13bd6782-9124-462d-b4a7-d10c24537f9d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=73b68089-2f56-49cd-9bd3-002979f43843) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:57:41 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:41.046 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 73b68089-2f56-49cd-9bd3-002979f43843 in datapath 7c71c90d-a08f-42e1-bb2f-ef1175c4042b unbound from our chassis#033[00m Nov 23 04:57:41 localhost nova_compute[281952]: 2025-11-23 09:57:41.048 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:41 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:41.049 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7c71c90d-a08f-42e1-bb2f-ef1175c4042b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 04:57:41 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:41.049 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4ba680c6-ad2a-42bb-acbf-82c6fba0c1c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:57:41 localhost podman[309991]: Nov 23 04:57:41 localhost podman[309991]: 2025-11-23 09:57:41.456043935 +0000 UTC m=+0.091721562 container create 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:57:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. 
Nov 23 04:57:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:57:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:57:41 localhost systemd[1]: Started libpod-conmon-6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296.scope. Nov 23 04:57:41 localhost podman[309991]: 2025-11-23 09:57:41.413365592 +0000 UTC m=+0.049043239 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 04:57:41 localhost systemd[1]: Started libcrun container. Nov 23 04:57:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a72aa34fa3703c0feb66b78aa70334c94e6bb60891737ec5625b5eed8d5bdbf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:57:41 localhost podman[309991]: 2025-11-23 09:57:41.535547742 +0000 UTC m=+0.171225389 container init 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:57:41 localhost podman[309991]: 2025-11-23 09:57:41.550127717 +0000 UTC m=+0.185805334 container start 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:57:41 localhost dnsmasq[310035]: started, version 2.85 cachesize 150 Nov 23 04:57:41 localhost dnsmasq[310035]: DNS service limited to local subnets Nov 23 04:57:41 localhost dnsmasq[310035]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 04:57:41 localhost dnsmasq[310035]: warning: no upstream servers configured Nov 23 04:57:41 localhost dnsmasq-dhcp[310035]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 04:57:41 localhost dnsmasq[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/addn_hosts - 0 addresses Nov 23 04:57:41 localhost dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/host Nov 23 04:57:41 localhost dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/opts Nov 23 04:57:41 localhost podman[310005]: 2025-11-23 09:57:41.589875911 +0000 UTC m=+0.086098870 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_controller) Nov 23 04:57:41 localhost podman[310006]: 2025-11-23 09:57:41.603334232 +0000 UTC m=+0.096375834 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Nov 23 04:57:41 localhost podman[310006]: 2025-11-23 09:57:41.63573223 +0000 UTC m=+0.128773842 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS) Nov 23 04:57:41 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:57:41 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:41.670 263258 INFO neutron.agent.dhcp.agent [None req-d8ff0c38-fe32-49e6-9103-fcca295fbcf4 - - - - - -] DHCP configuration for ports {'3b646975-045a-40d5-9ebc-93b350fd7125'} is completed#033[00m Nov 23 04:57:41 localhost podman[310005]: 2025-11-23 09:57:41.671851983 +0000 UTC m=+0.168074962 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 04:57:41 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:57:41 localhost podman[310007]: 2025-11-23 09:57:41.676356381 +0000 UTC m=+0.166976149 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 23 04:57:41 localhost podman[310007]: 2025-11-23 09:57:41.759681435 +0000 UTC m=+0.250301243 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter) Nov 23 04:57:41 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:57:41 localhost podman[240668]: time="2025-11-23T09:57:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:57:41 localhost podman[240668]: @ - - [23/Nov/2025:09:57:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157510 "" "Go-http-client/1.1" Nov 23 04:57:41 localhost podman[240668]: @ - - [23/Nov/2025:09:57:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19691 "" "Go-http-client/1.1" Nov 23 04:57:42 localhost systemd[1]: tmp-crun.FHqc4H.mount: Deactivated successfully. Nov 23 04:57:42 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:42.773 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:57:42 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:42.775 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 04:57:42 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:42.776 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:57:42 localhost nova_compute[281952]: 2025-11-23 09:57:42.812 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:42 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:57:43 localhost ovn_controller[154788]: 2025-11-23T09:57:43Z|00081|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:57:43 localhost nova_compute[281952]: 2025-11-23 09:57:43.673 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:44 localhost dnsmasq[309791]: exiting on receipt of SIGTERM Nov 23 04:57:44 localhost podman[310088]: 2025-11-23 09:57:44.202520933 +0000 UTC m=+0.061353044 container kill 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 23 04:57:44 localhost systemd[1]: libpod-74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101.scope: Deactivated successfully. 
Nov 23 04:57:44 localhost podman[310102]: 2025-11-23 09:57:44.271927361 +0000 UTC m=+0.057692392 container died 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:57:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101-userdata-shm.mount: Deactivated successfully. Nov 23 04:57:44 localhost podman[310102]: 2025-11-23 09:57:44.307125846 +0000 UTC m=+0.092890807 container cleanup 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 04:57:44 localhost systemd[1]: libpod-conmon-74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101.scope: Deactivated successfully. 
Nov 23 04:57:44 localhost podman[310109]: 2025-11-23 09:57:44.363547358 +0000 UTC m=+0.134814466 container remove 74b1cf6af1b0fa7be7e853d1d48eec2e2e9138da7f26b7f07b338339b4103101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7c71c90d-a08f-42e1-bb2f-ef1175c4042b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 23 04:57:44 localhost nova_compute[281952]: 2025-11-23 09:57:44.405 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:44 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:44.442 263258 INFO neutron.agent.dhcp.agent [None req-12a9e5c0-6ff2-4ada-a33a-fd5958a68959 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:57:44 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:44.443 263258 INFO neutron.agent.dhcp.agent [None req-12a9e5c0-6ff2-4ada-a33a-fd5958a68959 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:57:45 localhost systemd[1]: var-lib-containers-storage-overlay-8edacf18bbfc488c9c82ec3cf149606e2b511b3a1b82fab7a350e4145bfcc787-merged.mount: Deactivated successfully. Nov 23 04:57:45 localhost systemd[1]: run-netns-qdhcp\x2d7c71c90d\x2da08f\x2d42e1\x2dbb2f\x2def1175c4042b.mount: Deactivated successfully. 
Nov 23 04:57:45 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:45.714 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:57:45Z, description=, device_id=0f8464a3-f81d-4757-a6d6-f5dc7dd25ed6, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=cc4bfc60-3e56-4b71-a76c-62f3aabf67a4, ip_allocation=immediate, mac_address=fa:16:3e:c7:62:68, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:57:37Z, description=, dns_domain=, id=81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-328332667-network, port_security_enabled=True, project_id=a6dad5f4ea934cb2b33cba44987053be, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=33842, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=312, status=ACTIVE, subnets=['f8967804-12ce-482f-8265-4d6194cb2186'], tags=[], tenant_id=a6dad5f4ea934cb2b33cba44987053be, updated_at=2025-11-23T09:57:39Z, vlan_transparent=None, network_id=81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, port_security_enabled=False, project_id=a6dad5f4ea934cb2b33cba44987053be, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=352, status=DOWN, tags=[], tenant_id=a6dad5f4ea934cb2b33cba44987053be, updated_at=2025-11-23T09:57:45Z on network 81b87fa4-03b0-4cc9-9d5d-0d532d38f99d#033[00m Nov 23 04:57:45 localhost nova_compute[281952]: 2025-11-23 09:57:45.815 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Nov 23 04:57:45 localhost podman[310149]: 2025-11-23 09:57:45.923998468 +0000 UTC m=+0.060038904 container kill 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 23 04:57:45 localhost dnsmasq[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/addn_hosts - 1 addresses Nov 23 04:57:45 localhost dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/host Nov 23 04:57:45 localhost dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/opts Nov 23 04:57:46 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:46.146 263258 INFO neutron.agent.dhcp.agent [None req-c8a9f48c-e4cc-4380-8d0a-cc660448b4ce - - - - - -] DHCP configuration for ports {'cc4bfc60-3e56-4b71-a76c-62f3aabf67a4'} is completed#033[00m Nov 23 04:57:46 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:46.690 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:57:45Z, description=, device_id=0f8464a3-f81d-4757-a6d6-f5dc7dd25ed6, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=cc4bfc60-3e56-4b71-a76c-62f3aabf67a4, ip_allocation=immediate, mac_address=fa:16:3e:c7:62:68, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], 
created_at=2025-11-23T09:57:37Z, description=, dns_domain=, id=81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-328332667-network, port_security_enabled=True, project_id=a6dad5f4ea934cb2b33cba44987053be, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=33842, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=312, status=ACTIVE, subnets=['f8967804-12ce-482f-8265-4d6194cb2186'], tags=[], tenant_id=a6dad5f4ea934cb2b33cba44987053be, updated_at=2025-11-23T09:57:39Z, vlan_transparent=None, network_id=81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, port_security_enabled=False, project_id=a6dad5f4ea934cb2b33cba44987053be, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=352, status=DOWN, tags=[], tenant_id=a6dad5f4ea934cb2b33cba44987053be, updated_at=2025-11-23T09:57:45Z on network 81b87fa4-03b0-4cc9-9d5d-0d532d38f99d#033[00m Nov 23 04:57:46 localhost dnsmasq[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/addn_hosts - 1 addresses Nov 23 04:57:46 localhost dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/host Nov 23 04:57:46 localhost podman[310186]: 2025-11-23 09:57:46.899333364 +0000 UTC m=+0.061355774 container kill 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 
04:57:46 localhost dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/opts Nov 23 04:57:46 localhost nova_compute[281952]: 2025-11-23 09:57:46.990 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:47 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:47.268 263258 INFO neutron.agent.dhcp.agent [None req-59729e6a-680e-4ea1-b310-aa07945498c6 - - - - - -] DHCP configuration for ports {'cc4bfc60-3e56-4b71-a76c-62f3aabf67a4'} is completed#033[00m Nov 23 04:57:47 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:57:50 localhost nova_compute[281952]: 2025-11-23 09:57:50.819 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:57:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:57:51 localhost podman[310207]: 2025-11-23 09:57:51.034860488 +0000 UTC m=+0.087340247 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:57:51 localhost podman[310207]: 2025-11-23 09:57:51.042537813 +0000 UTC m=+0.095017602 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:57:51 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:57:51 localhost podman[310206]: 2025-11-23 09:57:51.128717104 +0000 UTC m=+0.184105442 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 23 04:57:51 localhost podman[310206]: 2025-11-23 09:57:51.1443069 +0000 UTC m=+0.199695258 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 04:57:51 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:57:51 localhost nova_compute[281952]: 2025-11-23 09:57:51.665 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:51 localhost dnsmasq[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/addn_hosts - 0 addresses Nov 23 04:57:51 localhost dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/host Nov 23 04:57:51 localhost podman[310265]: 2025-11-23 09:57:51.976784505 +0000 UTC m=+0.049546534 container kill 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 23 04:57:51 localhost dnsmasq-dhcp[310035]: read /var/lib/neutron/dhcp/81b87fa4-03b0-4cc9-9d5d-0d532d38f99d/opts Nov 23 04:57:52 localhost nova_compute[281952]: 2025-11-23 09:57:52.123 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:52 localhost ovn_controller[154788]: 2025-11-23T09:57:52Z|00082|binding|INFO|Releasing lport 55b3fe2f-21ef-4379-99f5-112f0dfef914 from this chassis (sb_readonly=0) Nov 23 04:57:52 localhost kernel: device tap55b3fe2f-21 left promiscuous mode Nov 23 04:57:52 localhost ovn_controller[154788]: 2025-11-23T09:57:52Z|00083|binding|INFO|Setting lport 55b3fe2f-21ef-4379-99f5-112f0dfef914 down in Southbound Nov 23 04:57:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:52.133 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched 
UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a6dad5f4ea934cb2b33cba44987053be', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9d6fde6-b084-4385-8650-ae0140b56791, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=55b3fe2f-21ef-4379-99f5-112f0dfef914) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:57:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:52.134 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 55b3fe2f-21ef-4379-99f5-112f0dfef914 in datapath 81b87fa4-03b0-4cc9-9d5d-0d532d38f99d unbound from our chassis#033[00m Nov 23 04:57:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:52.137 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 04:57:52 localhost 
ovn_metadata_agent[160434]: 2025-11-23 09:57:52.139 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[13165d96-6e54-4d0b-b8ab-f5c270ed1f75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:57:52 localhost nova_compute[281952]: 2025-11-23 09:57:52.141 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:52 localhost nova_compute[281952]: 2025-11-23 09:57:52.143 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:52 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:57:54 localhost neutron_sriov_agent[256124]: 2025-11-23 09:57:54.044 2 INFO neutron.agent.securitygroups_rpc [None req-c244d218-e6b7-4260-9702-7bb508c7ef68 9a28cb0574d148bf982a2a1a0b495020 a2148c18d8f24a6db12dc22c787e8b2e - - default default] Security group member updated ['ff44a28d-1e1f-4163-b206-fdf77022bf0b']#033[00m Nov 23 04:57:54 localhost ovn_controller[154788]: 2025-11-23T09:57:54Z|00084|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:57:54 localhost nova_compute[281952]: 2025-11-23 09:57:54.830 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e90 e90: 6 total, 6 up, 6 in Nov 23 04:57:55 localhost dnsmasq[310035]: exiting on receipt of SIGTERM Nov 23 04:57:55 localhost podman[310305]: 2025-11-23 09:57:55.704080336 +0000 UTC m=+0.059619061 container kill 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2) Nov 23 04:57:55 localhost systemd[1]: libpod-6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296.scope: Deactivated successfully. Nov 23 04:57:55 localhost podman[310318]: 2025-11-23 09:57:55.764851852 +0000 UTC m=+0.046315676 container died 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:57:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296-userdata-shm.mount: Deactivated successfully. 
Nov 23 04:57:55 localhost podman[310318]: 2025-11-23 09:57:55.801724427 +0000 UTC m=+0.083188201 container cleanup 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 23 04:57:55 localhost systemd[1]: libpod-conmon-6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296.scope: Deactivated successfully. Nov 23 04:57:55 localhost nova_compute[281952]: 2025-11-23 09:57:55.861 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:55 localhost podman[310319]: 2025-11-23 09:57:55.872550329 +0000 UTC m=+0.147677998 container remove 6514ad87a46159adeb3e52648a9fc865a08bff1950f459667d6797cb30358296 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81b87fa4-03b0-4cc9-9d5d-0d532d38f99d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 04:57:55 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:55.913 263258 INFO neutron.agent.dhcp.agent [None req-3dd77c87-1459-47cc-a2cf-f3d9dc34dd0b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:57:56 localhost neutron_dhcp_agent[263254]: 2025-11-23 
09:57:56.224 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:57:56 localhost systemd[1]: var-lib-containers-storage-overlay-1a72aa34fa3703c0feb66b78aa70334c94e6bb60891737ec5625b5eed8d5bdbf-merged.mount: Deactivated successfully. Nov 23 04:57:56 localhost systemd[1]: run-netns-qdhcp\x2d81b87fa4\x2d03b0\x2d4cc9\x2d9d5d\x2d0d532d38f99d.mount: Deactivated successfully. Nov 23 04:57:56 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:56.899 263258 INFO neutron.agent.linux.ip_lib [None req-97a4fc18-8858-4838-86d1-e88b9f78dbd9 - - - - - -] Device tap8c439e83-e9 cannot be used as it has no MAC address#033[00m Nov 23 04:57:56 localhost nova_compute[281952]: 2025-11-23 09:57:56.957 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:56 localhost kernel: device tap8c439e83-e9 entered promiscuous mode Nov 23 04:57:56 localhost NetworkManager[5975]: [1763891876.9652] manager: (tap8c439e83-e9): new Generic device (/org/freedesktop/NetworkManager/Devices/20) Nov 23 04:57:56 localhost ovn_controller[154788]: 2025-11-23T09:57:56Z|00085|binding|INFO|Claiming lport 8c439e83-e972-4e99-8d01-ff5269427a3c for this chassis. Nov 23 04:57:56 localhost ovn_controller[154788]: 2025-11-23T09:57:56Z|00086|binding|INFO|8c439e83-e972-4e99-8d01-ff5269427a3c: Claiming unknown Nov 23 04:57:56 localhost nova_compute[281952]: 2025-11-23 09:57:56.966 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:56 localhost systemd-udevd[310357]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 04:57:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:56.981 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2148c18d8f24a6db12dc22c787e8b2e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e3b2035-d1e3-4dc9-824d-c8c5d8c83090, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8c439e83-e972-4e99-8d01-ff5269427a3c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:57:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:56.983 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8c439e83-e972-4e99-8d01-ff5269427a3c in datapath 8cd987c4-7e4e-467f-9ee2-d70cb75b87c3 bound to our chassis#033[00m Nov 23 04:57:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:56.984 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8cd987c4-7e4e-467f-9ee2-d70cb75b87c3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 04:57:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:57:56.985 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ce4fb42e-106e-4b13-a2fb-3da42b3adf25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:57:57 localhost journal[230249]: ethtool ioctl error on tap8c439e83-e9: No such device Nov 23 04:57:57 localhost ovn_controller[154788]: 2025-11-23T09:57:57Z|00087|binding|INFO|Setting lport 8c439e83-e972-4e99-8d01-ff5269427a3c ovn-installed in OVS Nov 23 04:57:57 localhost ovn_controller[154788]: 2025-11-23T09:57:57Z|00088|binding|INFO|Setting lport 8c439e83-e972-4e99-8d01-ff5269427a3c up in Southbound Nov 23 04:57:57 localhost nova_compute[281952]: 2025-11-23 09:57:57.006 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:57 localhost journal[230249]: ethtool ioctl error on tap8c439e83-e9: No such device Nov 23 04:57:57 localhost journal[230249]: ethtool ioctl error on tap8c439e83-e9: No such device Nov 23 04:57:57 localhost journal[230249]: ethtool ioctl error on tap8c439e83-e9: No such device Nov 23 04:57:57 localhost journal[230249]: ethtool ioctl error on tap8c439e83-e9: No such device Nov 23 04:57:57 localhost journal[230249]: ethtool ioctl error on tap8c439e83-e9: No such device Nov 23 04:57:57 localhost journal[230249]: ethtool ioctl error on tap8c439e83-e9: No such device Nov 23 04:57:57 localhost journal[230249]: ethtool ioctl error on tap8c439e83-e9: No such device Nov 23 04:57:57 localhost nova_compute[281952]: 2025-11-23 09:57:57.042 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:57 localhost nova_compute[281952]: 2025-11-23 09:57:57.068 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:57:57 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:57:57 localhost podman[310428]: Nov 23 04:57:57 localhost podman[310428]: 2025-11-23 09:57:57.920496041 +0000 UTC m=+0.091383080 container create ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 23 04:57:57 localhost systemd[1]: Started libpod-conmon-ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8.scope. Nov 23 04:57:57 localhost systemd[1]: Started libcrun container. 
Nov 23 04:57:57 localhost podman[310428]: 2025-11-23 09:57:57.875823328 +0000 UTC m=+0.046710387 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 04:57:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/067ea74ee71831860e0346c8368bf66a295b3b34ee14dc8629622033be014e0a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:57:57 localhost podman[310428]: 2025-11-23 09:57:57.986635911 +0000 UTC m=+0.157522950 container init ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:57:57 localhost podman[310428]: 2025-11-23 09:57:57.997477742 +0000 UTC m=+0.168364771 container start ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:57:58 localhost dnsmasq[310446]: started, version 2.85 cachesize 150 Nov 23 04:57:58 localhost dnsmasq[310446]: DNS service limited to local subnets Nov 23 04:57:58 localhost dnsmasq[310446]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 04:57:58 localhost dnsmasq[310446]: warning: no upstream servers configured Nov 23 04:57:58 localhost dnsmasq-dhcp[310446]: DHCP, static leases only on 19.80.0.0, lease time 1d Nov 23 04:57:58 localhost dnsmasq[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/addn_hosts - 0 addresses Nov 23 04:57:58 localhost dnsmasq-dhcp[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/host Nov 23 04:57:58 localhost dnsmasq-dhcp[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/opts Nov 23 04:57:58 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:58.332 263258 INFO neutron.agent.dhcp.agent [None req-aa114eb8-c645-45ca-aee4-97d1e9730f4f - - - - - -] DHCP configuration for ports {'6df03061-a46e-4f2d-b42f-4f149f759e31'} is completed#033[00m Nov 23 04:57:58 localhost neutron_sriov_agent[256124]: 2025-11-23 09:57:58.480 2 INFO neutron.agent.securitygroups_rpc [None req-d68d9a06-c0e6-4bcf-9e27-a376d467ec2a 9a28cb0574d148bf982a2a1a0b495020 a2148c18d8f24a6db12dc22c787e8b2e - - default default] Security group member updated ['ff44a28d-1e1f-4163-b206-fdf77022bf0b']#033[00m Nov 23 04:57:58 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:58.509 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:57:57Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b779be61-5809-44a6-8395-bfdf8254b4cc, ip_allocation=immediate, mac_address=fa:16:3e:e3:5d:7d, name=tempest-subport-711090127, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:57:54Z, description=, dns_domain=, 
id=8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-881047405, port_security_enabled=True, project_id=a2148c18d8f24a6db12dc22c787e8b2e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7641, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=410, status=ACTIVE, subnets=['62dffd83-97b3-49ae-a870-a9bc062f1cbb'], tags=[], tenant_id=a2148c18d8f24a6db12dc22c787e8b2e, updated_at=2025-11-23T09:57:55Z, vlan_transparent=None, network_id=8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, port_security_enabled=True, project_id=a2148c18d8f24a6db12dc22c787e8b2e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ff44a28d-1e1f-4163-b206-fdf77022bf0b'], standard_attr_id=430, status=DOWN, tags=[], tenant_id=a2148c18d8f24a6db12dc22c787e8b2e, updated_at=2025-11-23T09:57:58Z on network 8cd987c4-7e4e-467f-9ee2-d70cb75b87c3#033[00m Nov 23 04:57:58 localhost dnsmasq[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/addn_hosts - 1 addresses Nov 23 04:57:58 localhost dnsmasq-dhcp[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/host Nov 23 04:57:58 localhost dnsmasq-dhcp[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/opts Nov 23 04:57:58 localhost podman[310462]: 2025-11-23 09:57:58.727038535 +0000 UTC m=+0.057615000 container kill ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:57:59 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:57:59.239 263258 INFO neutron.agent.dhcp.agent [None req-f336ca04-ed79-4114-8df0-7a40d4e366db - - - - - -] DHCP configuration for ports {'b779be61-5809-44a6-8395-bfdf8254b4cc'} is completed#033[00m Nov 23 04:57:59 localhost openstack_network_exporter[242668]: ERROR 09:57:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:57:59 localhost openstack_network_exporter[242668]: ERROR 09:57:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:57:59 localhost openstack_network_exporter[242668]: ERROR 09:57:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:57:59 localhost openstack_network_exporter[242668]: ERROR 09:57:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:57:59 localhost openstack_network_exporter[242668]: Nov 23 04:57:59 localhost openstack_network_exporter[242668]: ERROR 09:57:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:57:59 localhost openstack_network_exporter[242668]: Nov 23 04:58:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e91 e91: 6 total, 6 up, 6 in Nov 23 04:58:00 localhost nova_compute[281952]: 2025-11-23 09:58:00.917 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 04:58:01 localhost podman[310483]: 2025-11-23 09:58:01.032989284 +0000 UTC m=+0.084250014 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:58:01 localhost podman[310483]: 2025-11-23 09:58:01.070086657 +0000 UTC m=+0.121347417 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 04:58:01 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 04:58:01 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:01.902 263258 INFO neutron.agent.linux.ip_lib [None req-5c65c8ed-2305-4d00-9b34-36b1bd8ce7dd - - - - - -] Device tapb3d2d8f1-5b cannot be used as it has no MAC address#033[00m Nov 23 04:58:01 localhost nova_compute[281952]: 2025-11-23 09:58:01.924 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:01 localhost kernel: device tapb3d2d8f1-5b entered promiscuous mode Nov 23 04:58:01 localhost nova_compute[281952]: 2025-11-23 09:58:01.931 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:01 localhost ovn_controller[154788]: 2025-11-23T09:58:01Z|00089|binding|INFO|Claiming lport b3d2d8f1-5bd4-4472-8663-88ab24ce0d37 for this chassis. Nov 23 04:58:01 localhost NetworkManager[5975]: [1763891881.9320] manager: (tapb3d2d8f1-5b): new Generic device (/org/freedesktop/NetworkManager/Devices/21) Nov 23 04:58:01 localhost ovn_controller[154788]: 2025-11-23T09:58:01Z|00090|binding|INFO|b3d2d8f1-5bd4-4472-8663-88ab24ce0d37: Claiming unknown Nov 23 04:58:01 localhost systemd-udevd[310512]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 04:58:01 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:01.945 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-dbe92c68-f688-4288-aaf4-6edc728d68bf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dbe92c68-f688-4288-aaf4-6edc728d68bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '103398a293414a3081333eb24455a6bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f945a9c-d59a-418c-831c-76be9f4ae46a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b3d2d8f1-5bd4-4472-8663-88ab24ce0d37) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:58:01 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:01.947 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b3d2d8f1-5bd4-4472-8663-88ab24ce0d37 in datapath dbe92c68-f688-4288-aaf4-6edc728d68bf bound to our chassis#033[00m Nov 23 04:58:01 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:01.950 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6c6697f5-de38-4217-b0e1-a5abd4eb4fe0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 04:58:01 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:01.950 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dbe92c68-f688-4288-aaf4-6edc728d68bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 04:58:01 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:01.954 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[37615a6f-3668-4a79-9e2b-72ba9e322bba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:01 localhost journal[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device Nov 23 04:58:01 localhost ovn_controller[154788]: 2025-11-23T09:58:01Z|00091|binding|INFO|Setting lport b3d2d8f1-5bd4-4472-8663-88ab24ce0d37 ovn-installed in OVS Nov 23 04:58:01 localhost ovn_controller[154788]: 2025-11-23T09:58:01Z|00092|binding|INFO|Setting lport b3d2d8f1-5bd4-4472-8663-88ab24ce0d37 up in Southbound Nov 23 04:58:01 localhost journal[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device Nov 23 04:58:01 localhost nova_compute[281952]: 2025-11-23 09:58:01.970 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:01 localhost journal[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device Nov 23 04:58:01 localhost journal[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device Nov 23 04:58:01 localhost journal[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device Nov 23 04:58:01 localhost journal[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device Nov 23 04:58:01 localhost journal[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device Nov 23 04:58:01 localhost journal[230249]: ethtool ioctl error on tapb3d2d8f1-5b: No such device Nov 
23 04:58:02 localhost nova_compute[281952]: 2025-11-23 09:58:02.007 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:02 localhost nova_compute[281952]: 2025-11-23 09:58:02.035 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:58:02 localhost podman[310546]: 2025-11-23 09:58:02.295326652 +0000 UTC m=+0.088163852 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:58:02 localhost podman[310546]: 2025-11-23 09:58:02.333392594 +0000 UTC m=+0.126229784 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , 
managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:58:02 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:58:02 localhost podman[310605]: Nov 23 04:58:02 localhost podman[310605]: 2025-11-23 09:58:02.857326369 +0000 UTC m=+0.090473983 container create 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 04:58:02 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:58:02 localhost systemd[1]: Started libpod-conmon-0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d.scope. 
Nov 23 04:58:02 localhost podman[310605]: 2025-11-23 09:58:02.81083064 +0000 UTC m=+0.043978284 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 04:58:02 localhost systemd[1]: Started libcrun container. Nov 23 04:58:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95375b4c449ae49e7d86df09c61dc1d8f60e640130c105c535ef71f0e1c26dfc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:58:02 localhost podman[310605]: 2025-11-23 09:58:02.935057352 +0000 UTC m=+0.168204956 container init 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:58:02 localhost podman[310605]: 2025-11-23 09:58:02.944288894 +0000 UTC m=+0.177436518 container start 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 23 04:58:02 localhost dnsmasq[310624]: started, version 2.85 cachesize 150 Nov 23 04:58:02 localhost dnsmasq[310624]: DNS service limited to local subnets Nov 23 04:58:02 localhost 
dnsmasq[310624]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 04:58:02 localhost dnsmasq[310624]: warning: no upstream servers configured Nov 23 04:58:02 localhost dnsmasq-dhcp[310624]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 04:58:02 localhost dnsmasq[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/addn_hosts - 0 addresses Nov 23 04:58:02 localhost dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/host Nov 23 04:58:02 localhost dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/opts Nov 23 04:58:03 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:03.153 263258 INFO neutron.agent.dhcp.agent [None req-d6a567b3-94f9-4b2f-8762-e28c321912fc - - - - - -] DHCP configuration for ports {'aba6d273-b423-4b6b-b3f0-7ecf19405434'} is completed#033[00m Nov 23 04:58:03 localhost nova_compute[281952]: 2025-11-23 09:58:03.155 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:04.987 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:58:04Z, description=, device_id=caa365d3-aa93-4c7c-a692-c3fee4872fc2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=9d830597-f26f-4a70-b1b8-39d71caf458e, ip_allocation=immediate, mac_address=fa:16:3e:89:8e:29, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:57:59Z, description=, dns_domain=, 
id=dbe92c68-f688-4288-aaf4-6edc728d68bf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesNegativeTestJSON-1552785166-network, port_security_enabled=True, project_id=103398a293414a3081333eb24455a6bd, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20423, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=447, status=ACTIVE, subnets=['c372ea8d-cfdf-4713-bee5-0b10a9ac63ab'], tags=[], tenant_id=103398a293414a3081333eb24455a6bd, updated_at=2025-11-23T09:58:00Z, vlan_transparent=None, network_id=dbe92c68-f688-4288-aaf4-6edc728d68bf, port_security_enabled=False, project_id=103398a293414a3081333eb24455a6bd, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=492, status=DOWN, tags=[], tenant_id=103398a293414a3081333eb24455a6bd, updated_at=2025-11-23T09:58:04Z on network dbe92c68-f688-4288-aaf4-6edc728d68bf#033[00m Nov 23 04:58:05 localhost dnsmasq[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/addn_hosts - 1 addresses Nov 23 04:58:05 localhost dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/host Nov 23 04:58:05 localhost dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/opts Nov 23 04:58:05 localhost podman[310642]: 2025-11-23 09:58:05.185170336 +0000 UTC m=+0.062902782 container kill 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 04:58:05 localhost nova_compute[281952]: 2025-11-23 09:58:05.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:58:05 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:05.499 263258 INFO neutron.agent.dhcp.agent [None req-74be8323-216f-4023-b121-5a43ec778f23 - - - - - -] DHCP configuration for ports {'9d830597-f26f-4a70-b1b8-39d71caf458e'} is completed#033[00m Nov 23 04:58:05 localhost nova_compute[281952]: 2025-11-23 09:58:05.980 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:06 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:06.519 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:58:04Z, description=, device_id=caa365d3-aa93-4c7c-a692-c3fee4872fc2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=9d830597-f26f-4a70-b1b8-39d71caf458e, ip_allocation=immediate, mac_address=fa:16:3e:89:8e:29, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:57:59Z, description=, dns_domain=, id=dbe92c68-f688-4288-aaf4-6edc728d68bf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesNegativeTestJSON-1552785166-network, port_security_enabled=True, project_id=103398a293414a3081333eb24455a6bd, provider:network_type=geneve, provider:physical_network=None, 
provider:segmentation_id=20423, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=447, status=ACTIVE, subnets=['c372ea8d-cfdf-4713-bee5-0b10a9ac63ab'], tags=[], tenant_id=103398a293414a3081333eb24455a6bd, updated_at=2025-11-23T09:58:00Z, vlan_transparent=None, network_id=dbe92c68-f688-4288-aaf4-6edc728d68bf, port_security_enabled=False, project_id=103398a293414a3081333eb24455a6bd, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=492, status=DOWN, tags=[], tenant_id=103398a293414a3081333eb24455a6bd, updated_at=2025-11-23T09:58:04Z on network dbe92c68-f688-4288-aaf4-6edc728d68bf#033[00m Nov 23 04:58:06 localhost dnsmasq[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/addn_hosts - 1 addresses Nov 23 04:58:06 localhost dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/host Nov 23 04:58:06 localhost podman[310681]: 2025-11-23 09:58:06.726000097 +0000 UTC m=+0.062973804 container kill 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 04:58:06 localhost dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/opts Nov 23 04:58:06 localhost nova_compute[281952]: 2025-11-23 09:58:06.759 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:06 localhost 
neutron_dhcp_agent[263254]: 2025-11-23 09:58:06.997 263258 INFO neutron.agent.dhcp.agent [None req-bfd1472c-9df8-4506-bb99-77d295303bc5 - - - - - -] DHCP configuration for ports {'9d830597-f26f-4a70-b1b8-39d71caf458e'} is completed#033[00m Nov 23 04:58:07 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 e92: 6 total, 6 up, 6 in Nov 23 04:58:07 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:58:09 localhost nova_compute[281952]: 2025-11-23 09:58:09.233 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:58:09 localhost nova_compute[281952]: 2025-11-23 09:58:09.234 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:58:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:09.298 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:58:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:09.298 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:58:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:09.299 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" 
"released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:58:10 localhost nova_compute[281952]: 2025-11-23 09:58:10.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:58:10 localhost nova_compute[281952]: 2025-11-23 09:58:10.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:58:10 localhost nova_compute[281952]: 2025-11-23 09:58:10.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:58:10 localhost nova_compute[281952]: 2025-11-23 09:58:10.319 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:58:10 localhost nova_compute[281952]: 2025-11-23 09:58:10.319 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:58:10 localhost nova_compute[281952]: 2025-11-23 09:58:10.319 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network 
info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:58:10 localhost nova_compute[281952]: 2025-11-23 09:58:10.320 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:58:10 localhost neutron_sriov_agent[256124]: 2025-11-23 09:58:10.587 2 INFO neutron.agent.securitygroups_rpc [req-3532c496-51d7-40c7-b3da-c0e7be1692a4 req-d87dead5-02e8-46e7-bd25-42d652af07f6 b79ba98acc3c4b3580a3847feb119c9b 103398a293414a3081333eb24455a6bd - - default default] Security group rule updated ['280efa91-c004-412c-b87a-91a6eef9493c']#033[00m Nov 23 04:58:10 localhost sshd[310701]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:58:11 localhost nova_compute[281952]: 2025-11-23 09:58:11.001 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": 
"ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:58:11 localhost nova_compute[281952]: 2025-11-23 09:58:11.011 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:11 localhost nova_compute[281952]: 2025-11-23 09:58:11.017 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:58:11 localhost nova_compute[281952]: 2025-11-23 09:58:11.017 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:58:11 localhost neutron_sriov_agent[256124]: 2025-11-23 09:58:11.082 2 INFO neutron.agent.securitygroups_rpc [req-ce34b552-7369-4724-9e28-4e57bb3059bd req-8a77dff8-1df4-4326-b30f-4088438850bd b79ba98acc3c4b3580a3847feb119c9b 103398a293414a3081333eb24455a6bd - - default default] Security group rule updated ['280efa91-c004-412c-b87a-91a6eef9493c']#033[00m Nov 23 04:58:11 localhost nova_compute[281952]: 2025-11-23 09:58:11.217 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:11 localhost podman[240668]: time="2025-11-23T09:58:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:58:11 
localhost podman[240668]: @ - - [23/Nov/2025:09:58:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157508 "" "Go-http-client/1.1" Nov 23 04:58:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:58:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:58:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:58:11 localhost podman[240668]: @ - - [23/Nov/2025:09:58:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19686 "" "Go-http-client/1.1" Nov 23 04:58:12 localhost systemd[1]: tmp-crun.nJuNvQ.mount: Deactivated successfully. Nov 23 04:58:12 localhost podman[310703]: 2025-11-23 09:58:12.088725948 +0000 UTC m=+0.140606223 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 04:58:12 localhost podman[310705]: 2025-11-23 09:58:12.099547759 +0000 UTC m=+0.151594059 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, 
build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, vcs-type=git) Nov 23 04:58:12 localhost podman[310705]: 2025-11-23 09:58:12.143346315 +0000 UTC m=+0.195392575 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter) Nov 23 04:58:12 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:58:12 localhost podman[310703]: 2025-11-23 09:58:12.170457443 +0000 UTC m=+0.222337708 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 04:58:12 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:58:12 localhost podman[310704]: 2025-11-23 09:58:12.149321418 +0000 UTC m=+0.202459432 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Nov 23 04:58:12 localhost nova_compute[281952]: 2025-11-23 09:58:12.213 281956 
DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:58:12 localhost nova_compute[281952]: 2025-11-23 09:58:12.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:58:12 localhost podman[310704]: 2025-11-23 09:58:12.230240858 +0000 UTC m=+0.283378892 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 23 04:58:12 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:58:12 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:58:13 localhost nova_compute[281952]: 2025-11-23 09:58:13.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:58:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:13.421 263258 INFO neutron.agent.linux.ip_lib [None req-d54484ed-6ef1-4ccc-9b87-1b8f2a65c516 - - - - - -] Device tapd6f3b7ff-1b cannot be used as it has no MAC address#033[00m Nov 23 04:58:13 localhost nova_compute[281952]: 2025-11-23 09:58:13.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:13 localhost kernel: device tapd6f3b7ff-1b entered promiscuous mode Nov 23 04:58:13 localhost ovn_controller[154788]: 2025-11-23T09:58:13Z|00093|binding|INFO|Claiming lport d6f3b7ff-1bfe-4568-bcbd-2732186dba70 for this chassis. 
Nov 23 04:58:13 localhost ovn_controller[154788]: 2025-11-23T09:58:13Z|00094|binding|INFO|d6f3b7ff-1bfe-4568-bcbd-2732186dba70: Claiming unknown Nov 23 04:58:13 localhost NetworkManager[5975]: [1763891893.5052] manager: (tapd6f3b7ff-1b): new Generic device (/org/freedesktop/NetworkManager/Devices/22) Nov 23 04:58:13 localhost nova_compute[281952]: 2025-11-23 09:58:13.506 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:13 localhost systemd-udevd[310775]: Network interface NamePolicy= disabled on kernel command line. Nov 23 04:58:13 localhost nova_compute[281952]: 2025-11-23 09:58:13.517 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:13.515 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-c5d88dfa-0db8-489e-a45a-e843e31a3b26', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c5d88dfa-0db8-489e-a45a-e843e31a3b26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0497de4959b2494e8036eb39226430d6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=54e00d1b-ba48-40e5-8228-7e38f918fa79, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d6f3b7ff-1bfe-4568-bcbd-2732186dba70) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:58:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:13.516 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d6f3b7ff-1bfe-4568-bcbd-2732186dba70 in datapath c5d88dfa-0db8-489e-a45a-e843e31a3b26 bound to our chassis#033[00m Nov 23 04:58:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:13.518 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port ebd9d8fb-8dbf-4955-903c-af75d19c361c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 04:58:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:13.518 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c5d88dfa-0db8-489e-a45a-e843e31a3b26, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 04:58:13 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:13.519 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[b450dd71-ba31-44a6-a8f3-014f03e6d96c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:13 localhost nova_compute[281952]: 2025-11-23 09:58:13.533 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:13 localhost journal[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device Nov 23 04:58:13 localhost ovn_controller[154788]: 2025-11-23T09:58:13Z|00095|binding|INFO|Setting lport d6f3b7ff-1bfe-4568-bcbd-2732186dba70 ovn-installed in OVS Nov 23 04:58:13 localhost ovn_controller[154788]: 
2025-11-23T09:58:13Z|00096|binding|INFO|Setting lport d6f3b7ff-1bfe-4568-bcbd-2732186dba70 up in Southbound Nov 23 04:58:13 localhost journal[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device Nov 23 04:58:13 localhost nova_compute[281952]: 2025-11-23 09:58:13.540 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:13 localhost journal[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device Nov 23 04:58:13 localhost journal[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device Nov 23 04:58:13 localhost journal[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device Nov 23 04:58:13 localhost journal[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device Nov 23 04:58:13 localhost journal[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device Nov 23 04:58:13 localhost journal[230249]: ethtool ioctl error on tapd6f3b7ff-1b: No such device Nov 23 04:58:13 localhost nova_compute[281952]: 2025-11-23 09:58:13.589 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:13 localhost nova_compute[281952]: 2025-11-23 09:58:13.617 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:14 localhost nova_compute[281952]: 2025-11-23 09:58:14.183 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:14 localhost nova_compute[281952]: 2025-11-23 09:58:14.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:58:14 localhost 
podman[310846]: Nov 23 04:58:14 localhost podman[310846]: 2025-11-23 09:58:14.523241812 +0000 UTC m=+0.097531609 container create 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:58:14 localhost podman[310846]: 2025-11-23 09:58:14.476000989 +0000 UTC m=+0.050290876 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 04:58:14 localhost systemd[1]: Started libpod-conmon-930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf.scope. Nov 23 04:58:14 localhost systemd[1]: tmp-crun.vWVlD8.mount: Deactivated successfully. Nov 23 04:58:14 localhost systemd[1]: Started libcrun container. 
Nov 23 04:58:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e36441351384cf4ce607d42d9d2a693f2618154ca741337c33c1071de55a0ca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:58:14 localhost podman[310846]: 2025-11-23 09:58:14.618987174 +0000 UTC m=+0.193276971 container init 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:58:14 localhost podman[310846]: 2025-11-23 09:58:14.627909877 +0000 UTC m=+0.202199674 container start 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 04:58:14 localhost dnsmasq[310864]: started, version 2.85 cachesize 150 Nov 23 04:58:14 localhost dnsmasq[310864]: DNS service limited to local subnets Nov 23 04:58:14 localhost dnsmasq[310864]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 04:58:14 localhost dnsmasq[310864]: warning: no upstream servers 
configured Nov 23 04:58:14 localhost dnsmasq-dhcp[310864]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 04:58:14 localhost dnsmasq[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/addn_hosts - 0 addresses Nov 23 04:58:14 localhost dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/host Nov 23 04:58:14 localhost dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/opts Nov 23 04:58:14 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:14.785 263258 INFO neutron.agent.dhcp.agent [None req-96f8d984-f7ca-4ade-ab26-f0f0f3ddceed - - - - - -] DHCP configuration for ports {'a8a61203-fe2e-4005-bcf2-6150709eadea'} is completed#033[00m Nov 23 04:58:15 localhost nova_compute[281952]: 2025-11-23 09:58:15.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:58:15 localhost nova_compute[281952]: 2025-11-23 09:58:15.233 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:58:15 localhost nova_compute[281952]: 2025-11-23 09:58:15.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:58:15 localhost nova_compute[281952]: 2025-11-23 09:58:15.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:58:15 localhost nova_compute[281952]: 2025-11-23 09:58:15.234 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:58:15 localhost nova_compute[281952]: 2025-11-23 09:58:15.235 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:58:15 localhost systemd[1]: tmp-crun.2d6Teh.mount: Deactivated successfully. 
Nov 23 04:58:15 localhost dnsmasq[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/addn_hosts - 0 addresses Nov 23 04:58:15 localhost dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/host Nov 23 04:58:15 localhost dnsmasq-dhcp[310624]: read /var/lib/neutron/dhcp/dbe92c68-f688-4288-aaf4-6edc728d68bf/opts Nov 23 04:58:15 localhost podman[310899]: 2025-11-23 09:58:15.557472046 +0000 UTC m=+0.065540752 container kill 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:58:15 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:58:15 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1789582883' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:58:15 localhost nova_compute[281952]: 2025-11-23 09:58:15.742 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:58:15 localhost ovn_controller[154788]: 2025-11-23T09:58:15Z|00097|binding|INFO|Releasing lport b3d2d8f1-5bd4-4472-8663-88ab24ce0d37 from this chassis (sb_readonly=0) Nov 23 04:58:15 localhost ovn_controller[154788]: 2025-11-23T09:58:15Z|00098|binding|INFO|Setting lport b3d2d8f1-5bd4-4472-8663-88ab24ce0d37 down in Southbound Nov 23 04:58:15 localhost kernel: device tapb3d2d8f1-5b left promiscuous mode Nov 23 04:58:15 localhost nova_compute[281952]: 2025-11-23 09:58:15.784 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:15 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:15.792 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-dbe92c68-f688-4288-aaf4-6edc728d68bf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dbe92c68-f688-4288-aaf4-6edc728d68bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '103398a293414a3081333eb24455a6bd', 
'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f945a9c-d59a-418c-831c-76be9f4ae46a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b3d2d8f1-5bd4-4472-8663-88ab24ce0d37) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:58:15 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:15.793 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b3d2d8f1-5bd4-4472-8663-88ab24ce0d37 in datapath dbe92c68-f688-4288-aaf4-6edc728d68bf unbound from our chassis#033[00m Nov 23 04:58:15 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:15.795 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dbe92c68-f688-4288-aaf4-6edc728d68bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 04:58:15 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:15.799 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ba65c446-ac5d-4822-803f-31fb7c072ee4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:15 localhost nova_compute[281952]: 2025-11-23 09:58:15.804 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:15 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:15.858 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, 
created_at=2025-11-23T09:58:15Z, description=, device_id=489975b8-b64c-4318-bb24-a798d93046de, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a8e35ae6-dcac-4b38-bbda-9adbf81f0b7e, ip_allocation=immediate, mac_address=fa:16:3e:fb:9f:ce, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:58:11Z, description=, dns_domain=, id=c5d88dfa-0db8-489e-a45a-e843e31a3b26, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1246892017-network, port_security_enabled=True, project_id=0497de4959b2494e8036eb39226430d6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8214, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=532, status=ACTIVE, subnets=['c6867cf5-ed63-4cec-9755-38eaead8ab16'], tags=[], tenant_id=0497de4959b2494e8036eb39226430d6, updated_at=2025-11-23T09:58:11Z, vlan_transparent=None, network_id=c5d88dfa-0db8-489e-a45a-e843e31a3b26, port_security_enabled=False, project_id=0497de4959b2494e8036eb39226430d6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=586, status=DOWN, tags=[], tenant_id=0497de4959b2494e8036eb39226430d6, updated_at=2025-11-23T09:58:15Z on network c5d88dfa-0db8-489e-a45a-e843e31a3b26#033[00m Nov 23 04:58:15 localhost nova_compute[281952]: 2025-11-23 09:58:15.878 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:58:15 localhost nova_compute[281952]: 2025-11-23 09:58:15.879 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] 
skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:58:16 localhost nova_compute[281952]: 2025-11-23 09:58:16.014 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:16 localhost nova_compute[281952]: 2025-11-23 09:58:16.054 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:58:16 localhost nova_compute[281952]: 2025-11-23 09:58:16.055 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11375MB free_disk=41.774322509765625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:58:16 localhost nova_compute[281952]: 2025-11-23 09:58:16.056 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:58:16 localhost nova_compute[281952]: 2025-11-23 09:58:16.056 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:58:16 localhost dnsmasq[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/addn_hosts - 1 addresses Nov 23 04:58:16 localhost systemd[1]: tmp-crun.OItiOf.mount: Deactivated successfully. 
Nov 23 04:58:16 localhost dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/host Nov 23 04:58:16 localhost dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/opts Nov 23 04:58:16 localhost podman[310941]: 2025-11-23 09:58:16.087932551 +0000 UTC m=+0.071883956 container kill 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 23 04:58:16 localhost nova_compute[281952]: 2025-11-23 09:58:16.277 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:58:16 localhost nova_compute[281952]: 2025-11-23 09:58:16.278 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:58:16 localhost nova_compute[281952]: 2025-11-23 09:58:16.279 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:58:16 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:16.350 263258 INFO neutron.agent.dhcp.agent [None req-955c50bf-858a-4c6e-af2e-94778d49fba6 - - - - - -] DHCP configuration for ports {'a8e35ae6-dcac-4b38-bbda-9adbf81f0b7e'} is completed#033[00m Nov 23 04:58:16 localhost nova_compute[281952]: 2025-11-23 09:58:16.609 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:58:17 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:58:17 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/3516973189' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:58:17 localhost nova_compute[281952]: 2025-11-23 09:58:17.053 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:58:17 localhost nova_compute[281952]: 2025-11-23 09:58:17.060 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:58:17 localhost nova_compute[281952]: 2025-11-23 09:58:17.077 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:58:17 localhost nova_compute[281952]: 2025-11-23 09:58:17.095 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:58:17 localhost nova_compute[281952]: 2025-11-23 09:58:17.096 281956 DEBUG 
oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:58:17 localhost nova_compute[281952]: 2025-11-23 09:58:17.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:58:17 localhost nova_compute[281952]: 2025-11-23 09:58:17.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:58:17 localhost nova_compute[281952]: 2025-11-23 09:58:17.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:58:17 localhost nova_compute[281952]: 2025-11-23 09:58:17.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 23 04:58:17 localhost nova_compute[281952]: 2025-11-23 09:58:17.230 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 23 04:58:18 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:17.859 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for 
port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:58:15Z, description=, device_id=489975b8-b64c-4318-bb24-a798d93046de, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a8e35ae6-dcac-4b38-bbda-9adbf81f0b7e, ip_allocation=immediate, mac_address=fa:16:3e:fb:9f:ce, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:58:11Z, description=, dns_domain=, id=c5d88dfa-0db8-489e-a45a-e843e31a3b26, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1246892017-network, port_security_enabled=True, project_id=0497de4959b2494e8036eb39226430d6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8214, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=532, status=ACTIVE, subnets=['c6867cf5-ed63-4cec-9755-38eaead8ab16'], tags=[], tenant_id=0497de4959b2494e8036eb39226430d6, updated_at=2025-11-23T09:58:11Z, vlan_transparent=None, network_id=c5d88dfa-0db8-489e-a45a-e843e31a3b26, port_security_enabled=False, project_id=0497de4959b2494e8036eb39226430d6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=586, status=DOWN, tags=[], tenant_id=0497de4959b2494e8036eb39226430d6, updated_at=2025-11-23T09:58:15Z on network c5d88dfa-0db8-489e-a45a-e843e31a3b26#033[00m Nov 23 04:58:18 localhost dnsmasq[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/addn_hosts - 1 addresses Nov 23 04:58:18 localhost dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/host Nov 23 04:58:18 localhost dnsmasq-dhcp[310864]: read 
/var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/opts Nov 23 04:58:18 localhost podman[310999]: 2025-11-23 09:58:18.228940704 +0000 UTC m=+0.200285706 container kill 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 04:58:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:58:18 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:18.397 263258 INFO neutron.agent.dhcp.agent [None req-8185762f-af93-4602-9a04-39972f389d15 - - - - - -] DHCP configuration for ports {'a8e35ae6-dcac-4b38-bbda-9adbf81f0b7e'} is completed#033[00m Nov 23 04:58:18 localhost ovn_controller[154788]: 2025-11-23T09:58:18Z|00099|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:58:18 localhost nova_compute[281952]: 2025-11-23 09:58:18.724 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:19 localhost nova_compute[281952]: 2025-11-23 09:58:19.193 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:19 localhost dnsmasq[310624]: exiting on receipt of SIGTERM Nov 23 04:58:19 localhost systemd[1]: tmp-crun.msWSXe.mount: Deactivated successfully. 
Nov 23 04:58:19 localhost systemd[1]: libpod-0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d.scope: Deactivated successfully. Nov 23 04:58:19 localhost podman[311037]: 2025-11-23 09:58:19.45943014 +0000 UTC m=+0.059337112 container kill 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 23 04:58:19 localhost podman[311051]: 2025-11-23 09:58:19.52692046 +0000 UTC m=+0.054380310 container died 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:58:19 localhost podman[311051]: 2025-11-23 09:58:19.561525036 +0000 UTC m=+0.088984846 container cleanup 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, 
org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 04:58:19 localhost systemd[1]: libpod-conmon-0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d.scope: Deactivated successfully. Nov 23 04:58:19 localhost podman[311056]: 2025-11-23 09:58:19.612105831 +0000 UTC m=+0.129762563 container remove 0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-dbe92c68-f688-4288-aaf4-6edc728d68bf, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 04:58:19 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:19.958 263258 INFO neutron.agent.dhcp.agent [None req-13acc26b-0b5a-47d5-91ff-55db475188c1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:58:19 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:19.959 263258 INFO neutron.agent.dhcp.agent [None req-13acc26b-0b5a-47d5-91ff-55db475188c1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:58:20 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:20.332 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:58:20 localhost systemd[1]: var-lib-containers-storage-overlay-95375b4c449ae49e7d86df09c61dc1d8f60e640130c105c535ef71f0e1c26dfc-merged.mount: Deactivated successfully. 
Nov 23 04:58:20 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0cd345d8e0cfd9a77b7fec03017b1c712878f5cc0774c41f19f47d5386de1f4d-userdata-shm.mount: Deactivated successfully. Nov 23 04:58:20 localhost systemd[1]: run-netns-qdhcp\x2ddbe92c68\x2df688\x2d4288\x2daaf4\x2d6edc728d68bf.mount: Deactivated successfully. Nov 23 04:58:21 localhost nova_compute[281952]: 2025-11-23 09:58:21.017 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:21 localhost neutron_sriov_agent[256124]: 2025-11-23 09:58:21.562 2 INFO neutron.agent.securitygroups_rpc [None req-513d9ac5-08dd-4555-997f-809230181da7 492e2909a77a4032ab6c29a26d12fb14 0497de4959b2494e8036eb39226430d6 - - default default] Security group rule updated ['2da1104f-77c5-475e-b21f-e52710edc8b5']#033[00m Nov 23 04:58:21 localhost neutron_sriov_agent[256124]: 2025-11-23 09:58:21.713 2 INFO neutron.agent.securitygroups_rpc [None req-7741ab62-6798-4d09-b205-555af43d015d 492e2909a77a4032ab6c29a26d12fb14 0497de4959b2494e8036eb39226430d6 - - default default] Security group rule updated ['2da1104f-77c5-475e-b21f-e52710edc8b5']#033[00m Nov 23 04:58:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:58:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:58:22 localhost podman[311083]: 2025-11-23 09:58:22.041929333 +0000 UTC m=+0.094243009 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 04:58:22 localhost podman[311083]: 2025-11-23 09:58:22.081251533 +0000 UTC m=+0.133565239 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:58:22 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:58:22 localhost podman[311082]: 2025-11-23 09:58:22.136557222 +0000 UTC m=+0.191367924 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd) Nov 23 04:58:22 localhost podman[311082]: 2025-11-23 09:58:22.178476722 +0000 UTC m=+0.233287454 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:58:22 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 04:58:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:58:23 localhost nova_compute[281952]: 2025-11-23 09:58:23.794 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:25 localhost nova_compute[281952]: 2025-11-23 09:58:25.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:58:25 localhost nova_compute[281952]: 2025-11-23 09:58:25.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 23 04:58:25 localhost neutron_sriov_agent[256124]: 2025-11-23 09:58:25.331 2 INFO neutron.agent.securitygroups_rpc [None req-ec9f2257-2897-484b-a0ca-c8a73a80ef4d 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Security group member updated ['a3350144-9b09-432b-a32e-ef84bb8bf494']#033[00m Nov 23 04:58:26 localhost nova_compute[281952]: 2025-11-23 09:58:26.019 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:26 localhost neutron_sriov_agent[256124]: 2025-11-23 09:58:26.565 2 INFO neutron.agent.securitygroups_rpc [req-dc62ce45-8668-47e6-9d5e-2f0b1764537e req-34d4dcd5-73f6-46e0-ba5e-aabbd18e768e 492e2909a77a4032ab6c29a26d12fb14 0497de4959b2494e8036eb39226430d6 - - default default] Security group member updated ['2da1104f-77c5-475e-b21f-e52710edc8b5']#033[00m Nov 23 04:58:26 localhost 
neutron_dhcp_agent[263254]: 2025-11-23 09:58:26.636 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:58:26Z, description=, device_id=1148b5a9-4da9-491f-8952-80c4a965fe6b, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a1846659-6b91-4156-9939-085b30454143, ip_allocation=immediate, mac_address=fa:16:3e:da:90:40, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:58:11Z, description=, dns_domain=, id=c5d88dfa-0db8-489e-a45a-e843e31a3b26, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1246892017-network, port_security_enabled=True, project_id=0497de4959b2494e8036eb39226430d6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8214, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=532, status=ACTIVE, subnets=['c6867cf5-ed63-4cec-9755-38eaead8ab16'], tags=[], tenant_id=0497de4959b2494e8036eb39226430d6, updated_at=2025-11-23T09:58:11Z, vlan_transparent=None, network_id=c5d88dfa-0db8-489e-a45a-e843e31a3b26, port_security_enabled=True, project_id=0497de4959b2494e8036eb39226430d6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['2da1104f-77c5-475e-b21f-e52710edc8b5'], standard_attr_id=648, status=DOWN, tags=[], tenant_id=0497de4959b2494e8036eb39226430d6, updated_at=2025-11-23T09:58:26Z on network c5d88dfa-0db8-489e-a45a-e843e31a3b26#033[00m Nov 23 04:58:26 localhost dnsmasq[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/addn_hosts - 2 addresses Nov 23 04:58:26 localhost dnsmasq-dhcp[310864]: read 
/var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/host Nov 23 04:58:26 localhost podman[311139]: 2025-11-23 09:58:26.849632959 +0000 UTC m=+0.060220469 container kill 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 04:58:26 localhost dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/opts Nov 23 04:58:27 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:27.143 263258 INFO neutron.agent.dhcp.agent [None req-ac2ce650-6b75-4265-9722-02c2ce37a04c - - - - - -] DHCP configuration for ports {'a1846659-6b91-4156-9939-085b30454143'} is completed#033[00m Nov 23 04:58:27 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:27.625 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005532584.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:58:26Z, description=, device_id=1148b5a9-4da9-491f-8952-80c4a965fe6b, device_owner=compute:nova, dns_assignment=[], dns_domain=, dns_name=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com, extra_dhcp_opts=[], fixed_ips=[], id=a1846659-6b91-4156-9939-085b30454143, ip_allocation=immediate, mac_address=fa:16:3e:da:90:40, name=, network_id=c5d88dfa-0db8-489e-a45a-e843e31a3b26, port_security_enabled=True, project_id=0497de4959b2494e8036eb39226430d6, qos_network_policy_id=None, qos_policy_id=None, 
resource_request=None, revision_number=2, security_groups=['2da1104f-77c5-475e-b21f-e52710edc8b5'], standard_attr_id=648, status=DOWN, tags=[], tenant_id=0497de4959b2494e8036eb39226430d6, updated_at=2025-11-23T09:58:27Z on network c5d88dfa-0db8-489e-a45a-e843e31a3b26#033[00m Nov 23 04:58:27 localhost podman[311177]: 2025-11-23 09:58:27.761940332 +0000 UTC m=+0.029073140 container kill 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS) Nov 23 04:58:27 localhost dnsmasq[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/addn_hosts - 2 addresses Nov 23 04:58:27 localhost dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/host Nov 23 04:58:27 localhost dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/opts Nov 23 04:58:27 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:27.973 263258 INFO neutron.agent.dhcp.agent [None req-68f7a82a-282a-4552-839c-b6eeea7989c5 - - - - - -] DHCP configuration for ports {'a1846659-6b91-4156-9939-085b30454143'} is completed#033[00m Nov 23 04:58:28 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:28.188 263258 INFO neutron.agent.linux.ip_lib [None req-69d1f747-8ab7-41e5-a631-855da9356272 - - - - - -] Device tapca98d0dd-23 cannot be used as it has no MAC address#033[00m Nov 23 04:58:28 localhost nova_compute[281952]: 2025-11-23 09:58:28.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:28 localhost kernel: device tapca98d0dd-23 entered promiscuous mode Nov 23 04:58:28 localhost NetworkManager[5975]: [1763891908.2182] manager: (tapca98d0dd-23): new Generic device (/org/freedesktop/NetworkManager/Devices/23) Nov 23 04:58:28 localhost nova_compute[281952]: 2025-11-23 09:58:28.218 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:28 localhost ovn_controller[154788]: 2025-11-23T09:58:28Z|00100|binding|INFO|Claiming lport ca98d0dd-231a-46c7-80b8-a48c00a5696e for this chassis. Nov 23 04:58:28 localhost ovn_controller[154788]: 2025-11-23T09:58:28Z|00101|binding|INFO|ca98d0dd-231a-46c7-80b8-a48c00a5696e: Claiming unknown Nov 23 04:58:28 localhost systemd-udevd[311209]: Network interface NamePolicy= disabled on kernel command line. Nov 23 04:58:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:58:28 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:28.235 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-903951dd-448c-4453-aa24-f24a53269074', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-903951dd-448c-4453-aa24-f24a53269074', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 
'253c88568a634476a6c1284eed6a9464', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=154cf14d-e57b-4715-bce9-5bdd1a0ded15, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ca98d0dd-231a-46c7-80b8-a48c00a5696e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:58:28 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:28.238 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ca98d0dd-231a-46c7-80b8-a48c00a5696e in datapath 903951dd-448c-4453-aa24-f24a53269074 bound to our chassis#033[00m Nov 23 04:58:28 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:28.240 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 903951dd-448c-4453-aa24-f24a53269074 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 04:58:28 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:28.241 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cbdec47d-0554-49bc-a003-b46997fef3d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:28 localhost journal[230249]: ethtool ioctl error on tapca98d0dd-23: No such device Nov 23 04:58:28 localhost journal[230249]: ethtool ioctl error on tapca98d0dd-23: No such device Nov 23 04:58:28 localhost ovn_controller[154788]: 2025-11-23T09:58:28Z|00102|binding|INFO|Setting lport ca98d0dd-231a-46c7-80b8-a48c00a5696e ovn-installed in OVS Nov 23 04:58:28 localhost ovn_controller[154788]: 2025-11-23T09:58:28Z|00103|binding|INFO|Setting lport ca98d0dd-231a-46c7-80b8-a48c00a5696e up in Southbound Nov 23 04:58:28 
localhost nova_compute[281952]: 2025-11-23 09:58:28.292 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:28 localhost journal[230249]: ethtool ioctl error on tapca98d0dd-23: No such device Nov 23 04:58:28 localhost journal[230249]: ethtool ioctl error on tapca98d0dd-23: No such device Nov 23 04:58:28 localhost journal[230249]: ethtool ioctl error on tapca98d0dd-23: No such device Nov 23 04:58:28 localhost journal[230249]: ethtool ioctl error on tapca98d0dd-23: No such device Nov 23 04:58:28 localhost journal[230249]: ethtool ioctl error on tapca98d0dd-23: No such device Nov 23 04:58:28 localhost journal[230249]: ethtool ioctl error on tapca98d0dd-23: No such device Nov 23 04:58:28 localhost nova_compute[281952]: 2025-11-23 09:58:28.329 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:28 localhost nova_compute[281952]: 2025-11-23 09:58:28.356 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:29 localhost podman[311280]: Nov 23 04:58:29 localhost podman[311280]: 2025-11-23 09:58:29.246077082 +0000 UTC m=+0.088000969 container create d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:58:29 localhost systemd[1]: Started 
libpod-conmon-d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814.scope. Nov 23 04:58:29 localhost systemd[1]: Started libcrun container. Nov 23 04:58:29 localhost podman[311280]: 2025-11-23 09:58:29.204137101 +0000 UTC m=+0.046061028 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 04:58:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50b27a1dbab67616f24ed7868948ddd418eee50493d781bdc989237aff316f07/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:58:29 localhost podman[311280]: 2025-11-23 09:58:29.318087109 +0000 UTC m=+0.160010996 container init d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:58:29 localhost podman[311280]: 2025-11-23 09:58:29.328171568 +0000 UTC m=+0.170095465 container start d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 04:58:29 localhost dnsmasq[311299]: started, version 2.85 cachesize 150 Nov 23 04:58:29 
localhost dnsmasq[311299]: DNS service limited to local subnets Nov 23 04:58:29 localhost dnsmasq[311299]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 04:58:29 localhost dnsmasq[311299]: warning: no upstream servers configured Nov 23 04:58:29 localhost dnsmasq-dhcp[311299]: DHCP, static leases only on 19.80.0.0, lease time 1d Nov 23 04:58:29 localhost dnsmasq[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/addn_hosts - 0 addresses Nov 23 04:58:29 localhost dnsmasq-dhcp[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/host Nov 23 04:58:29 localhost dnsmasq-dhcp[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/opts Nov 23 04:58:29 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:29.470 263258 INFO neutron.agent.dhcp.agent [None req-27a4d57d-c3ab-4387-a248-7e72d81e4b29 - - - - - -] DHCP configuration for ports {'b83bb60d-d579-4f8d-9e2c-3885d238bb26'} is completed#033[00m Nov 23 04:58:29 localhost nova_compute[281952]: 2025-11-23 09:58:29.624 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:58:29 localhost nova_compute[281952]: 2025-11-23 09:58:29.625 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:58:29 localhost nova_compute[281952]: 2025-11-23 09:58:29.645 281956 DEBUG nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m Nov 23 04:58:29 localhost nova_compute[281952]: 2025-11-23 09:58:29.738 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:58:29 localhost nova_compute[281952]: 2025-11-23 09:58:29.739 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:58:29 localhost nova_compute[281952]: 2025-11-23 09:58:29.745 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Require both a host and instance NUMA topology to fit instance on host. 
numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m Nov 23 04:58:29 localhost nova_compute[281952]: 2025-11-23 09:58:29.745 281956 INFO nova.compute.claims [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Claim successful on node np0005532585.localdomain#033[00m Nov 23 04:58:29 localhost neutron_sriov_agent[256124]: 2025-11-23 09:58:29.796 2 INFO neutron.agent.securitygroups_rpc [None req-2d14bee2-d335-42d8-9b8c-ccacbe55654b 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Security group member updated ['a3350144-9b09-432b-a32e-ef84bb8bf494']#033[00m Nov 23 04:58:29 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:29.827 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:58:29Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=fd30dda9-c731-47dd-b319-ebcca717b708, ip_allocation=immediate, mac_address=fa:16:3e:4f:95:ad, name=tempest-subport-1587702031, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:58:25Z, description=, dns_domain=, id=903951dd-448c-4453-aa24-f24a53269074, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-497326985, port_security_enabled=True, project_id=253c88568a634476a6c1284eed6a9464, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35137, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=646, status=ACTIVE, subnets=['2620a714-202e-4ac3-ab2a-226dc050a669'], tags=[], 
tenant_id=253c88568a634476a6c1284eed6a9464, updated_at=2025-11-23T09:58:27Z, vlan_transparent=None, network_id=903951dd-448c-4453-aa24-f24a53269074, port_security_enabled=True, project_id=253c88568a634476a6c1284eed6a9464, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a3350144-9b09-432b-a32e-ef84bb8bf494'], standard_attr_id=652, status=DOWN, tags=[], tenant_id=253c88568a634476a6c1284eed6a9464, updated_at=2025-11-23T09:58:29Z on network 903951dd-448c-4453-aa24-f24a53269074#033[00m Nov 23 04:58:29 localhost nova_compute[281952]: 2025-11-23 09:58:29.882 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:58:29 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0. 
Nov 23 04:58:29 localhost openstack_network_exporter[242668]: ERROR 09:58:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:58:29 localhost openstack_network_exporter[242668]: ERROR 09:58:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:58:29 localhost openstack_network_exporter[242668]: ERROR 09:58:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:58:29 localhost openstack_network_exporter[242668]: ERROR 09:58:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:58:29 localhost openstack_network_exporter[242668]: Nov 23 04:58:29 localhost openstack_network_exporter[242668]: ERROR 09:58:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:58:29 localhost openstack_network_exporter[242668]: Nov 23 04:58:30 localhost dnsmasq[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/addn_hosts - 1 addresses Nov 23 04:58:30 localhost dnsmasq-dhcp[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/host Nov 23 04:58:30 localhost dnsmasq-dhcp[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/opts Nov 23 04:58:30 localhost podman[311320]: 2025-11-23 09:58:30.097978769 +0000 UTC m=+0.057924419 container kill d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) 
Nov 23 04:58:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:30.331 263258 INFO neutron.agent.dhcp.agent [None req-62ce58fd-b234-459e-b227-fbd75f57ba4f - - - - - -] DHCP configuration for ports {'fd30dda9-c731-47dd-b319-ebcca717b708'} is completed#033[00m Nov 23 04:58:30 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:58:30 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3630223276' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.400 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.407 281956 DEBUG nova.compute.provider_tree [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.423 281956 DEBUG nova.scheduler.client.report [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.458 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.719s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.459 281956 DEBUG nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.520 281956 DEBUG nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.535 281956 INFO nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.552 281956 DEBUG nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.629 281956 DEBUG nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.632 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.632 281956 INFO nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Creating image(s)#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.673 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk does not exist __init__ 
/usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.717 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.757 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.761 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "ba971d9ef74673015953b46ad4dbea47e54dd66a" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.762 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "ba971d9ef74673015953b46ad4dbea47e54dd66a" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:58:30 localhost nova_compute[281952]: 2025-11-23 09:58:30.800 281956 DEBUG nova.virt.libvirt.imagebackend [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - 
default default] Image locations are: [{'url': 'rbd://46550e70-79cb-5f55-bf6d-1204b97e083b/images/c5806483-57a8-4254-b41b-254b888c8606/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://46550e70-79cb-5f55-bf6d-1204b97e083b/images/c5806483-57a8-4254-b41b-254b888c8606/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m Nov 23 04:58:30 localhost sshd[311417]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:58:31 localhost nova_compute[281952]: 2025-11-23 09:58:31.056 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 04:58:31 localhost podman[311419]: 2025-11-23 09:58:31.353001994 +0000 UTC m=+0.091881406 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 04:58:31 localhost podman[311419]: 2025-11-23 09:58:31.372359926 +0000 UTC m=+0.111239318 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 04:58:31 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 04:58:31 localhost nova_compute[281952]: 2025-11-23 09:58:31.777 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:58:31 localhost nova_compute[281952]: 2025-11-23 09:58:31.851 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.part --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:58:31 localhost nova_compute[281952]: 2025-11-23 09:58:31.852 281956 DEBUG nova.virt.images [None 
req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] c5806483-57a8-4254-b41b-254b888c8606 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m Nov 23 04:58:31 localhost nova_compute[281952]: 2025-11-23 09:58:31.854 281956 DEBUG nova.privsep.utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m Nov 23 04:58:31 localhost nova_compute[281952]: 2025-11-23 09:58:31.855 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.part /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:58:32 localhost nova_compute[281952]: 2025-11-23 09:58:32.097 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.part /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.converted" returned: 0 in 0.242s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:58:32 localhost nova_compute[281952]: 2025-11-23 09:58:32.102 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - 
default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:58:32 localhost nova_compute[281952]: 2025-11-23 09:58:32.176 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a.converted --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:58:32 localhost nova_compute[281952]: 2025-11-23 09:58:32.177 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "ba971d9ef74673015953b46ad4dbea47e54dd66a" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.415s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:58:32 localhost nova_compute[281952]: 2025-11-23 09:58:32.209 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 23 04:58:32 localhost nova_compute[281952]: 2025-11-23 09:58:32.215 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 
37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:58:32 localhost nova_compute[281952]: 2025-11-23 09:58:32.829 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/ba971d9ef74673015953b46ad4dbea47e54dd66a 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.614s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:58:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 04:58:32 localhost nova_compute[281952]: 2025-11-23 09:58:32.942 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] resizing rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m Nov 23 04:58:33 localhost podman[311526]: 2025-11-23 09:58:33.025781213 +0000 UTC m=+0.084677316 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 04:58:33 localhost podman[311526]: 2025-11-23 09:58:33.068427915 +0000 UTC m=+0.127324018 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 
'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:58:33 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.122 281956 DEBUG nova.objects.instance [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lazy-loading 'migration_context' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.138 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.139 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Ensure instance console log exists: /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m Nov 23 04:58:33 
localhost nova_compute[281952]: 2025-11-23 09:58:33.139 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.140 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.140 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.143 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T09:56:45Z,direct_url=,disk_format='qcow2',id=c5806483-57a8-4254-b41b-254b888c8606,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1915d3e5d4254231a0517e2dcf35848f',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2025-11-23T09:56:47Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'image_id': 'c5806483-57a8-4254-b41b-254b888c8606'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.148 281956 WARNING nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.151 281956 DEBUG nova.virt.libvirt.host [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Searching host: 'np0005532585.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.152 281956 DEBUG nova.virt.libvirt.host [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CPU controller missing on host. 
_has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.153 281956 DEBUG nova.virt.libvirt.host [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Searching host: 'np0005532585.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.154 281956 DEBUG nova.virt.libvirt.host [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.154 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.155 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-23T09:56:44Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43b374b4-75d9-47f9-aa6b-ddb1a45f7c04',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T09:56:45Z,direct_url=,disk_format='qcow2',id=c5806483-57a8-4254-b41b-254b888c8606,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1915d3e5d4254231a0517e2dcf35848f',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2025-11-23T09:56:47Z,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.156 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.156 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.156 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.157 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 
09:58:33.157 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.158 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.158 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.159 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.159 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Nov 23 04:58:33 localhost 
nova_compute[281952]: 2025-11-23 09:58:33.160 281956 DEBUG nova.virt.hardware [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.164 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:58:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0. 
Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.463841) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34 Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891913463879, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 2462, "num_deletes": 251, "total_data_size": 3704725, "memory_usage": 3757000, "flush_reason": "Manual Compaction"} Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891913476317, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2394639, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18758, "largest_seqno": 21215, "table_properties": {"data_size": 2385727, "index_size": 5545, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 19315, "raw_average_key_size": 21, "raw_value_size": 2367399, "raw_average_value_size": 2576, "num_data_blocks": 239, "num_entries": 919, "num_filter_entries": 919, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891729, "oldest_key_time": 1763891729, "file_creation_time": 1763891913, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 12535 microseconds, and 6484 cpu microseconds. Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.476370) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2394639 bytes OK Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.476394) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.478756) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.478775) EVENT_LOG_v1 {"time_micros": 1763891913478769, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.478798) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3693768, prev total WAL file 
size 3693768, number of live WAL files 2. Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.479755) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. '7061786F73003132303439' seq:0, type:0; will stop at (end) Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2338KB)], [33(15MB)] Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891913479799, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 18833343, "oldest_snapshot_seqno": -1} Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 12164 keys, 16882661 bytes, temperature: kUnknown Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891913556083, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 16882661, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16814736, "index_size": 36400, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30469, "raw_key_size": 328801, "raw_average_key_size": 27, "raw_value_size": 
16608602, "raw_average_value_size": 1365, "num_data_blocks": 1365, "num_entries": 12164, "num_filter_entries": 12164, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763891913, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.556376) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 16882661 bytes Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.558154) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 246.6 rd, 221.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 15.7 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(14.9) write-amplify(7.1) OK, records in: 12696, records dropped: 532 output_compression: NoCompression Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.558190) EVENT_LOG_v1 {"time_micros": 1763891913558174, "job": 18, "event": "compaction_finished", "compaction_time_micros": 76378, "compaction_time_cpu_micros": 44428, "output_level": 6, "num_output_files": 1, "total_output_size": 16882661, "num_input_records": 12696, "num_output_records": 12164, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891913558693, "job": 18, "event": "table_file_deletion", "file_number": 35} Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763891913560928, 
"job": 18, "event": "table_file_deletion", "file_number": 33} Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.479709) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.560961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.560966) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.560969) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.560972) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:58:33 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-09:58:33.560976) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 04:58:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 23 04:58:33 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/3874030768' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.637 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.677 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 23 04:58:33 localhost nova_compute[281952]: 2025-11-23 09:58:33.682 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:58:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 23 04:58:34 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/3441935928' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 23 04:58:34 localhost nova_compute[281952]: 2025-11-23 09:58:34.159 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:58:34 localhost nova_compute[281952]: 2025-11-23 09:58:34.160 281956 DEBUG nova.objects.instance [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:58:34 localhost nova_compute[281952]: 2025-11-23 09:58:34.169 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] End _get_guest_xml xml=
[guest domain XML dump elided: markup stripped during log capture; surviving fragments for instance 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7: name instance-00000009, memory 131072, vcpus 1, server tempest-UnshelveToHostMultiNodesTest-server-2005076685, user tempest-UnshelveToHostMultiNodesTest-612486733-project-member, project tempest-UnshelveToHostMultiNodesTest-612486733, sysinfo RDO / OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9, os type hvm, rng backend /dev/urandom]
Nov 23 04:58:34 localhost nova_compute[281952]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Nov 23 04:58:34 localhost nova_compute[281952]: 2025-11-23 09:58:34.205 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] No BDM found with device name vda, not building metadata. 
_build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Nov 23 04:58:34 localhost nova_compute[281952]: 2025-11-23 09:58:34.206 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Nov 23 04:58:34 localhost nova_compute[281952]: 2025-11-23 09:58:34.206 281956 INFO nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Using config drive#033[00m Nov 23 04:58:34 localhost nova_compute[281952]: 2025-11-23 09:58:34.236 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 23 04:58:34 localhost nova_compute[281952]: 2025-11-23 09:58:34.279 281956 INFO nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Creating config drive at /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config#033[00m Nov 23 04:58:34 localhost nova_compute[281952]: 2025-11-23 09:58:34.282 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config -ldots -allow-lowercase 
-allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp101umq3_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:58:34 localhost nova_compute[281952]: 2025-11-23 09:58:34.412 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp101umq3_" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:58:34 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:58:34 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:58:34 localhost nova_compute[281952]: 2025-11-23 09:58:34.458 281956 DEBUG nova.storage.rbd_utils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 23 04:58:34 localhost nova_compute[281952]: 2025-11-23 09:58:34.467 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf 
execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:58:34 localhost nova_compute[281952]: 2025-11-23 09:58:34.671 281956 DEBUG oslo_concurrency.processutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.204s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:58:34 localhost nova_compute[281952]: 2025-11-23 09:58:34.673 281956 INFO nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Deleting local config drive /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config because it was imported into RBD.#033[00m Nov 23 04:58:34 localhost systemd[1]: Started libvirt secret daemon. Nov 23 04:58:34 localhost systemd-machined[84275]: New machine qemu-3-instance-00000009. Nov 23 04:58:34 localhost systemd[1]: Started Virtual Machine qemu-3-instance-00000009. 
Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.255 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.255 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] VM Resumed (Lifecycle Event)#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.257 281956 DEBUG nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.258 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.261 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance spawned successfully.#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.261 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 
'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.275 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.282 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.287 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.288 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.288 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 
37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.289 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.290 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.291 281956 DEBUG nova.virt.libvirt.driver [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.365 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] During sync_power_state the instance has a pending task (spawning). 
Skip.#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.366 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.366 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] VM Started (Lifecycle Event)#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.394 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.398 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.402 281956 INFO nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Took 4.77 seconds to spawn the instance on the hypervisor.#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.403 281956 DEBUG nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.425 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.461 281956 INFO nova.compute.manager [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Took 5.75 seconds to build instance.#033[00m Nov 23 04:58:35 localhost nova_compute[281952]: 2025-11-23 09:58:35.477 281956 DEBUG oslo_concurrency.lockutils [None req-dce6d5bf-ba1e-4303-852e-6dd31ea356bb 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.852s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:58:36 localhost nova_compute[281952]: 2025-11-23 09:58:36.090 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:58:36 localhost nova_compute[281952]: 2025-11-23 09:58:36.092 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:58:36 localhost nova_compute[281952]: 2025-11-23 09:58:36.092 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:58:36 localhost nova_compute[281952]: 2025-11-23 09:58:36.092 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:58:36 localhost nova_compute[281952]: 2025-11-23 09:58:36.095 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:36 localhost nova_compute[281952]: 2025-11-23 09:58:36.095 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:58:36 localhost nova_compute[281952]: 2025-11-23 09:58:36.099 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:37 localhost nova_compute[281952]: 2025-11-23 09:58:37.357 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:58:37 localhost nova_compute[281952]: 2025-11-23 09:58:37.358 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:58:37 localhost nova_compute[281952]: 2025-11-23 09:58:37.358 281956 INFO nova.compute.manager [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Shelving#033[00m Nov 23 04:58:37 
localhost nova_compute[281952]: 2025-11-23 09:58:37.386 281956 DEBUG nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m Nov 23 04:58:38 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:58:38 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:58:41 localhost nova_compute[281952]: 2025-11-23 09:58:41.127 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:58:41 localhost nova_compute[281952]: 2025-11-23 09:58:41.129 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:58:41 localhost nova_compute[281952]: 2025-11-23 09:58:41.129 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5028 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 04:58:41 localhost nova_compute[281952]: 2025-11-23 09:58:41.129 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:58:41 localhost nova_compute[281952]: 2025-11-23 09:58:41.131 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:41 localhost nova_compute[281952]: 2025-11-23 09:58:41.132 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 04:58:41 localhost nova_compute[281952]: 2025-11-23 09:58:41.135 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:41 localhost neutron_sriov_agent[256124]: 2025-11-23 09:58:41.290 2 INFO neutron.agent.securitygroups_rpc [None req-b6d2f56d-2805-44c2-9e36-7ffa8fc09e14 9a28cb0574d148bf982a2a1a0b495020 a2148c18d8f24a6db12dc22c787e8b2e - - default default] Security group member updated ['ff44a28d-1e1f-4163-b206-fdf77022bf0b']#033[00m Nov 23 04:58:41 localhost dnsmasq[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/addn_hosts - 0 addresses Nov 23 04:58:41 localhost dnsmasq-dhcp[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/host Nov 23 04:58:41 localhost podman[311885]: 2025-11-23 09:58:41.554658603 +0000 UTC m=+0.044953744 container kill ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:58:41 localhost dnsmasq-dhcp[310446]: read /var/lib/neutron/dhcp/8cd987c4-7e4e-467f-9ee2-d70cb75b87c3/opts Nov 23 04:58:41 localhost podman[240668]: time="2025-11-23T09:58:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:58:41 localhost podman[240668]: @ - - [23/Nov/2025:09:58:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159330 "" 
"Go-http-client/1.1" Nov 23 04:58:41 localhost podman[240668]: @ - - [23/Nov/2025:09:58:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20180 "" "Go-http-client/1.1" Nov 23 04:58:41 localhost systemd[1]: tmp-crun.K7OmSL.mount: Deactivated successfully. Nov 23 04:58:41 localhost dnsmasq[310446]: exiting on receipt of SIGTERM Nov 23 04:58:41 localhost podman[311925]: 2025-11-23 09:58:41.994225903 +0000 UTC m=+0.081364575 container kill ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 04:58:41 localhost systemd[1]: libpod-ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8.scope: Deactivated successfully. 
Nov 23 04:58:42 localhost podman[311939]: 2025-11-23 09:58:42.049986725 +0000 UTC m=+0.041908641 container died ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 04:58:42 localhost podman[311939]: 2025-11-23 09:58:42.080420584 +0000 UTC m=+0.072342420 container cleanup ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:58:42 localhost systemd[1]: libpod-conmon-ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8.scope: Deactivated successfully. 
Nov 23 04:58:42 localhost podman[311940]: 2025-11-23 09:58:42.121692634 +0000 UTC m=+0.100900431 container remove ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 04:58:42 localhost kernel: device tap8c439e83-e9 left promiscuous mode Nov 23 04:58:42 localhost ovn_controller[154788]: 2025-11-23T09:58:42Z|00104|binding|INFO|Releasing lport 8c439e83-e972-4e99-8d01-ff5269427a3c from this chassis (sb_readonly=0) Nov 23 04:58:42 localhost ovn_controller[154788]: 2025-11-23T09:58:42Z|00105|binding|INFO|Setting lport 8c439e83-e972-4e99-8d01-ff5269427a3c down in Southbound Nov 23 04:58:42 localhost nova_compute[281952]: 2025-11-23 09:58:42.133 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:42 localhost nova_compute[281952]: 2025-11-23 09:58:42.153 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:42 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:42.156 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cd987c4-7e4e-467f-9ee2-d70cb75b87c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2148c18d8f24a6db12dc22c787e8b2e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0e3b2035-d1e3-4dc9-824d-c8c5d8c83090, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8c439e83-e972-4e99-8d01-ff5269427a3c) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:58:42 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:42.157 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8c439e83-e972-4e99-8d01-ff5269427a3c in datapath 8cd987c4-7e4e-467f-9ee2-d70cb75b87c3 unbound from our chassis#033[00m Nov 23 04:58:42 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:42.160 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8cd987c4-7e4e-467f-9ee2-d70cb75b87c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 04:58:42 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:42.161 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ba603f-c97d-4b7d-8d6d-57a09f6e73a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:42.184 263258 INFO neutron.agent.dhcp.agent [None 
req-4b70179b-86c0-4734-a9cb-8e744eb6604f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:58:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:58:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:58:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:58:42 localhost podman[311969]: 2025-11-23 09:58:42.524097869 +0000 UTC m=+0.079064355 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Nov 23 04:58:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:58:42.530 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:58:42 localhost podman[311967]: 2025-11-23 09:58:42.536292512 +0000 UTC m=+0.089483183 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:58:42 localhost podman[311969]: 2025-11-23 09:58:42.539144509 +0000 UTC m=+0.094110995 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vcs-type=git, distribution-scope=public, io.openshift.expose-services=) Nov 23 04:58:42 localhost systemd[1]: var-lib-containers-storage-overlay-067ea74ee71831860e0346c8368bf66a295b3b34ee14dc8629622033be014e0a-merged.mount: Deactivated successfully. Nov 23 04:58:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff9d94e23c42fc50e7f88cb64a3ceee0167bd639273cf5fdd2c459c962efacc8-userdata-shm.mount: Deactivated successfully. Nov 23 04:58:42 localhost systemd[1]: run-netns-qdhcp\x2d8cd987c4\x2d7e4e\x2d467f\x2d9ee2\x2dd70cb75b87c3.mount: Deactivated successfully. Nov 23 04:58:42 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:58:42 localhost podman[311967]: 2025-11-23 09:58:42.607277269 +0000 UTC m=+0.160467950 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 04:58:42 localhost systemd[1]: tmp-crun.JvAJB8.mount: Deactivated successfully. Nov 23 04:58:42 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 04:58:42 localhost podman[311968]: 2025-11-23 09:58:42.641126022 +0000 UTC m=+0.195291113 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 04:58:42 localhost podman[311968]: 2025-11-23 09:58:42.649306512 +0000 UTC m=+0.203471553 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 23 04:58:42 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:58:42 localhost ovn_controller[154788]: 2025-11-23T09:58:42Z|00106|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:58:42 localhost nova_compute[281952]: 2025-11-23 09:58:42.969 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:42 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:42.972 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:58:42 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:42.973 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 04:58:42 localhost nova_compute[281952]: 2025-11-23 09:58:42.975 281956 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:58:43 localhost neutron_sriov_agent[256124]: 2025-11-23 09:58:43.305 2 INFO neutron.agent.securitygroups_rpc [None req-50406108-6fe1-4d79-842a-8b928e46e646 9a28cb0574d148bf982a2a1a0b495020 a2148c18d8f24a6db12dc22c787e8b2e - - default default] Security group member updated ['ff44a28d-1e1f-4163-b206-fdf77022bf0b']#033[00m Nov 23 04:58:43 localhost nova_compute[281952]: 2025-11-23 09:58:43.783 281956 DEBUG nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Creating tmpfile /var/lib/nova/instances/tmp2mwvq3bv to notify to other compute nodes that they should mount the same storage. 
_create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m Nov 23 04:58:43 localhost nova_compute[281952]: 2025-11-23 09:58:43.812 281956 DEBUG nova.compute.manager [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] destination check data is LibvirtLiveMigrateData(bdms=,block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmp2mwvq3bv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=,is_shared_block_storage=,is_shared_instance_path=,is_volume_backed=,migration=,old_vol_attachment_ids=,serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m Nov 23 04:58:43 localhost nova_compute[281952]: 2025-11-23 09:58:43.839 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:58:43 localhost nova_compute[281952]: 2025-11-23 09:58:43.840 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:58:43 localhost nova_compute[281952]: 2025-11-23 09:58:43.849 281956 INFO nova.compute.rpcapi [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 
ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m Nov 23 04:58:43 localhost nova_compute[281952]: 2025-11-23 09:58:43.850 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:58:44 localhost nova_compute[281952]: 2025-11-23 09:58:44.485 281956 DEBUG nova.compute.manager [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=,block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmp2mwvq3bv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='8f62292f-5719-4b19-9188-3715b94493a7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids=,serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m Nov 23 04:58:44 localhost nova_compute[281952]: 2025-11-23 09:58:44.504 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Acquiring lock "refresh_cache-8f62292f-5719-4b19-9188-3715b94493a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:58:44 
localhost nova_compute[281952]: 2025-11-23 09:58:44.504 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Acquired lock "refresh_cache-8f62292f-5719-4b19-9188-3715b94493a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:58:44 localhost nova_compute[281952]: 2025-11-23 09:58:44.505 281956 DEBUG nova.network.neutron [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.366 281956 DEBUG nova.network.neutron [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Updating instance_info_cache with network_info: [{"id": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "address": "fa:16:3e:da:21:74", "network": {"id": "d679e465-8656-4403-afa0-724657d33ec4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-49202206-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "253c88568a634476a6c1284eed6a9464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap737e82a6-26", 
"ovs_interfaceid": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.384 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Releasing lock "refresh_cache-8f62292f-5719-4b19-9188-3715b94493a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.388 281956 DEBUG nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=,block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmp2mwvq3bv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='8f62292f-5719-4b19-9188-3715b94493a7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.389 281956 DEBUG nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 
ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Creating instance directory: /var/lib/nova/instances/8f62292f-5719-4b19-9188-3715b94493a7 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.390 281956 DEBUG nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Ensure instance console log exists: /var/lib/nova/instances/8f62292f-5719-4b19-9188-3715b94493a7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.390 281956 DEBUG nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Plugging VIFs using destination host port bindings before live migration. 
_pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.393 281956 DEBUG nova.virt.libvirt.vif [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-23T09:58:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1576780525',display_name='tempest-LiveMigrationTest-server-1576780525',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005532584.localdomain',hostname='tempest-livemigrationtest-server-1576780525',id=10,image_ref='c5806483-57a8-4254-b41b-254b888c8606',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-23T09:58:41Z,launched_on='np0005532584.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0005532584.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='253c88568a634476a6c1284eed6a9464',ramdisk_id='',reservation_id='r-dvg5v145',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c5806483-57a8-4254-b41b-254b888c8606',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virti
o',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1889456510',owner_user_name='tempest-LiveMigrationTest-1889456510-project-member'},tags=,task_state='migrating',terminated_at=None,trusted_certs=,updated_at=2025-11-23T09:58:41Z,user_data=None,user_id='7f7875c0084c46fdb2e7b37e4fc44faf',uuid=8f62292f-5719-4b19-9188-3715b94493a7,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "address": "fa:16:3e:da:21:74", "network": {"id": "d679e465-8656-4403-afa0-724657d33ec4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-49202206-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "253c88568a634476a6c1284eed6a9464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap737e82a6-26", "ovs_interfaceid": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.395 281956 DEBUG nova.network.os_vif_util [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Converting VIF {"id": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "address": "fa:16:3e:da:21:74", "network": {"id": 
"d679e465-8656-4403-afa0-724657d33ec4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-49202206-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "253c88568a634476a6c1284eed6a9464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap737e82a6-26", "ovs_interfaceid": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.397 281956 DEBUG nova.network.os_vif_util [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:21:74,bridge_name='br-int',has_traffic_filtering=True,id=737e82a6-2634-47df-b8a7-ec21a927cc3f,network=Network(d679e465-8656-4403-afa0-724657d33ec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap737e82a6-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.398 281956 DEBUG os_vif [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:da:21:74,bridge_name='br-int',has_traffic_filtering=True,id=737e82a6-2634-47df-b8a7-ec21a927cc3f,network=Network(d679e465-8656-4403-afa0-724657d33ec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap737e82a6-26') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.399 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.400 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.401 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.407 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.408 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap737e82a6-26, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.409 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap737e82a6-26, col_values=(('external_ids', {'iface-id': 
'737e82a6-2634-47df-b8a7-ec21a927cc3f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:21:74', 'vm-uuid': '8f62292f-5719-4b19-9188-3715b94493a7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.411 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.418 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.424 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.425 281956 INFO os_vif [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:21:74,bridge_name='br-int',has_traffic_filtering=True,id=737e82a6-2634-47df-b8a7-ec21a927cc3f,network=Network(d679e465-8656-4403-afa0-724657d33ec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap737e82a6-26')#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.427 281956 DEBUG nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. 
pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m Nov 23 04:58:45 localhost nova_compute[281952]: 2025-11-23 09:58:45.428 281956 DEBUG nova.compute.manager [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmp2mwvq3bv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='8f62292f-5719-4b19-9188-3715b94493a7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m Nov 23 04:58:46 localhost nova_compute[281952]: 2025-11-23 09:58:46.180 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:47 localhost nova_compute[281952]: 2025-11-23 09:58:47.451 281956 DEBUG nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m Nov 23 04:58:47 localhost nova_compute[281952]: 2025-11-23 09:58:47.668 281956 DEBUG nova.network.neutron [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 
57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Port 737e82a6-2634-47df-b8a7-ec21a927cc3f updated with migration profile {'migrating_to': 'np0005532585.localdomain'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m Nov 23 04:58:47 localhost nova_compute[281952]: 2025-11-23 09:58:47.671 281956 DEBUG nova.compute.manager [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmp2mwvq3bv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='8f62292f-5719-4b19-9188-3715b94493a7',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m Nov 23 04:58:47 localhost ovn_controller[154788]: 2025-11-23T09:58:47Z|00107|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:58:47 localhost sshd[312028]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:58:47 localhost nova_compute[281952]: 2025-11-23 09:58:47.845 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:47 localhost systemd-logind[761]: New session 72 of user nova. 
Nov 23 04:58:47 localhost systemd[1]: Created slice User Slice of UID 42436. Nov 23 04:58:47 localhost systemd[1]: Starting User Runtime Directory /run/user/42436... Nov 23 04:58:47 localhost systemd[1]: Finished User Runtime Directory /run/user/42436. Nov 23 04:58:47 localhost systemd[1]: Starting User Manager for UID 42436... Nov 23 04:58:48 localhost systemd[312032]: Queued start job for default target Main User Target. Nov 23 04:58:48 localhost systemd[312032]: Created slice User Application Slice. Nov 23 04:58:48 localhost systemd[312032]: Started Mark boot as successful after the user session has run 2 minutes. Nov 23 04:58:48 localhost systemd[312032]: Started Daily Cleanup of User's Temporary Directories. Nov 23 04:58:48 localhost systemd[312032]: Reached target Paths. Nov 23 04:58:48 localhost systemd[312032]: Reached target Timers. Nov 23 04:58:48 localhost systemd[312032]: Starting D-Bus User Message Bus Socket... Nov 23 04:58:48 localhost systemd[312032]: Starting Create User's Volatile Files and Directories... Nov 23 04:58:48 localhost systemd[312032]: Listening on D-Bus User Message Bus Socket. Nov 23 04:58:48 localhost systemd[312032]: Reached target Sockets. Nov 23 04:58:48 localhost systemd[312032]: Finished Create User's Volatile Files and Directories. Nov 23 04:58:48 localhost systemd[312032]: Reached target Basic System. Nov 23 04:58:48 localhost systemd[312032]: Reached target Main User Target. Nov 23 04:58:48 localhost systemd[312032]: Startup finished in 136ms. Nov 23 04:58:48 localhost systemd[1]: Started User Manager for UID 42436. Nov 23 04:58:48 localhost systemd[1]: Started Session 72 of User nova. 
Nov 23 04:58:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:58:48 localhost NetworkManager[5975]: [1763891928.3760] manager: (tap737e82a6-26): new Tun device (/org/freedesktop/NetworkManager/Devices/24) Nov 23 04:58:48 localhost kernel: device tap737e82a6-26 entered promiscuous mode Nov 23 04:58:48 localhost systemd-udevd[312061]: Network interface NamePolicy= disabled on kernel command line. Nov 23 04:58:48 localhost NetworkManager[5975]: [1763891928.4008] device (tap737e82a6-26): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Nov 23 04:58:48 localhost NetworkManager[5975]: [1763891928.4019] device (tap737e82a6-26): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Nov 23 04:58:48 localhost ovn_controller[154788]: 2025-11-23T09:58:48Z|00108|binding|INFO|Claiming lport 737e82a6-2634-47df-b8a7-ec21a927cc3f for this additional chassis. Nov 23 04:58:48 localhost nova_compute[281952]: 2025-11-23 09:58:48.423 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:48 localhost ovn_controller[154788]: 2025-11-23T09:58:48Z|00109|binding|INFO|737e82a6-2634-47df-b8a7-ec21a927cc3f: Claiming fa:16:3e:da:21:74 10.100.0.10 Nov 23 04:58:48 localhost ovn_controller[154788]: 2025-11-23T09:58:48Z|00110|binding|INFO|Claiming lport fd30dda9-c731-47dd-b319-ebcca717b708 for this additional chassis. 
Nov 23 04:58:48 localhost ovn_controller[154788]: 2025-11-23T09:58:48Z|00111|binding|INFO|fd30dda9-c731-47dd-b319-ebcca717b708: Claiming fa:16:3e:4f:95:ad 19.80.0.95 Nov 23 04:58:48 localhost ovn_controller[154788]: 2025-11-23T09:58:48Z|00112|binding|INFO|Setting lport 737e82a6-2634-47df-b8a7-ec21a927cc3f ovn-installed in OVS Nov 23 04:58:48 localhost nova_compute[281952]: 2025-11-23 09:58:48.444 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:48 localhost systemd-machined[84275]: New machine qemu-4-instance-0000000a. Nov 23 04:58:48 localhost systemd[1]: Started Virtual Machine qemu-4-instance-0000000a. Nov 23 04:58:48 localhost nova_compute[281952]: 2025-11-23 09:58:48.792 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 23 04:58:48 localhost nova_compute[281952]: 2025-11-23 09:58:48.794 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] VM Started (Lifecycle Event)#033[00m Nov 23 04:58:48 localhost nova_compute[281952]: 2025-11-23 09:58:48.827 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:58:49 localhost nova_compute[281952]: 2025-11-23 09:58:49.507 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 23 04:58:49 localhost nova_compute[281952]: 2025-11-23 09:58:49.508 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 
8f62292f-5719-4b19-9188-3715b94493a7] VM Resumed (Lifecycle Event)#033[00m Nov 23 04:58:49 localhost nova_compute[281952]: 2025-11-23 09:58:49.660 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:58:49 localhost nova_compute[281952]: 2025-11-23 09:58:49.663 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 23 04:58:49 localhost nova_compute[281952]: 2025-11-23 09:58:49.681 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] During the sync_power process the instance has moved from host np0005532584.localdomain to host np0005532585.localdomain#033[00m Nov 23 04:58:49 localhost systemd[1]: session-72.scope: Deactivated successfully. Nov 23 04:58:49 localhost systemd-logind[761]: Session 72 logged out. Waiting for processes to exit. Nov 23 04:58:49 localhost systemd-logind[761]: Removed session 72. 
Nov 23 04:58:50 localhost nova_compute[281952]: 2025-11-23 09:58:50.411 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:50 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:50.976 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:58:51 localhost nova_compute[281952]: 2025-11-23 09:58:51.223 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:51 localhost ovn_controller[154788]: 2025-11-23T09:58:51Z|00113|binding|INFO|Claiming lport 737e82a6-2634-47df-b8a7-ec21a927cc3f for this chassis. Nov 23 04:58:51 localhost ovn_controller[154788]: 2025-11-23T09:58:51Z|00114|binding|INFO|737e82a6-2634-47df-b8a7-ec21a927cc3f: Claiming fa:16:3e:da:21:74 10.100.0.10 Nov 23 04:58:51 localhost ovn_controller[154788]: 2025-11-23T09:58:51Z|00115|binding|INFO|Claiming lport fd30dda9-c731-47dd-b319-ebcca717b708 for this chassis. 
Nov 23 04:58:51 localhost ovn_controller[154788]: 2025-11-23T09:58:51Z|00116|binding|INFO|fd30dda9-c731-47dd-b319-ebcca717b708: Claiming fa:16:3e:4f:95:ad 19.80.0.95 Nov 23 04:58:51 localhost ovn_controller[154788]: 2025-11-23T09:58:51Z|00117|binding|INFO|Setting lport 737e82a6-2634-47df-b8a7-ec21a927cc3f up in Southbound Nov 23 04:58:51 localhost ovn_controller[154788]: 2025-11-23T09:58:51Z|00118|binding|INFO|Setting lport fd30dda9-c731-47dd-b319-ebcca717b708 up in Southbound Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.517 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:95:ad 19.80.0.95'], port_security=['fa:16:3e:4f:95:ad 19.80.0.95'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['737e82a6-2634-47df-b8a7-ec21a927cc3f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1587702031', 'neutron:cidrs': '19.80.0.95/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-903951dd-448c-4453-aa24-f24a53269074', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1587702031', 'neutron:project_id': '253c88568a634476a6c1284eed6a9464', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a3350144-9b09-432b-a32e-ef84bb8bf494', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=154cf14d-e57b-4715-bce9-5bdd1a0ded15, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=fd30dda9-c731-47dd-b319-ebcca717b708) old=Port_Binding(up=[False], additional_chassis=[], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.521 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:21:74 10.100.0.10'], port_security=['fa:16:3e:da:21:74 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1925970765', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8f62292f-5719-4b19-9188-3715b94493a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d679e465-8656-4403-afa0-724657d33ec4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1925970765', 'neutron:project_id': '253c88568a634476a6c1284eed6a9464', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'a3350144-9b09-432b-a32e-ef84bb8bf494', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532584.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1a90e812-f218-49cd-a3ab-6bc1317ad730, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=737e82a6-2634-47df-b8a7-ec21a927cc3f) old=Port_Binding(up=[False], additional_chassis=[], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.523 160439 INFO neutron.agent.ovn.metadata.agent [-] Port fd30dda9-c731-47dd-b319-ebcca717b708 in datapath 903951dd-448c-4453-aa24-f24a53269074 bound to our chassis#033[00m Nov 23 04:58:51 localhost 
ovn_metadata_agent[160434]: 2025-11-23 09:58:51.528 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 5a66dd4e-4808-453a-ba83-842df44989df IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.528 160439 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 903951dd-448c-4453-aa24-f24a53269074#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.542 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[b0940f3d-b7dd-4868-a864-db87aa764a12]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.543 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap903951dd-41 in ovnmeta-903951dd-448c-4453-aa24-f24a53269074 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.545 160542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap903951dd-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.546 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[912bc28a-d0f3-4383-9de2-c48ad634f2b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.547 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[7eb0ad20-1870-4ea7-b074-9ade84b42341]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.571 160573 DEBUG 
oslo.privsep.daemon [-] privsep: reply[eb03155e-1f25-4173-81a6-13183f7a2cfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.589 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f99fbc86-e079-4312-9bfc-4a028e12e6f7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.622 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[7a9e9d60-f443-4d7c-837f-8ee2232c61b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.629 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9cfb78d2-3f36-400a-9281-d625b1d840f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost NetworkManager[5975]: [1763891931.6352] manager: (tap903951dd-40): new Veth device (/org/freedesktop/NetworkManager/Devices/25) Nov 23 04:58:51 localhost neutron_sriov_agent[256124]: 2025-11-23 09:58:51.651 2 WARNING neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [req-e0ec21a6-2619-4364-8f01-ea196be44fa4 req-03d7f78c-5d6d-441d-aa9e-d4756ed73a5d 73d8249924dd406db12ad13a4ddb31a1 758f3043280349e086a85b86f2668848 - - default default] This port is not SRIOV, skip binding for port 737e82a6-2634-47df-b8a7-ec21a927cc3f.#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.674 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[8630cf05-67eb-4c2e-8c61-da06debb00c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.678 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[3d48d7d6-1f27-453a-92e4-50077fcac4d4]: (4, None) 
_call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap903951dd-41: link becomes ready Nov 23 04:58:51 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap903951dd-40: link becomes ready Nov 23 04:58:51 localhost NetworkManager[5975]: [1763891931.7057] device (tap903951dd-40): carrier: link connected Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.715 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb91685-fc04-4320-8336-e72d33065122]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.736 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f53c4354-5748-47e8-a42a-06789a604782]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap903951dd-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:47:c5:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 
'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1187481, 'reachable_time': 41132, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 
'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312141, 'error': None, 'target': 'ovnmeta-903951dd-448c-4453-aa24-f24a53269074', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.758 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2f8f85d0-d594-4763-a117-af6cfcf34260]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe47:c591'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1187481, 'tstamp': 1187481}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312142, 'error': None, 
'target': 'ovnmeta-903951dd-448c-4453-aa24-f24a53269074', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost nova_compute[281952]: 2025-11-23 09:58:51.771 281956 INFO nova.compute.manager [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Post operation of migration started#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.780 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc3506f-729f-4bed-b8ae-ef5d28cb4534]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap903951dd-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:47:c5:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 
'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1187481, 'reachable_time': 41132, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 
'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 312143, 'error': None, 'target': 'ovnmeta-903951dd-448c-4453-aa24-f24a53269074', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.818 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[6caeb36c-16b5-4089-a29a-92643a86edca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.895 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ee42f3da-33c0-4311-8162-91964fcda884]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.897 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 
command(idx=0): DelPortCommand(_result=None, port=tap903951dd-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.898 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.899 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap903951dd-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:58:51 localhost kernel: device tap903951dd-40 entered promiscuous mode Nov 23 04:58:51 localhost nova_compute[281952]: 2025-11-23 09:58:51.907 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.912 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap903951dd-40, col_values=(('external_ids', {'iface-id': 'b83bb60d-d579-4f8d-9e2c-3885d238bb26'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:58:51 localhost ovn_controller[154788]: 2025-11-23T09:58:51Z|00119|binding|INFO|Releasing lport b83bb60d-d579-4f8d-9e2c-3885d238bb26 from this chassis (sb_readonly=0) Nov 23 04:58:51 localhost nova_compute[281952]: 2025-11-23 09:58:51.915 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.919 160439 DEBUG neutron.agent.linux.utils [-] 
Unable to access /var/lib/neutron/external/pids/903951dd-448c-4453-aa24-f24a53269074.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/903951dd-448c-4453-aa24-f24a53269074.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.920 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e94265-500a-4b94-bef1-0cf1ac358aeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.921 160439 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: global Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: log /dev/log local0 debug Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: log-tag haproxy-metadata-proxy-903951dd-448c-4453-aa24-f24a53269074 Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: user root Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: group root Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: maxconn 1024 Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: pidfile /var/lib/neutron/external/pids/903951dd-448c-4453-aa24-f24a53269074.pid.haproxy Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: daemon Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: defaults Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: log global Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: mode http Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: option httplog Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: option dontlognull Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: option http-server-close Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: option forwardfor Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: retries 3 Nov 23 04:58:51 
localhost ovn_metadata_agent[160434]: timeout http-request 30s Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: timeout connect 30s Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: timeout client 32s Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: timeout server 32s Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: timeout http-keep-alive 30s Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: listen listener Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: bind 169.254.169.254:80 Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: server metadata /var/lib/neutron/metadata_proxy Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: http-request add-header X-OVN-Network-ID 903951dd-448c-4453-aa24-f24a53269074 Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Nov 23 04:58:51 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:51.922 160439 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-903951dd-448c-4453-aa24-f24a53269074', 'env', 'PROCESS_TAG=haproxy-903951dd-448c-4453-aa24-f24a53269074', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/903951dd-448c-4453-aa24-f24a53269074.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Nov 23 04:58:51 localhost nova_compute[281952]: 2025-11-23 09:58:51.925 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:52 localhost nova_compute[281952]: 2025-11-23 09:58:52.171 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Acquiring 
lock "refresh_cache-8f62292f-5719-4b19-9188-3715b94493a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:58:52 localhost nova_compute[281952]: 2025-11-23 09:58:52.171 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Acquired lock "refresh_cache-8f62292f-5719-4b19-9188-3715b94493a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:58:52 localhost nova_compute[281952]: 2025-11-23 09:58:52.172 281956 DEBUG nova.network.neutron [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 23 04:58:52 localhost podman[312176]: Nov 23 04:58:52 localhost podman[312176]: 2025-11-23 09:58:52.526321151 +0000 UTC m=+0.103009596 container create 6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 04:58:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:58:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 04:58:52 localhost podman[312176]: 2025-11-23 09:58:52.475092266 +0000 UTC m=+0.051780801 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 23 04:58:52 localhost systemd[1]: Started libpod-conmon-6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797.scope. Nov 23 04:58:52 localhost systemd[1]: Started libcrun container. Nov 23 04:58:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79231015dd8393ee753048cdacd0bc727bddc8892a2b6bc6b9a822a34427453e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:58:52 localhost podman[312176]: 2025-11-23 09:58:52.631071838 +0000 UTC m=+0.207760243 container init 6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:58:52 localhost neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [NOTICE] (312212) : New worker (312223) forked Nov 23 04:58:52 localhost neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [NOTICE] (312212) : Loading success. 
Nov 23 04:58:52 localhost podman[312191]: 2025-11-23 09:58:52.709594526 +0000 UTC m=+0.114378854 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 04:58:52 localhost podman[312191]: 2025-11-23 09:58:52.72315992 +0000 UTC m=+0.127944278 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:58:52 localhost podman[312190]: 2025-11-23 09:58:52.692294118 +0000 UTC m=+0.115254960 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 04:58:52 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 04:58:52 localhost podman[312176]: 2025-11-23 09:58:52.74641564 +0000 UTC m=+0.323104035 container start 6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.800 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 737e82a6-2634-47df-b8a7-ec21a927cc3f in datapath d679e465-8656-4403-afa0-724657d33ec4 unbound from our chassis#033[00m Nov 23 04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.807 160439 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d679e465-8656-4403-afa0-724657d33ec4#033[00m Nov 23 04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.818 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4608bc7c-318a-4ef9-8571-5f799fb34b86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.819 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd679e465-81 in ovnmeta-d679e465-8656-4403-afa0-724657d33ec4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Nov 23 04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.821 160542 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd679e465-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Nov 23 
04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.821 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb27d9e-4ece-4195-b536-3d3fa20be1b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.822 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[d877bc8f-d1bf-4c8c-869f-450af634f172]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:52 localhost podman[312190]: 2025-11-23 09:58:52.827488095 +0000 UTC m=+0.250448927 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd) Nov 23 04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.831 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[c64901c1-fdf5-4b41-bb9f-a24ae1a3b360]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:52 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.845 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[bb86428e-46b2-4268-a9c8-50a9cb485c92]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.876 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[4cd89133-7970-45b4-8fdb-0fac505a397d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.883 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[77ce8a1e-a250-4867-bf0f-2228fbfe2645]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:52 localhost systemd-udevd[312134]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 04:58:52 localhost NetworkManager[5975]: [1763891932.8864] manager: (tapd679e465-80): new Veth device (/org/freedesktop/NetworkManager/Devices/26) Nov 23 04:58:52 localhost nova_compute[281952]: 2025-11-23 09:58:52.895 281956 DEBUG nova.network.neutron [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Updating instance_info_cache with network_info: [{"id": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "address": "fa:16:3e:da:21:74", "network": {"id": "d679e465-8656-4403-afa0-724657d33ec4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-49202206-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "253c88568a634476a6c1284eed6a9464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap737e82a6-26", "ovs_interfaceid": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:58:52 localhost nova_compute[281952]: 2025-11-23 09:58:52.915 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Releasing lock 
"refresh_cache-8f62292f-5719-4b19-9188-3715b94493a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.927 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[572f409b-5976-4b66-a59c-675cec450fbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.930 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[0cedfb8c-7dbd-4866-99fb-e229d8bd44ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:52 localhost nova_compute[281952]: 2025-11-23 09:58:52.931 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:58:52 localhost nova_compute[281952]: 2025-11-23 09:58:52.932 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:58:52 localhost nova_compute[281952]: 2025-11-23 09:58:52.932 281956 DEBUG oslo_concurrency.lockutils [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:58:52 localhost nova_compute[281952]: 2025-11-23 09:58:52.940 281956 INFO nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m Nov 23 04:58:52 localhost journal[203731]: Domain id=4 name='instance-0000000a' uuid=8f62292f-5719-4b19-9188-3715b94493a7 is tainted: custom-monitor Nov 23 04:58:52 localhost NetworkManager[5975]: [1763891932.9655] device (tapd679e465-80): carrier: link connected Nov 23 04:58:52 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapd679e465-81: link becomes ready Nov 23 04:58:52 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapd679e465-80: link becomes ready Nov 23 04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.972 160553 DEBUG oslo.privsep.daemon [-] privsep: reply[e86bd64d-45ca-4cac-8d4c-cc0bcb14c9af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:52.990 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[60aea4e3-13fb-4447-99dc-847bb732512f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd679e465-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], 
['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:a8:02:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], 
['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1187607, 'reachable_time': 41578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312256, 'error': None, 'target': 'ovnmeta-d679e465-8656-4403-afa0-724657d33ec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:53 localhost 
ovn_metadata_agent[160434]: 2025-11-23 09:58:53.006 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e6c677d5-5f84-4a2c-9a10-efc24ffb8f73]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:218'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1187607, 'tstamp': 1187607}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312257, 'error': None, 'target': 'ovnmeta-d679e465-8656-4403-afa0-724657d33ec4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:53.024 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[81fe0e7b-95dc-4f28-9aa1-c848ab150578]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd679e465-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:a8:02:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 
'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1187607, 'reachable_time': 41578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 
'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 312258, 'error': None, 'target': 'ovnmeta-d679e465-8656-4403-afa0-724657d33ec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:53.062 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[eb03534f-bf3f-4b2c-ae72-0586e54a94f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:53.135 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[b9aa55a2-e026-4086-adb8-a6e558c272cc]: (4, None) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:53.136 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd679e465-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:53.137 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:53.137 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd679e465-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:58:53 localhost kernel: device tapd679e465-80 entered promiscuous mode Nov 23 04:58:53 localhost nova_compute[281952]: 2025-11-23 09:58:53.140 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:53.148 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd679e465-80, col_values=(('external_ids', {'iface-id': '9b50ca15-3b72-42c0-998b-33441ea57460'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:58:53 localhost nova_compute[281952]: 2025-11-23 09:58:53.151 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:53 localhost ovn_controller[154788]: 
2025-11-23T09:58:53Z|00120|binding|INFO|Releasing lport 9b50ca15-3b72-42c0-998b-33441ea57460 from this chassis (sb_readonly=0)
Nov 23 04:58:53 localhost nova_compute[281952]: 2025-11-23 09:58:53.154 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:53.155 160439 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d679e465-8656-4403-afa0-724657d33ec4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d679e465-8656-4403-afa0-724657d33ec4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:53.157 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[01493be9-5832-4a19-bfd0-5adb629d48d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:53.157 160439 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg =
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: global
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: log /dev/log local0 debug
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: log-tag haproxy-metadata-proxy-d679e465-8656-4403-afa0-724657d33ec4
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: user root
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: group root
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: maxconn 1024
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: pidfile /var/lib/neutron/external/pids/d679e465-8656-4403-afa0-724657d33ec4.pid.haproxy
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: daemon
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]:
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: defaults
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: log global
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: mode http
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: option httplog
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: option dontlognull
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: option http-server-close
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: option forwardfor
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: retries 3
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: timeout http-request 30s
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: timeout connect 30s
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: timeout client 32s
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: timeout server 32s
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: timeout http-keep-alive 30s
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]:
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]:
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: listen listener
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: bind 169.254.169.254:80
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: server metadata /var/lib/neutron/metadata_proxy
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: http-request add-header X-OVN-Network-ID d679e465-8656-4403-afa0-724657d33ec4
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 23 04:58:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:53.158 160439 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d679e465-8656-4403-afa0-724657d33ec4', 'env', 'PROCESS_TAG=haproxy-d679e465-8656-4403-afa0-724657d33ec4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d679e465-8656-4403-afa0-724657d33ec4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 23 04:58:53 localhost nova_compute[281952]: 2025-11-23 09:58:53.163 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:58:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 04:58:53 localhost podman[312291]:
Nov 23 04:58:53 localhost podman[312291]: 2025-11-23 09:58:53.639190016 +0000 UTC m=+0.099888411 container create b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 04:58:53 localhost systemd[1]: Started libpod-conmon-b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34.scope.
Nov 23 04:58:53 localhost podman[312291]: 2025-11-23 09:58:53.590069056 +0000 UTC m=+0.050767501 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 23 04:58:53 localhost systemd[1]: tmp-crun.zegw9E.mount: Deactivated successfully.
Nov 23 04:58:53 localhost systemd[1]: Started libcrun container.
Nov 23 04:58:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b3ad05ab8f11dd49e19cdf6059cf6f5ce8d04692e252d6615b26d979227f460/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 04:58:53 localhost podman[312291]: 2025-11-23 09:58:53.72647264 +0000 UTC m=+0.187171035 container init b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 04:58:53 localhost podman[312291]: 2025-11-23 09:58:53.749019258 +0000 UTC m=+0.209717653 container start b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 04:58:53 localhost neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4[312305]: [NOTICE] (312309) : New worker (312311) forked
Nov 23 04:58:53 localhost neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4[312305]: [NOTICE] (312309) : Loading success.
Nov 23 04:58:53 localhost nova_compute[281952]: 2025-11-23 09:58:53.952 281956 INFO nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 23 04:58:54 localhost ovn_controller[154788]: 2025-11-23T09:58:54Z|00121|binding|INFO|Releasing lport 9b50ca15-3b72-42c0-998b-33441ea57460 from this chassis (sb_readonly=0)
Nov 23 04:58:54 localhost ovn_controller[154788]: 2025-11-23T09:58:54Z|00122|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 04:58:54 localhost ovn_controller[154788]: 2025-11-23T09:58:54Z|00123|binding|INFO|Releasing lport b83bb60d-d579-4f8d-9e2c-3885d238bb26 from this chassis (sb_readonly=0)
Nov 23 04:58:54 localhost nova_compute[281952]: 2025-11-23 09:58:54.300 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:58:54 localhost snmpd[67457]: empty variable list in _query
Nov 23 04:58:54 localhost snmpd[67457]: empty variable list in _query
Nov 23 04:58:54 localhost nova_compute[281952]: 2025-11-23 09:58:54.959 281956 INFO nova.virt.libvirt.driver [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 23 04:58:54 localhost nova_compute[281952]: 2025-11-23 09:58:54.965 281956 DEBUG nova.compute.manager [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 04:58:54 localhost nova_compute[281952]: 2025-11-23 09:58:54.988 281956 DEBUG nova.objects.instance [None req-e0ec21a6-2619-4364-8f01-ea196be44fa4 ad1995ebc8334aa1bc1f8753f5df7d6f 57d9e088e75b4a3482d0e3a02bcce5be - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 23 04:58:55 localhost nova_compute[281952]: 2025-11-23 09:58:55.065 281956 DEBUG nova.compute.manager [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received event network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 04:58:55 localhost nova_compute[281952]: 2025-11-23 09:58:55.066 281956 DEBUG oslo_concurrency.lockutils [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "8f62292f-5719-4b19-9188-3715b94493a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 04:58:55 localhost nova_compute[281952]: 2025-11-23 09:58:55.066 281956 DEBUG oslo_concurrency.lockutils [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 04:58:55 localhost nova_compute[281952]: 2025-11-23 09:58:55.067 281956 DEBUG oslo_concurrency.lockutils [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 04:58:55 localhost nova_compute[281952]: 2025-11-23 09:58:55.067 281956 DEBUG nova.compute.manager [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] No waiting events found dispatching network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 23 04:58:55 localhost nova_compute[281952]: 2025-11-23 09:58:55.068 281956 WARNING nova.compute.manager [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received unexpected event network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f for instance with vm_state active and task_state None.
Nov 23 04:58:55 localhost nova_compute[281952]: 2025-11-23 09:58:55.068 281956 DEBUG nova.compute.manager [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received event network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 04:58:55 localhost nova_compute[281952]: 2025-11-23 09:58:55.068 281956 DEBUG oslo_concurrency.lockutils [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "8f62292f-5719-4b19-9188-3715b94493a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 04:58:55 localhost nova_compute[281952]: 2025-11-23 09:58:55.069 281956 DEBUG oslo_concurrency.lockutils [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 04:58:55 localhost nova_compute[281952]: 2025-11-23 09:58:55.069 281956 DEBUG oslo_concurrency.lockutils [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 04:58:55 localhost nova_compute[281952]: 2025-11-23 09:58:55.070 281956 DEBUG nova.compute.manager [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] No waiting events found dispatching network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 23 04:58:55 localhost nova_compute[281952]: 2025-11-23 09:58:55.070 281956 WARNING nova.compute.manager [req-7c87617e-4395-401e-994a-80cfa5793e8c req-76911790-d821-4d02-919c-eb5f44b9575e b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received unexpected event network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f for instance with vm_state active and task_state None.
Nov 23 04:58:55 localhost nova_compute[281952]: 2025-11-23 09:58:55.448 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.122 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Acquiring lock "8f62292f-5719-4b19-9188-3715b94493a7" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.123 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.123 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Acquiring lock "8f62292f-5719-4b19-9188-3715b94493a7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.124 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.124 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.126 281956 INFO nova.compute.manager [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Terminating instance
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.128 281956 DEBUG nova.compute.manager [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 23 04:58:56 localhost kernel: device tap737e82a6-26 left promiscuous mode
Nov 23 04:58:56 localhost NetworkManager[5975]: [1763891936.1896] device (tap737e82a6-26): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Nov 23 04:58:56 localhost ovn_controller[154788]: 2025-11-23T09:58:56Z|00124|binding|INFO|Releasing lport 737e82a6-2634-47df-b8a7-ec21a927cc3f from this chassis (sb_readonly=0)
Nov 23 04:58:56 localhost ovn_controller[154788]: 2025-11-23T09:58:56Z|00125|binding|INFO|Setting lport 737e82a6-2634-47df-b8a7-ec21a927cc3f down in Southbound
Nov 23 04:58:56 localhost ovn_controller[154788]: 2025-11-23T09:58:56Z|00126|binding|INFO|Releasing lport fd30dda9-c731-47dd-b319-ebcca717b708 from this chassis (sb_readonly=0)
Nov 23 04:58:56 localhost ovn_controller[154788]: 2025-11-23T09:58:56Z|00127|binding|INFO|Setting lport fd30dda9-c731-47dd-b319-ebcca717b708 down in Southbound
Nov 23 04:58:56 localhost ovn_controller[154788]: 2025-11-23T09:58:56Z|00128|binding|INFO|Removing iface tap737e82a6-26 ovn-installed in OVS
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.198 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:58:56 localhost ovn_controller[154788]: 2025-11-23T09:58:56Z|00129|binding|INFO|Releasing lport 9b50ca15-3b72-42c0-998b-33441ea57460 from this chassis (sb_readonly=0)
Nov 23 04:58:56 localhost ovn_controller[154788]: 2025-11-23T09:58:56Z|00130|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 04:58:56 localhost ovn_controller[154788]: 2025-11-23T09:58:56Z|00131|binding|INFO|Releasing lport b83bb60d-d579-4f8d-9e2c-3885d238bb26 from this chassis (sb_readonly=0)
Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.207 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:95:ad 19.80.0.95'], port_security=['fa:16:3e:4f:95:ad 19.80.0.95'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['737e82a6-2634-47df-b8a7-ec21a927cc3f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1587702031', 'neutron:cidrs': '19.80.0.95/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-903951dd-448c-4453-aa24-f24a53269074', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1587702031', 'neutron:project_id': '253c88568a634476a6c1284eed6a9464', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a3350144-9b09-432b-a32e-ef84bb8bf494', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=154cf14d-e57b-4715-bce9-5bdd1a0ded15, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=fd30dda9-c731-47dd-b319-ebcca717b708) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.210 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:21:74 10.100.0.10'], port_security=['fa:16:3e:da:21:74 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1925970765', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8f62292f-5719-4b19-9188-3715b94493a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d679e465-8656-4403-afa0-724657d33ec4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1925970765', 'neutron:project_id': '253c88568a634476a6c1284eed6a9464', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'a3350144-9b09-432b-a32e-ef84bb8bf494', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532584.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1a90e812-f218-49cd-a3ab-6bc1317ad730, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=737e82a6-2634-47df-b8a7-ec21a927cc3f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.212 160439 INFO neutron.agent.ovn.metadata.agent [-] Port fd30dda9-c731-47dd-b319-ebcca717b708 in datapath 903951dd-448c-4453-aa24-f24a53269074 unbound from our chassis
Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.215 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 5a66dd4e-4808-453a-ba83-842df44989df IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.215 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 903951dd-448c-4453-aa24-f24a53269074, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.216 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[0a5045b2-aabf-4dbc-8755-2ab6c5f1c4de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.217 160439 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-903951dd-448c-4453-aa24-f24a53269074 namespace which is not needed anymore
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.230 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:58:56 localhost systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Nov 23 04:58:56 localhost systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Consumed 5.790s CPU time.
Nov 23 04:58:56 localhost systemd-machined[84275]: Machine qemu-4-instance-0000000a terminated.
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.370 281956 INFO nova.virt.libvirt.driver [-] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Instance destroyed successfully.
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.370 281956 DEBUG nova.objects.instance [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Lazy-loading 'resources' on Instance uuid 8f62292f-5719-4b19-9188-3715b94493a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.382 281956 DEBUG nova.virt.libvirt.vif [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-23T09:58:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1576780525',display_name='tempest-LiveMigrationTest-server-1576780525',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005532585.localdomain',hostname='tempest-livemigrationtest-server-1576780525',id=10,image_ref='c5806483-57a8-4254-b41b-254b888c8606',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-23T09:58:41Z,launched_on='np0005532584.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0005532585.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='253c88568a634476a6c1284eed6a9464',ramdisk_id='',reservation_id='r-dvg5v145',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c5806483-57a8-4254-b41b-254b888c8606',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1889456510',owner_user_name='tempest-LiveMigrationTest-1889456510-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2025-11-23T09:58:55Z,user_data=None,user_id='7f7875c0084c46fdb2e7b37e4fc44faf',uuid=8f62292f-5719-4b19-9188-3715b94493a7,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "address": "fa:16:3e:da:21:74", "network": {"id": "d679e465-8656-4403-afa0-724657d33ec4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-49202206-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "253c88568a634476a6c1284eed6a9464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap737e82a6-26", "ovs_interfaceid": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.383 281956 DEBUG nova.network.os_vif_util [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Converting VIF {"id": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "address": "fa:16:3e:da:21:74", "network": {"id": "d679e465-8656-4403-afa0-724657d33ec4", "bridge": "br-int", "label": "tempest-LiveMigrationTest-49202206-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "253c88568a634476a6c1284eed6a9464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap737e82a6-26", "ovs_interfaceid": "737e82a6-2634-47df-b8a7-ec21a927cc3f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.384 281956 DEBUG nova.network.os_vif_util [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:21:74,bridge_name='br-int',has_traffic_filtering=True,id=737e82a6-2634-47df-b8a7-ec21a927cc3f,network=Network(d679e465-8656-4403-afa0-724657d33ec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap737e82a6-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.384 281956 DEBUG os_vif [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:21:74,bridge_name='br-int',has_traffic_filtering=True,id=737e82a6-2634-47df-b8a7-ec21a927cc3f,network=Network(d679e465-8656-4403-afa0-724657d33ec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap737e82a6-26') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.386 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.387 281956 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap737e82a6-26, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.389 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.392 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.395 281956 INFO os_vif [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:21:74,bridge_name='br-int',has_traffic_filtering=True,id=737e82a6-2634-47df-b8a7-ec21a927cc3f,network=Network(d679e465-8656-4403-afa0-724657d33ec4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap737e82a6-26')
Nov 23 04:58:56 localhost neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [NOTICE] (312212) : haproxy version is 2.8.14-c23fe91
Nov 23 04:58:56 localhost neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [NOTICE] (312212) : path to executable is /usr/sbin/haproxy
Nov 23 04:58:56 localhost neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [WARNING] (312212) : Exiting Master process...
Nov 23 04:58:56 localhost neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [WARNING] (312212) : Exiting Master process...
Nov 23 04:58:56 localhost neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [ALERT] (312212) : Current worker (312223) exited with code 143 (Terminated)
Nov 23 04:58:56 localhost neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074[312198]: [WARNING] (312212) : All workers exited. Exiting... (0)
Nov 23 04:58:56 localhost systemd[1]: libpod-6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797.scope: Deactivated successfully.
Nov 23 04:58:56 localhost podman[312343]: 2025-11-23 09:58:56.426605923 +0000 UTC m=+0.085163990 container died 6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 04:58:56 localhost podman[312343]: 2025-11-23 09:58:56.47467325 +0000 UTC m=+0.133231287 container cleanup 6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 04:58:56 localhost podman[312381]: 2025-11-23 09:58:56.522135179 +0000 UTC m=+0.086439760 container cleanup 6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 04:58:56 localhost systemd[1]: libpod-conmon-6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797.scope: Deactivated successfully.
Nov 23 04:58:56 localhost podman[312397]: 2025-11-23 09:58:56.574062585 +0000 UTC m=+0.077994942 container remove 6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.583 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[0c9bfab0-9338-4659-9da1-fecd14ec8c03]: (4, ('Sun Nov 23 09:58:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074 (6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797)\n6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797\nSun Nov 23 09:58:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-903951dd-448c-4453-aa24-f24a53269074 (6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797)\n6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.586 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cf6bb5e1-7605-4afb-af8c-71332c2d4774]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.588 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap903951dd-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.630 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:58:56 localhost kernel: device tap903951dd-40 left promiscuous mode
Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.642 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.645 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[96d712d3-34f2-4e8e-9bbd-0f2c55e0d4b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.663 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[a8c798e4-4397-49d7-b5db-565084c6983f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.664 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[67e91309-f904-4333-91b3-4703744d4638]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.681 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[b6523cdd-9bac-4635-b536-73335f5785a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS,
65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 
'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1187472, 'reachable_time': 38242, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312418, 'error': None, 'target': 
'ovnmeta-903951dd-448c-4453-aa24-f24a53269074', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.686 160573 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-903951dd-448c-4453-aa24-f24a53269074 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.686 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ec97f4-fd75-48ef-8d8f-d68384ddd09f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.687 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 737e82a6-2634-47df-b8a7-ec21a927cc3f in datapath d679e465-8656-4403-afa0-724657d33ec4 unbound from our chassis#033[00m Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.689 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d679e465-8656-4403-afa0-724657d33ec4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.690 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[39403141-b69a-455c-8c31-767a58302452]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.691 160439 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d679e465-8656-4403-afa0-724657d33ec4 namespace which is not needed anymore#033[00m Nov 23 04:58:56 localhost neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4[312305]: [NOTICE] (312309) : haproxy version is 2.8.14-c23fe91 Nov 23 04:58:56 
localhost neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4[312305]: [NOTICE] (312309) : path to executable is /usr/sbin/haproxy Nov 23 04:58:56 localhost neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4[312305]: [WARNING] (312309) : Exiting Master process... Nov 23 04:58:56 localhost neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4[312305]: [ALERT] (312309) : Current worker (312311) exited with code 143 (Terminated) Nov 23 04:58:56 localhost neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4[312305]: [WARNING] (312309) : All workers exited. Exiting... (0) Nov 23 04:58:56 localhost systemd[1]: libpod-b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34.scope: Deactivated successfully. Nov 23 04:58:56 localhost podman[312434]: 2025-11-23 09:58:56.844740918 +0000 UTC m=+0.058306511 container died b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true) Nov 23 04:58:56 localhost podman[312434]: 2025-11-23 09:58:56.87593372 +0000 UTC m=+0.089499303 container cleanup b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 23 04:58:56 localhost podman[312448]: 2025-11-23 09:58:56.888504565 +0000 UTC m=+0.045660905 container cleanup b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:58:56 localhost systemd[1]: libpod-conmon-b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34.scope: Deactivated successfully. Nov 23 04:58:56 localhost podman[312465]: 2025-11-23 09:58:56.944271637 +0000 UTC m=+0.055180996 container remove b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.948 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ee19c034-3c99-4a34-b1a0-cde714f50a78]: (4, ('Sun Nov 23 09:58:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4 
(b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34)\nb1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34\nSun Nov 23 09:58:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d679e465-8656-4403-afa0-724657d33ec4 (b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34)\nb1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.950 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[701bae2d-641d-4272-9ec1-08c28dd92bf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.950 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd679e465-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.952 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:56 localhost kernel: device tapd679e465-80 left promiscuous mode Nov 23 04:58:56 localhost nova_compute[281952]: 2025-11-23 09:58:56.956 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.960 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4c8f255f-8b68-49a1-8d46-7887819f4f4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.978 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c4be31-e14f-4da5-89ad-7af0b6857f88]: (4, 
None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.979 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[15063fe0-7228-4090-aae1-94c651ca5a91]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.994 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[066a7b20-73be-4c51-ab2f-f85b26c4ebd4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 
'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1187597, 'reachable_time': 41009, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 
'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312482, 'error': None, 'target': 'ovnmeta-d679e465-8656-4403-afa0-724657d33ec4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.996 160573 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d679e465-8656-4403-afa0-724657d33ec4 deleted. 
remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Nov 23 04:58:56 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:56.996 160573 DEBUG oslo.privsep.daemon [-] privsep: reply[ef80d3b0-40b9-4127-bcec-bb60a1f71634]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:58:57 localhost nova_compute[281952]: 2025-11-23 09:58:57.096 281956 INFO nova.virt.libvirt.driver [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Deleting instance files /var/lib/nova/instances/8f62292f-5719-4b19-9188-3715b94493a7_del#033[00m Nov 23 04:58:57 localhost nova_compute[281952]: 2025-11-23 09:58:57.097 281956 INFO nova.virt.libvirt.driver [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Deletion of /var/lib/nova/instances/8f62292f-5719-4b19-9188-3715b94493a7_del complete#033[00m Nov 23 04:58:57 localhost nova_compute[281952]: 2025-11-23 09:58:57.142 281956 INFO nova.compute.manager [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Took 1.01 seconds to destroy the instance on the hypervisor.#033[00m Nov 23 04:58:57 localhost nova_compute[281952]: 2025-11-23 09:58:57.143 281956 DEBUG oslo.service.loopingcall [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m Nov 23 04:58:57 localhost nova_compute[281952]: 2025-11-23 09:58:57.143 281956 DEBUG nova.compute.manager [-] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m Nov 23 04:58:57 localhost nova_compute[281952]: 2025-11-23 09:58:57.144 281956 DEBUG nova.network.neutron [-] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m Nov 23 04:58:57 localhost nova_compute[281952]: 2025-11-23 09:58:57.161 281956 DEBUG nova.compute.manager [req-e5f5acd6-8ca0-434d-aac9-0848cc7df19c req-ebfa6c92-1f21-49bc-8837-eca1b1766b77 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received event network-vif-unplugged-737e82a6-2634-47df-b8a7-ec21a927cc3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 23 04:58:57 localhost nova_compute[281952]: 2025-11-23 09:58:57.161 281956 DEBUG oslo_concurrency.lockutils [req-e5f5acd6-8ca0-434d-aac9-0848cc7df19c req-ebfa6c92-1f21-49bc-8837-eca1b1766b77 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "8f62292f-5719-4b19-9188-3715b94493a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:58:57 localhost nova_compute[281952]: 2025-11-23 09:58:57.162 281956 DEBUG oslo_concurrency.lockutils [req-e5f5acd6-8ca0-434d-aac9-0848cc7df19c req-ebfa6c92-1f21-49bc-8837-eca1b1766b77 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:58:57 localhost nova_compute[281952]: 2025-11-23 09:58:57.162 281956 DEBUG oslo_concurrency.lockutils [req-e5f5acd6-8ca0-434d-aac9-0848cc7df19c req-ebfa6c92-1f21-49bc-8837-eca1b1766b77 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:58:57 localhost nova_compute[281952]: 2025-11-23 09:58:57.163 281956 DEBUG nova.compute.manager [req-e5f5acd6-8ca0-434d-aac9-0848cc7df19c req-ebfa6c92-1f21-49bc-8837-eca1b1766b77 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] No waiting events found dispatching network-vif-unplugged-737e82a6-2634-47df-b8a7-ec21a927cc3f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 23 04:58:57 localhost nova_compute[281952]: 2025-11-23 09:58:57.163 281956 DEBUG nova.compute.manager [req-e5f5acd6-8ca0-434d-aac9-0848cc7df19c req-ebfa6c92-1f21-49bc-8837-eca1b1766b77 b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received event network-vif-unplugged-737e82a6-2634-47df-b8a7-ec21a927cc3f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m Nov 23 04:58:57 localhost systemd[1]: var-lib-containers-storage-overlay-7b3ad05ab8f11dd49e19cdf6059cf6f5ce8d04692e252d6615b26d979227f460-merged.mount: Deactivated successfully. 
Nov 23 04:58:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b1521c36c613835604ea307c5bc7db41470bc805f8de4cf777d4fbd72eaf8b34-userdata-shm.mount: Deactivated successfully. Nov 23 04:58:57 localhost systemd[1]: run-netns-ovnmeta\x2dd679e465\x2d8656\x2d4403\x2dafa0\x2d724657d33ec4.mount: Deactivated successfully. Nov 23 04:58:57 localhost systemd[1]: var-lib-containers-storage-overlay-79231015dd8393ee753048cdacd0bc727bddc8892a2b6bc6b9a822a34427453e-merged.mount: Deactivated successfully. Nov 23 04:58:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6a42808b202d0aa6c1bdbfb8879a20749d9466dd471e1597bedf460181e89797-userdata-shm.mount: Deactivated successfully. Nov 23 04:58:57 localhost systemd[1]: run-netns-ovnmeta\x2d903951dd\x2d448c\x2d4453\x2daa24\x2df24a53269074.mount: Deactivated successfully. Nov 23 04:58:58 localhost nova_compute[281952]: 2025-11-23 09:58:58.213 281956 DEBUG nova.network.neutron [-] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:58:58 localhost nova_compute[281952]: 2025-11-23 09:58:58.234 281956 INFO nova.compute.manager [-] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Took 1.09 seconds to deallocate network for instance.#033[00m Nov 23 04:58:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:58:58 localhost nova_compute[281952]: 2025-11-23 09:58:58.288 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m 
Nov 23 04:58:58 localhost nova_compute[281952]: 2025-11-23 09:58:58.289 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 04:58:58 localhost nova_compute[281952]: 2025-11-23 09:58:58.291 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 04:58:58 localhost nova_compute[281952]: 2025-11-23 09:58:58.336 281956 INFO nova.scheduler.client.report [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Deleted allocations for instance 8f62292f-5719-4b19-9188-3715b94493a7
Nov 23 04:58:58 localhost nova_compute[281952]: 2025-11-23 09:58:58.405 281956 DEBUG oslo_concurrency.lockutils [None req-d7c53eb9-d8ce-4a0e-a008-ee5dd6039d6e 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 04:58:58 localhost nova_compute[281952]: 2025-11-23 09:58:58.503 281956 DEBUG nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 23 04:58:59 localhost neutron_sriov_agent[256124]: 2025-11-23 09:58:59.125 2 INFO neutron.agent.securitygroups_rpc [None req-f36f5a6d-ca31-44d9-bac1-0308580f3e95 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Security group member updated ['a3350144-9b09-432b-a32e-ef84bb8bf494']
Nov 23 04:58:59 localhost nova_compute[281952]: 2025-11-23 09:58:59.222 281956 DEBUG nova.compute.manager [req-03f6b264-ed56-4c7f-84b0-c3b0cc453c17 req-c251741d-a6b5-4a35-ad85-55348ed1cc9b b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received event network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 04:58:59 localhost nova_compute[281952]: 2025-11-23 09:58:59.223 281956 DEBUG oslo_concurrency.lockutils [req-03f6b264-ed56-4c7f-84b0-c3b0cc453c17 req-c251741d-a6b5-4a35-ad85-55348ed1cc9b b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Acquiring lock "8f62292f-5719-4b19-9188-3715b94493a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 04:58:59 localhost nova_compute[281952]: 2025-11-23 09:58:59.223 281956 DEBUG oslo_concurrency.lockutils [req-03f6b264-ed56-4c7f-84b0-c3b0cc453c17 req-c251741d-a6b5-4a35-ad85-55348ed1cc9b b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 04:58:59 localhost nova_compute[281952]: 2025-11-23 09:58:59.224 281956 DEBUG oslo_concurrency.lockutils [req-03f6b264-ed56-4c7f-84b0-c3b0cc453c17 req-c251741d-a6b5-4a35-ad85-55348ed1cc9b b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] Lock "8f62292f-5719-4b19-9188-3715b94493a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 04:58:59 localhost nova_compute[281952]: 2025-11-23 09:58:59.224 281956 DEBUG nova.compute.manager [req-03f6b264-ed56-4c7f-84b0-c3b0cc453c17 req-c251741d-a6b5-4a35-ad85-55348ed1cc9b b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] No waiting events found dispatching network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 23 04:58:59 localhost nova_compute[281952]: 2025-11-23 09:58:59.225 281956 WARNING nova.compute.manager [req-03f6b264-ed56-4c7f-84b0-c3b0cc453c17 req-c251741d-a6b5-4a35-ad85-55348ed1cc9b b7661bc5cba943dea266498398ed28cc 758f3043280349e086a85b86f2668848 - - default default] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Received unexpected event network-vif-plugged-737e82a6-2634-47df-b8a7-ec21a927cc3f for instance with vm_state deleted and task_state None.
Nov 23 04:58:59 localhost dnsmasq[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/addn_hosts - 0 addresses
Nov 23 04:58:59 localhost dnsmasq-dhcp[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/host
Nov 23 04:58:59 localhost dnsmasq-dhcp[311299]: read /var/lib/neutron/dhcp/903951dd-448c-4453-aa24-f24a53269074/opts
Nov 23 04:58:59 localhost podman[312500]: 2025-11-23 09:58:59.393776298 +0000 UTC m=+0.064487159 container kill d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 04:58:59 localhost systemd[1]: tmp-crun.HTIyLm.mount: Deactivated successfully.
Nov 23 04:58:59 localhost dnsmasq[311299]: exiting on receipt of SIGTERM
Nov 23 04:58:59 localhost podman[312537]: 2025-11-23 09:58:59.83307353 +0000 UTC m=+0.069973367 container kill d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 04:58:59 localhost systemd[1]: Stopping User Manager for UID 42436...
Nov 23 04:58:59 localhost systemd[1]: libpod-d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814.scope: Deactivated successfully.
Nov 23 04:58:59 localhost systemd[312032]: Activating special unit Exit the Session...
Nov 23 04:58:59 localhost systemd[312032]: Stopped target Main User Target.
Nov 23 04:58:59 localhost systemd[312032]: Stopped target Basic System.
Nov 23 04:58:59 localhost systemd[312032]: Stopped target Paths.
Nov 23 04:58:59 localhost systemd[312032]: Stopped target Sockets.
Nov 23 04:58:59 localhost systemd[312032]: Stopped target Timers.
Nov 23 04:58:59 localhost systemd[312032]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 23 04:58:59 localhost systemd[312032]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 23 04:58:59 localhost systemd[312032]: Closed D-Bus User Message Bus Socket.
Nov 23 04:58:59 localhost systemd[312032]: Stopped Create User's Volatile Files and Directories.
Nov 23 04:58:59 localhost systemd[312032]: Removed slice User Application Slice.
Nov 23 04:58:59 localhost systemd[312032]: Reached target Shutdown.
Nov 23 04:58:59 localhost systemd[312032]: Finished Exit the Session.
Nov 23 04:58:59 localhost systemd[312032]: Reached target Exit the Session.
Nov 23 04:58:59 localhost systemd[1]: user@42436.service: Deactivated successfully.
Nov 23 04:58:59 localhost systemd[1]: Stopped User Manager for UID 42436.
Nov 23 04:58:59 localhost systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 23 04:58:59 localhost systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 23 04:58:59 localhost systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 23 04:58:59 localhost systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 23 04:58:59 localhost systemd[1]: Removed slice User Slice of UID 42436.
Nov 23 04:58:59 localhost podman[312551]: 2025-11-23 09:58:59.916065574 +0000 UTC m=+0.065657976 container died d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 04:58:59 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:59.927 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 5a66dd4e-4808-453a-ba83-842df44989df with type ""
Nov 23 04:58:59 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:59.928 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-903951dd-448c-4453-aa24-f24a53269074', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-903951dd-448c-4453-aa24-f24a53269074', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '253c88568a634476a6c1284eed6a9464', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=154cf14d-e57b-4715-bce9-5bdd1a0ded15, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ca98d0dd-231a-46c7-80b8-a48c00a5696e) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 04:58:59 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:59.929 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ca98d0dd-231a-46c7-80b8-a48c00a5696e in datapath 903951dd-448c-4453-aa24-f24a53269074 unbound from our chassis
Nov 23 04:58:59 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:59.930 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 903951dd-448c-4453-aa24-f24a53269074, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 04:58:59 localhost ovn_metadata_agent[160434]: 2025-11-23 09:58:59.931 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[0f4efafa-4807-4f80-aafd-5cc9e4866fa6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 04:58:59 localhost ovn_controller[154788]: 2025-11-23T09:58:59Z|00132|binding|INFO|Removing iface tapca98d0dd-23 ovn-installed in OVS
Nov 23 04:58:59 localhost ovn_controller[154788]: 2025-11-23T09:58:59Z|00133|binding|INFO|Removing lport ca98d0dd-231a-46c7-80b8-a48c00a5696e ovn-installed in OVS
Nov 23 04:58:59 localhost nova_compute[281952]: 2025-11-23 09:58:59.934 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:58:59 localhost nova_compute[281952]: 2025-11-23 09:58:59.938 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:58:59 localhost podman[312551]: 2025-11-23 09:58:59.944028168 +0000 UTC m=+0.093620490 container cleanup d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 04:58:59 localhost systemd[1]: libpod-conmon-d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814.scope: Deactivated successfully.
Nov 23 04:58:59 localhost openstack_network_exporter[242668]: ERROR 09:58:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:58:59 localhost openstack_network_exporter[242668]: ERROR 09:58:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 04:58:59 localhost openstack_network_exporter[242668]: ERROR 09:58:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 23 04:58:59 localhost openstack_network_exporter[242668]: ERROR 09:58:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 23 04:58:59 localhost openstack_network_exporter[242668]:
Nov 23 04:58:59 localhost openstack_network_exporter[242668]: ERROR 09:58:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 23 04:58:59 localhost openstack_network_exporter[242668]:
Nov 23 04:59:00 localhost podman[312553]: 2025-11-23 09:59:00.003825253 +0000 UTC m=+0.144643277 container remove d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-903951dd-448c-4453-aa24-f24a53269074, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 04:59:00 localhost kernel: device tapca98d0dd-23 left promiscuous mode
Nov 23 04:59:00 localhost nova_compute[281952]: 2025-11-23 09:59:00.021 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:59:00 localhost nova_compute[281952]: 2025-11-23 09:59:00.038 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:59:00 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:00.056 263258 INFO neutron.agent.dhcp.agent [None req-0b758ae4-6a63-4b17-bf24-2157967c06ff - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 04:59:00 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:00.180 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 04:59:00 localhost systemd[1]: var-lib-containers-storage-overlay-50b27a1dbab67616f24ed7868948ddd418eee50493d781bdc989237aff316f07-merged.mount: Deactivated successfully.
Nov 23 04:59:00 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d7021ea8a6afa330edb86a3f7ade61020f5b95bb8a4bbad90af1927fad336814-userdata-shm.mount: Deactivated successfully.
Nov 23 04:59:00 localhost systemd[1]: run-netns-qdhcp\x2d903951dd\x2d448c\x2d4453\x2daa24\x2df24a53269074.mount: Deactivated successfully.
Nov 23 04:59:00 localhost ovn_controller[154788]: 2025-11-23T09:59:00Z|00134|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 04:59:00 localhost nova_compute[281952]: 2025-11-23 09:59:00.442 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:59:00 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:00.644 2 INFO neutron.agent.securitygroups_rpc [None req-2e659d4a-74ef-46b7-bd3b-2baf8d6d13fe 7f7875c0084c46fdb2e7b37e4fc44faf 253c88568a634476a6c1284eed6a9464 - - default default] Security group member updated ['a3350144-9b09-432b-a32e-ef84bb8bf494']
Nov 23 04:59:00 localhost systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 23 04:59:00 localhost systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Consumed 15.918s CPU time.
Nov 23 04:59:00 localhost systemd-machined[84275]: Machine qemu-3-instance-00000009 terminated.
Nov 23 04:59:01 localhost nova_compute[281952]: 2025-11-23 09:59:01.266 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:59:01 localhost nova_compute[281952]: 2025-11-23 09:59:01.389 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:59:01 localhost nova_compute[281952]: 2025-11-23 09:59:01.521 281956 INFO nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance shutdown successfully after 24 seconds.
Nov 23 04:59:01 localhost nova_compute[281952]: 2025-11-23 09:59:01.528 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance destroyed successfully.
Nov 23 04:59:01 localhost nova_compute[281952]: 2025-11-23 09:59:01.529 281956 DEBUG nova.objects.instance [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 04:59:01 localhost nova_compute[281952]: 2025-11-23 09:59:01.601 281956 INFO nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Beginning cold snapshot process
Nov 23 04:59:01 localhost nova_compute[281952]: 2025-11-23 09:59:01.828 281956 DEBUG nova.virt.libvirt.imagebackend [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] No parent info for c5806483-57a8-4254-b41b-254b888c8606; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163
Nov 23 04:59:01 localhost nova_compute[281952]: 2025-11-23 09:59:01.873 281956 DEBUG nova.storage.rbd_utils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] creating snapshot(d341425897f7472cb10ea988db862e04) on rbd image(8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Nov 23 04:59:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 04:59:02 localhost podman[312636]: 2025-11-23 09:59:02.033243719 +0000 UTC m=+0.092476254 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 23 04:59:02 localhost podman[312636]: 2025-11-23 09:59:02.052412754 +0000 UTC m=+0.111645299 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 23 04:59:02 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 04:59:02 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:02.135 263258 INFO neutron.agent.linux.ip_lib [None req-b12e4ae5-1c19-461a-a514-e1d88c3b8373 - - - - - -] Device tapb7d31e03-f6 cannot be used as it has no MAC address
Nov 23 04:59:02 localhost nova_compute[281952]: 2025-11-23 09:59:02.153 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:59:02 localhost kernel: device tapb7d31e03-f6 entered promiscuous mode
Nov 23 04:59:02 localhost nova_compute[281952]: 2025-11-23 09:59:02.162 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:59:02 localhost systemd-udevd[312581]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 04:59:02 localhost NetworkManager[5975]: [1763891942.1640] manager: (tapb7d31e03-f6): new Generic device (/org/freedesktop/NetworkManager/Devices/27)
Nov 23 04:59:02 localhost ovn_controller[154788]: 2025-11-23T09:59:02Z|00135|binding|INFO|Claiming lport b7d31e03-f6de-49f6-a46f-b6861bfb0ba8 for this chassis.
Nov 23 04:59:02 localhost ovn_controller[154788]: 2025-11-23T09:59:02Z|00136|binding|INFO|b7d31e03-f6de-49f6-a46f-b6861bfb0ba8: Claiming unknown
Nov 23 04:59:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:02.178 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-0a868746-0c5d-4cb5-b569-e1ea427d7eaf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a868746-0c5d-4cb5-b569-e1ea427d7eaf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a693c1f03094401b2a83bfa038e2d85', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71365ad6-1587-4285-8187-e9f4a0e26a00, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b7d31e03-f6de-49f6-a46f-b6861bfb0ba8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 04:59:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:02.180 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b7d31e03-f6de-49f6-a46f-b6861bfb0ba8 in datapath 0a868746-0c5d-4cb5-b569-e1ea427d7eaf bound to our chassis
Nov 23 04:59:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:02.183 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 25c5c7de-1918-4d73-bfc0-bdb457d5d80e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 04:59:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:02.183 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a868746-0c5d-4cb5-b569-e1ea427d7eaf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 04:59:02 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:02.184 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c1d4b6e8-58f5-4439-95d5-5d3b40bfc62c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 04:59:02 localhost journal[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 04:59:02 localhost ovn_controller[154788]: 2025-11-23T09:59:02Z|00137|binding|INFO|Setting lport b7d31e03-f6de-49f6-a46f-b6861bfb0ba8 ovn-installed in OVS
Nov 23 04:59:02 localhost ovn_controller[154788]: 2025-11-23T09:59:02Z|00138|binding|INFO|Setting lport b7d31e03-f6de-49f6-a46f-b6861bfb0ba8 up in Southbound
Nov 23 04:59:02 localhost nova_compute[281952]: 2025-11-23 09:59:02.198 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:59:02 localhost nova_compute[281952]: 2025-11-23 09:59:02.199 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:59:02 localhost journal[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 04:59:02 localhost journal[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 04:59:02 localhost journal[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 04:59:02 localhost journal[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 04:59:02 localhost journal[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 04:59:02 localhost journal[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 04:59:02 localhost journal[230249]: ethtool ioctl error on tapb7d31e03-f6: No such device
Nov 23 04:59:02 localhost nova_compute[281952]: 2025-11-23 09:59:02.247 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:59:02 localhost nova_compute[281952]: 2025-11-23 09:59:02.277 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 04:59:02 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e93 e93: 6 total, 6 up, 6 in
Nov 23 04:59:02 localhost nova_compute[281952]: 2025-11-23 09:59:02.580 281956 DEBUG nova.storage.rbd_utils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] cloning vms/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk@d341425897f7472cb10ea988db862e04 to images/7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Nov 23 04:59:02 localhost nova_compute[281952]: 2025-11-23 09:59:02.767 281956 DEBUG nova.storage.rbd_utils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] flattening images/7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Nov 23 04:59:03 localhost podman[312790]:
Nov 23 04:59:03 localhost podman[312790]: 2025-11-23 09:59:03.12991842 +0000 UTC m=+0.080917341 container create 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 04:59:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 04:59:03 localhost systemd[1]: Started libpod-conmon-6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720.scope.
Nov 23 04:59:03 localhost podman[312790]: 2025-11-23 09:59:03.099001676 +0000 UTC m=+0.050000637 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 04:59:03 localhost systemd[1]: Started libcrun container.
Nov 23 04:59:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e82ddd04bc2b03081d5a523028e9859714aedc10be59a8137ed30fe358e9a28/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 04:59:03 localhost podman[312790]: 2025-11-23 09:59:03.218144413 +0000 UTC m=+0.169143334 container init 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 04:59:03 localhost podman[312790]: 2025-11-23 09:59:03.227132288 +0000 UTC m=+0.178131199 container start 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 04:59:03 localhost dnsmasq[312818]: started, version 2.85 cachesize 150
Nov 23 04:59:03 localhost dnsmasq[312818]: DNS service limited to local subnets
Nov 23 04:59:03 localhost dnsmasq[312818]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 04:59:03 localhost dnsmasq[312818]: warning: no upstream servers configured
Nov 23 04:59:03 localhost dnsmasq-dhcp[312818]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 04:59:03 localhost dnsmasq[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/addn_hosts - 0 addresses
Nov 23 04:59:03 localhost dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/host
Nov 23 04:59:03 localhost dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/opts
Nov 23 04:59:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 04:59:03 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:03.312 263258 INFO neutron.agent.dhcp.agent [None req-f761acdd-2a00-45e0-a2f9-c6109c814a22 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:01Z, description=, device_id=8074bdc2-a02c-4214-ad81-7c41b83201d1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a3dd1070-8964-4625-913d-1ee38ef5dacb, ip_allocation=immediate, mac_address=fa:16:3e:8b:7e:66, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:58:59Z, description=, dns_domain=, id=0a868746-0c5d-4cb5-b569-e1ea427d7eaf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-789670611, port_security_enabled=True, project_id=2a693c1f03094401b2a83bfa038e2d85, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28566, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=799, status=ACTIVE, subnets=['5b3a211b-df16-45c2-b823-d3f707dcbf68'], tags=[], tenant_id=2a693c1f03094401b2a83bfa038e2d85, updated_at=2025-11-23T09:59:00Z, vlan_transparent=None, network_id=0a868746-0c5d-4cb5-b569-e1ea427d7eaf, port_security_enabled=False, project_id=2a693c1f03094401b2a83bfa038e2d85, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=821, status=DOWN, tags=[], tenant_id=2a693c1f03094401b2a83bfa038e2d85, updated_at=2025-11-23T09:59:01Z on network 0a868746-0c5d-4cb5-b569-e1ea427d7eaf
Nov 23 04:59:03 localhost podman[312803]: 2025-11-23 09:59:03.319034713 +0000 UTC m=+0.137815268 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Nov 23 04:59:03 localhost podman[312803]: 2025-11-23 09:59:03.36148976 +0000 UTC m=+0.180270345 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 23 04:59:03 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 04:59:03 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:03.392 263258 INFO neutron.agent.dhcp.agent [None req-ba197019-39c6-4712-9ab1-e6e9bcda8962 - - - - - -] DHCP configuration for ports {'8b5b0b62-f6a4-474d-8fa1-6c8b82d9241f'} is completed#033[00m Nov 23 04:59:03 localhost dnsmasq[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/addn_hosts - 1 addresses Nov 23 04:59:03 localhost dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/host Nov 23 04:59:03 localhost dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/opts Nov 23 04:59:03 localhost podman[312846]: 2025-11-23 09:59:03.549325935 +0000 UTC m=+0.067727480 container kill 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 23 04:59:03 localhost nova_compute[281952]: 2025-11-23 09:59:03.672 281956 DEBUG nova.storage.rbd_utils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] removing snapshot(d341425897f7472cb10ea988db862e04) on rbd image(8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m Nov 23 04:59:03 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:03.715 263258 INFO neutron.agent.dhcp.agent [None req-2202ad04-66a5-45ae-92b7-eb2ecb293a2c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], 
binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:01Z, description=, device_id=8074bdc2-a02c-4214-ad81-7c41b83201d1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a3dd1070-8964-4625-913d-1ee38ef5dacb, ip_allocation=immediate, mac_address=fa:16:3e:8b:7e:66, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:58:59Z, description=, dns_domain=, id=0a868746-0c5d-4cb5-b569-e1ea427d7eaf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-789670611, port_security_enabled=True, project_id=2a693c1f03094401b2a83bfa038e2d85, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28566, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=799, status=ACTIVE, subnets=['5b3a211b-df16-45c2-b823-d3f707dcbf68'], tags=[], tenant_id=2a693c1f03094401b2a83bfa038e2d85, updated_at=2025-11-23T09:59:00Z, vlan_transparent=None, network_id=0a868746-0c5d-4cb5-b569-e1ea427d7eaf, port_security_enabled=False, project_id=2a693c1f03094401b2a83bfa038e2d85, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=821, status=DOWN, tags=[], tenant_id=2a693c1f03094401b2a83bfa038e2d85, updated_at=2025-11-23T09:59:01Z on network 0a868746-0c5d-4cb5-b569-e1ea427d7eaf#033[00m Nov 23 04:59:03 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:03.833 263258 INFO neutron.agent.dhcp.agent [None req-dd0ce42d-cbc2-4637-a8fc-03534d4f8634 - - - - - -] DHCP configuration for ports {'a3dd1070-8964-4625-913d-1ee38ef5dacb'} is completed#033[00m Nov 23 04:59:03 localhost dnsmasq[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/addn_hosts - 1 addresses Nov 
23 04:59:03 localhost dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/host Nov 23 04:59:03 localhost dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/opts Nov 23 04:59:03 localhost podman[312904]: 2025-11-23 09:59:03.936767453 +0000 UTC m=+0.059503818 container kill 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 04:59:03 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:03.946 2 INFO neutron.agent.securitygroups_rpc [None req-9d56fe05-4ef8-4d55-837b-9cee7fc5dad7 4b677b000abe4b0687ff1afcd1016893 2a693c1f03094401b2a83bfa038e2d85 - - default default] Security group member updated ['e11e3507-78f9-4b55-80fe-2aa7bb5d486d']#033[00m Nov 23 04:59:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:04.173 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:03Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f2b26721-5c09-4af3-8bc2-da22f2612b11, ip_allocation=immediate, mac_address=fa:16:3e:54:fd:3f, name=tempest-FloatingIPNegativeTestJSON-1148406642, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:58:59Z, description=, dns_domain=, 
id=0a868746-0c5d-4cb5-b569-e1ea427d7eaf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-789670611, port_security_enabled=True, project_id=2a693c1f03094401b2a83bfa038e2d85, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28566, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=799, status=ACTIVE, subnets=['5b3a211b-df16-45c2-b823-d3f707dcbf68'], tags=[], tenant_id=2a693c1f03094401b2a83bfa038e2d85, updated_at=2025-11-23T09:59:00Z, vlan_transparent=None, network_id=0a868746-0c5d-4cb5-b569-e1ea427d7eaf, port_security_enabled=True, project_id=2a693c1f03094401b2a83bfa038e2d85, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['e11e3507-78f9-4b55-80fe-2aa7bb5d486d'], standard_attr_id=832, status=DOWN, tags=[], tenant_id=2a693c1f03094401b2a83bfa038e2d85, updated_at=2025-11-23T09:59:03Z on network 0a868746-0c5d-4cb5-b569-e1ea427d7eaf#033[00m Nov 23 04:59:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:04.194 263258 INFO neutron.agent.dhcp.agent [None req-89bdf3a4-76b0-403c-a197-85b4954cb932 - - - - - -] DHCP configuration for ports {'a3dd1070-8964-4625-913d-1ee38ef5dacb'} is completed#033[00m Nov 23 04:59:04 localhost ovn_controller[154788]: 2025-11-23T09:59:04Z|00139|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:59:04 localhost nova_compute[281952]: 2025-11-23 09:59:04.327 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:04 localhost dnsmasq[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/addn_hosts - 2 addresses Nov 23 04:59:04 localhost dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/host Nov 23 04:59:04 localhost 
podman[312941]: 2025-11-23 09:59:04.41065974 +0000 UTC m=+0.061223440 container kill 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:59:04 localhost dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/opts Nov 23 04:59:04 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e94 e94: 6 total, 6 up, 6 in Nov 23 04:59:04 localhost nova_compute[281952]: 2025-11-23 09:59:04.607 281956 DEBUG nova.storage.rbd_utils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] creating snapshot(snap) on rbd image(7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m Nov 23 04:59:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:04.691 263258 INFO neutron.agent.dhcp.agent [None req-42b58f33-73e3-4e72-b723-d8658e402274 - - - - - -] DHCP configuration for ports {'f2b26721-5c09-4af3-8bc2-da22f2612b11'} is completed#033[00m Nov 23 04:59:05 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e95 e95: 6 total, 6 up, 6 in Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.268 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.323 281956 INFO nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 
55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Snapshot image upload complete#033[00m Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.323 281956 DEBUG nova.compute.manager [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.384 281956 INFO nova.compute.manager [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Shelve offloading#033[00m Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.390 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.394 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance destroyed successfully.#033[00m Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.395 281956 DEBUG nova.compute.manager [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.397 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" 
lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.398 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquired lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.398 281956 DEBUG nova.network.neutron [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.470 281956 DEBUG nova.network.neutron [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance cache missing network info. 
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.733 281956 DEBUG nova.network.neutron [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.747 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Releasing lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.755 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance destroyed successfully.#033[00m Nov 23 04:59:06 localhost nova_compute[281952]: 2025-11-23 09:59:06.756 281956 DEBUG nova.objects.instance [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lazy-loading 'resources' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:59:07 localhost nova_compute[281952]: 2025-11-23 09:59:07.449 281956 INFO nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Deleting instance files /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_del#033[00m Nov 23 04:59:07 localhost nova_compute[281952]: 2025-11-23 
09:59:07.450 281956 INFO nova.virt.libvirt.driver [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Deletion of /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_del complete#033[00m Nov 23 04:59:07 localhost nova_compute[281952]: 2025-11-23 09:59:07.560 281956 INFO nova.scheduler.client.report [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Deleted allocations for instance 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7#033[00m Nov 23 04:59:07 localhost nova_compute[281952]: 2025-11-23 09:59:07.606 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:59:07 localhost nova_compute[281952]: 2025-11-23 09:59:07.607 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:59:07 localhost nova_compute[281952]: 2025-11-23 09:59:07.649 281956 DEBUG oslo_concurrency.processutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:59:08 localhost ceph-mon[300199]: 
mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:59:08 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/273386736' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:59:08 localhost nova_compute[281952]: 2025-11-23 09:59:08.111 281956 DEBUG oslo_concurrency.processutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:59:08 localhost nova_compute[281952]: 2025-11-23 09:59:08.120 281956 DEBUG nova.compute.provider_tree [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:59:08 localhost nova_compute[281952]: 2025-11-23 09:59:08.134 281956 DEBUG nova.scheduler.client.report [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:59:08 localhost nova_compute[281952]: 
2025-11-23 09:59:08.153 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.546s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:59:08 localhost nova_compute[281952]: 2025-11-23 09:59:08.213 281956 DEBUG oslo_concurrency.lockutils [None req-e4431909-d51a-4da7-b094-cab979c892d2 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" "released" by "nova.compute.manager.ComputeManager.shelve_instance..do_shelve_instance" :: held 30.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:59:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:59:08 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:08.353 2 INFO neutron.agent.securitygroups_rpc [None req-0bae4724-8fa8-4216-8cb0-34bcdfbbc61a 4b677b000abe4b0687ff1afcd1016893 2a693c1f03094401b2a83bfa038e2d85 - - default default] Security group member updated ['e11e3507-78f9-4b55-80fe-2aa7bb5d486d']#033[00m Nov 23 04:59:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e96 e96: 6 total, 6 up, 6 in Nov 23 04:59:08 localhost nova_compute[281952]: 2025-11-23 09:59:08.551 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Acquiring lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" by "nova.compute.manager.ComputeManager.unshelve_instance..do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:59:08 localhost nova_compute[281952]: 
2025-11-23 09:59:08.551 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" acquired by "nova.compute.manager.ComputeManager.unshelve_instance..do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:59:08 localhost nova_compute[281952]: 2025-11-23 09:59:08.552 281956 INFO nova.compute.manager [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Unshelving#033[00m Nov 23 04:59:08 localhost dnsmasq[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/addn_hosts - 1 addresses Nov 23 04:59:08 localhost podman[313038]: 2025-11-23 09:59:08.595612704 +0000 UTC m=+0.066751969 container kill 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 04:59:08 localhost dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/host Nov 23 04:59:08 localhost dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/opts Nov 23 04:59:08 localhost nova_compute[281952]: 2025-11-23 09:59:08.642 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - 
default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:59:08 localhost nova_compute[281952]: 2025-11-23 09:59:08.643 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:59:08 localhost nova_compute[281952]: 2025-11-23 09:59:08.646 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'pci_requests' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:59:08 localhost nova_compute[281952]: 2025-11-23 09:59:08.661 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'numa_topology' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:59:08 localhost nova_compute[281952]: 2025-11-23 09:59:08.678 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Require both a host and instance NUMA topology to fit instance on host. 
numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m Nov 23 04:59:08 localhost nova_compute[281952]: 2025-11-23 09:59:08.679 281956 INFO nova.compute.claims [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Claim successful on node np0005532585.localdomain#033[00m Nov 23 04:59:08 localhost nova_compute[281952]: 2025-11-23 09:59:08.812 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:59:09 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:59:09 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/3173906222' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:59:09 localhost ovn_controller[154788]: 2025-11-23T09:59:09Z|00140|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:59:09 localhost nova_compute[281952]: 2025-11-23 09:59:09.245 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:59:09 localhost nova_compute[281952]: 2025-11-23 09:59:09.252 281956 DEBUG nova.compute.provider_tree [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:59:09 localhost nova_compute[281952]: 2025-11-23 09:59:09.268 281956 DEBUG nova.scheduler.client.report [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:59:09 localhost nova_compute[281952]: 2025-11-23 
09:59:09.287 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:09 localhost nova_compute[281952]: 2025-11-23 09:59:09.291 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:59:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:09.298 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:59:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:09.299 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:59:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:09.299 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:59:09 localhost dnsmasq[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/addn_hosts - 0 addresses Nov 23 04:59:09 localhost dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/host Nov 23 04:59:09 localhost dnsmasq-dhcp[312818]: read /var/lib/neutron/dhcp/0a868746-0c5d-4cb5-b569-e1ea427d7eaf/opts Nov 23 04:59:09 localhost 
podman[313098]: 2025-11-23 09:59:09.403012853 +0000 UTC m=+0.057490166 container kill 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 04:59:09 localhost nova_compute[281952]: 2025-11-23 09:59:09.540 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Acquiring lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:59:09 localhost nova_compute[281952]: 2025-11-23 09:59:09.541 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Acquired lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:59:09 localhost nova_compute[281952]: 2025-11-23 09:59:09.541 281956 DEBUG nova.network.neutron [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 23 04:59:09 localhost nova_compute[281952]: 2025-11-23 09:59:09.577 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:09 localhost ovn_controller[154788]: 2025-11-23T09:59:09Z|00141|binding|INFO|Releasing lport b7d31e03-f6de-49f6-a46f-b6861bfb0ba8 from this chassis (sb_readonly=0) Nov 23 04:59:09 localhost ovn_controller[154788]: 2025-11-23T09:59:09Z|00142|binding|INFO|Setting lport b7d31e03-f6de-49f6-a46f-b6861bfb0ba8 down in Southbound Nov 23 04:59:09 localhost kernel: device tapb7d31e03-f6 left promiscuous mode Nov 23 04:59:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:09.587 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-0a868746-0c5d-4cb5-b569-e1ea427d7eaf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a868746-0c5d-4cb5-b569-e1ea427d7eaf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a693c1f03094401b2a83bfa038e2d85', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71365ad6-1587-4285-8187-e9f4a0e26a00, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b7d31e03-f6de-49f6-a46f-b6861bfb0ba8) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:59:09 localhost 
ovn_metadata_agent[160434]: 2025-11-23 09:59:09.589 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b7d31e03-f6de-49f6-a46f-b6861bfb0ba8 in datapath 0a868746-0c5d-4cb5-b569-e1ea427d7eaf unbound from our chassis#033[00m Nov 23 04:59:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:09.592 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a868746-0c5d-4cb5-b569-e1ea427d7eaf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 04:59:09 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:09.593 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[79823d54-92c3-4a36-ab3c-46fb31902cf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:59:09 localhost nova_compute[281952]: 2025-11-23 09:59:09.602 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:09 localhost nova_compute[281952]: 2025-11-23 09:59:09.618 281956 DEBUG nova.network.neutron [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance cache missing network info. 
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Nov 23 04:59:09 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e97 e97: 6 total, 6 up, 6 in Nov 23 04:59:10 localhost nova_compute[281952]: 2025-11-23 09:59:10.177 281956 DEBUG nova.network.neutron [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:59:10 localhost nova_compute[281952]: 2025-11-23 09:59:10.192 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Releasing lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:59:10 localhost nova_compute[281952]: 2025-11-23 09:59:10.194 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m Nov 23 04:59:10 localhost nova_compute[281952]: 2025-11-23 09:59:10.195 281956 INFO nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Creating image(s)#033[00m Nov 23 04:59:10 localhost nova_compute[281952]: 2025-11-23 09:59:10.232 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default 
default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 23 04:59:10 localhost nova_compute[281952]: 2025-11-23 09:59:10.237 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:59:10 localhost nova_compute[281952]: 2025-11-23 09:59:10.289 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 23 04:59:10 localhost nova_compute[281952]: 2025-11-23 09:59:10.330 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 23 04:59:10 localhost nova_compute[281952]: 2025-11-23 09:59:10.336 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Acquiring lock "d1ec78b5f1b07f2e087c6afff3b972c481121dc6" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:59:10 localhost nova_compute[281952]: 2025-11-23 09:59:10.338 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 
e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "d1ec78b5f1b07f2e087c6afff3b972c481121dc6" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:59:10 localhost nova_compute[281952]: 2025-11-23 09:59:10.390 281956 DEBUG nova.virt.libvirt.imagebackend [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Image locations are: [{'url': 'rbd://46550e70-79cb-5f55-bf6d-1204b97e083b/images/7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://46550e70-79cb-5f55-bf6d-1204b97e083b/images/7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m Nov 23 04:59:10 localhost nova_compute[281952]: 2025-11-23 09:59:10.477 281956 DEBUG nova.virt.libvirt.imagebackend [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Selected location: {'url': 'rbd://46550e70-79cb-5f55-bf6d-1204b97e083b/images/7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m Nov 23 04:59:10 localhost nova_compute[281952]: 2025-11-23 09:59:10.477 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] cloning images/7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb@snap to None/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m Nov 23 04:59:10 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e98 e98: 6 total, 6 up, 6 in Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:59:10.808 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.809 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.814 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0d5a967a-7bc8-43fd-928e-f2f11f2c1621', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.809344', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f027764-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': '4c3903669b08a5e6e74441ca00461a2b4d4080a6ad00ff24e8048adf1ddb409c'}]}, 'timestamp': '2025-11-23 09:59:10.815120', '_unique_id': '9228642f5fff473c96ababa3a3b718bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:59:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.816 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.818 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 04:59:10 localhost nova_compute[281952]: 2025-11-23 09:59:10.820 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "d1ec78b5f1b07f2e087c6afff3b972c481121dc6" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.482s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.851 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.852 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '73002d66-fb2f-44b1-957a-ca981c4773b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.818306', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f083bc2-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '007384d2f4824aede4156938503e74e6763deb3cad7e4bf2800e75912e3be00e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.818306', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f085170-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': 'e44f70d648f3fe59b7e609300be55e27241bbf51372b8735a012e9f4751abe55'}]}, 'timestamp': '2025-11-23 09:59:10.853179', '_unique_id': '61272adc7f6744fda9c0b7ce042812fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.855 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.857 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.857 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '960bfdfb-fb3f-40f3-8330-fed6ee8b3533', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.857699', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f092ad2-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': 'fbbf1b18b5ee9ce2e5710acd8f3f5b58d2ccec53a758661c920f1842b27b4897'}]}, 'timestamp': '2025-11-23 09:59:10.859122', '_unique_id': 'c68026bb9ccd4e73a396823941fedef1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:59:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.860 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.862 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.862 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.863 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88a6350d-3028-42b5-aca3-cfaa08dd8420', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.862800', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f09e6ca-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': 'f173d672340e88dd3904e8ad794a4c398ef9fa71542b0c9900163191192321ac'}]}, 'timestamp': '2025-11-23 09:59:10.863576', '_unique_id': '21faabe0b62f4f44afba4765778a8fa1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.864 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.866 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.867 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.867 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.867 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee846f90-3755-4d18-907c-b77e73ab670b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.867230', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f0a88fa-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': 'abec897939af0d768316f63b73b4b2c0fdf1299ebc135961ce7420d03b2b5a27'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.867230', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f0a9b1a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '59f6df8d40d90ddccf35503f84b93df0e8c0a20be850f76689808d5528bb7b59'}]}, 'timestamp': '2025-11-23 09:59:10.868185', '_unique_id': '61d5fba8cf0142fe885c0b837174ef9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.869 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.871 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.871 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9010612-b30a-4cde-9ff8-2e9784892680', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.871719', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f0b39a8-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': '4684e771ffdde6eff2e073f594d235701ce0d28b909111aaf8c6dcc9947a9c68'}]}, 'timestamp': '2025-11-23 09:59:10.872361', '_unique_id': 'fce7870049c14db18f12e688d6aaf673'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.873 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.875 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.875 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dba06eda-2118-4997-a7d3-d145d335d797', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.875265', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f0bc0c6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': '0754996772b7d133605cf1c00ed7b64ad783b2b2c41cbc2334cb87499cc91ebd'}]}, 'timestamp': '2025-11-23 09:59:10.875612', '_unique_id': '6189c6a91826422fa5f969fef20f5953'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '2b21eaff-9513-4faa-9a75-2929f1fa6b3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.877025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f0c039c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '52ebe5008ff29e360310d72888c536bedbaecbb06db91516dc257217039a63a2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.877025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f0c0edc-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '78a67f829c6f36702d8aa71b8e8efe1737a9fe3909efd30cec802c790b6b0f95'}]}, 'timestamp': '2025-11-23 09:59:10.877590', '_unique_id': 'de1ba1d27b254b7b891d91dbf5cb2234'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.879 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fa968336-cc32-4d32-bb65-089a63f6cb4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.879488', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f0c6440-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': 'e66528cf7217ddb64e14de845b83cdb24e00cb7cfbebe2db00cc64af4b8c5494'}]}, 'timestamp': '2025-11-23 09:59:10.879798', '_unique_id': '0dd49377b76040169dc7b38df6143457'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:59:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.880 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.881 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.892 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.893 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6d84d20-6d20-4691-b09b-39147c8b1cec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.881733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f0e6092-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.059361723, 'message_signature': '2f6ee57967d62b6e1b5e7ed73b3d333818ed02030d8c7beaaa851ebdfa7d8dd4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.881733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f0e751e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.059361723, 'message_signature': '5788e29f1702db7134227b417e3f5023c6910559d3ffc91f950be0e43ef8f573'}]}, 'timestamp': '2025-11-23 09:59:10.893333', '_unique_id': 'c7ae7e2e2e2846aea19be6cb8f74b8b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.894 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.895 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe63f4ca-ca83-429e-bee3-0e81d830dab4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.895434', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f0ed45a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': 'f1950e038edfe449da496ec7cafd709ba50396d4c263ede93dd10411c31ef4ea'}]}, 'timestamp': '2025-11-23 09:59:10.895782', '_unique_id': 'e535fe8b0ec54cdba6ec508437981886'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51e28022-709a-4a85-9dc2-ddd652e61e63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.897448', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f0f227a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '36f4275bc057a52a9888c2ca3c45e4ce065e1a7feb7ad30c700da5c1b3efca19'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.897448', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f0f2ffe-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '3c9b86e8430bd35b24eea4504cd5723fcfcc628c99e878a6a084ebf6a111334a'}]}, 'timestamp': '2025-11-23 09:59:10.898160', '_unique_id': 'a7bf4cdb7edc49faaab610089ba150da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.899 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.900 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '36ad4d64-4288-417b-9234-bfcc2ca57a43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.900332', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f0f923c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': '7b8e2f0e2f8b1b3c3b89eb624679d468bcfdbaa76f2dd73cf57a2b27a3d91103'}]}, 'timestamp': '2025-11-23 09:59:10.900640', '_unique_id': '33501276c2624320a8f879dbba250b7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:59:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.901 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '339f231f-756d-4bce-b640-b9b46109912c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.902012', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f0fd440-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.059361723, 'message_signature': '81519590db75c10b3e1ad8e4bbf8799db4b6e7518b40af49af1f9dc2acefd629'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.902012', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f0fde54-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.059361723, 'message_signature': 'c21ef9526d8b7a753dd7cd5903fa4f1e90d18e4b1cd5bf984d1e3f84f8be5284'}]}, 'timestamp': '2025-11-23 09:59:10.902560', '_unique_id': '6c172542001742ec954a108a10b945a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.904 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a9d80885-1bf6-4459-b128-03ffedb978b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.903885', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f101d56-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.059361723, 'message_signature': 'b08a00c7477358e862aee5045f4548d884661ba464b9dfc6f1632a89086d738b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.903885', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f102724-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.059361723, 'message_signature': 'b50b58605d2a71106b76f1f9c0325c41e9136a76af372572a6fda630a473ec71'}]}, 'timestamp': '2025-11-23 09:59:10.904424', '_unique_id': '6fa15a57a82e46faa6fdf9613c939acf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:59:10.905 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
09:59:10.905 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.905 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.922 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7d4826b5-53a4-44f4-bc09-f4c7bb9c8ef7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:59:10.905738', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '0f12e860-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.099510459, 'message_signature': 'cecee09bda2d392fe42f55850a793c70a210699641fa1f414ba3f6f1bdba2b39'}]}, 'timestamp': '2025-11-23 09:59:10.922550', '_unique_id': '2f3d0f88a5bd44f08eb9913787993c9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 
04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.923 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.924 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 15950000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f6734b7-f3df-4c28-adab-66ba7d41d35a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15950000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T09:59:10.924402', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '0f133e64-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11894.099510459, 'message_signature': 'bf52a1e27c46778318090bf594f26bbeceedf3d0bff631fd0f6c7b485b04e06e'}]}, 'timestamp': '2025-11-23 09:59:10.924748', '_unique_id': 'a0773c5e33ca4d36a45b3ac13127185a'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.925 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.926 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f080e37-3358-423e-aa62-5b60683c8169', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.926332', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f1389be-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': '441bb17b322ab83fae1454090dcb0102ceb5f7b303ece49dd90fa80cc821e390'}]}, 'timestamp': '2025-11-23 09:59:10.926634', '_unique_id': '6875085e12ba4e14a68ab778f5584fde'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.927 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18dcce94-e273-4497-8bcb-303751d4de9a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T09:59:10.929014', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '0f1401fa-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.986974522, 'message_signature': 'af584afff2a4adb748122b5dc4c82336a3b75970bcdc7cfcac660672608eb479'}]}, 'timestamp': '2025-11-23 09:59:10.929736', '_unique_id': '5b004c0248b04a6499685b7b532ae9a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.930 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b9e6183-a52c-4c42-9df1-a5b5b39ee686', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.931505', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f14539e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': 'bfbfce4be5875210f9191ad9539e0b41c5778cb71c5e14505fc553a03f00367e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.931505', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f1464e2-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': 'c726d5885349747b7e0247e384233017662fa605b1c50771a5a2b327a847ed52'}]}, 'timestamp': '2025-11-23 09:59:10.932319', '_unique_id': '401b1a0067234ce58c2193cb3e909702'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.932 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.933 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1fa5640d-601a-4478-8775-862e1c42158a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T09:59:10.934082', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0f14b834-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '9005cfa4c5d4893cb3a542d4553f45f4b2ac95392583e40381baf85b98a9bce0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T09:59:10.934082', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0f14c216-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 11893.995947659, 'message_signature': '7fb417122aa73d3196daf0f3f1ab32ad9b09d73638b5ab58b4f7b394e4ead79b'}]}, 'timestamp': '2025-11-23 09:59:10.934606', '_unique_id': 'fc84eb5393ac4687bc67f29e453096c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging yield Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 04:59:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 04:59:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 09:59:10.935 12 ERROR oslo_messaging.notify.messaging Nov 23 04:59:11 localhost systemd[1]: tmp-crun.JgpR9K.mount: Deactivated successfully. 
Nov 23 04:59:11 localhost podman[313298]: 2025-11-23 09:59:11.063223671 +0000 UTC m=+0.073532935 container kill 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:59:11 localhost dnsmasq[312818]: exiting on receipt of SIGTERM Nov 23 04:59:11 localhost systemd[1]: libpod-6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720.scope: Deactivated successfully. Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.111 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'migration_context' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:59:11 localhost podman[313331]: 2025-11-23 09:59:11.135637292 +0000 UTC m=+0.054237952 container died 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 04:59:11 localhost systemd[1]: 
tmp-crun.8OeMIG.mount: Deactivated successfully. Nov 23 04:59:11 localhost podman[313331]: 2025-11-23 09:59:11.161776177 +0000 UTC m=+0.080376817 container cleanup 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 04:59:11 localhost systemd[1]: libpod-conmon-6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720.scope: Deactivated successfully. Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.206 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] flattening vms/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m Nov 23 04:59:11 localhost podman[313332]: 2025-11-23 09:59:11.216341838 +0000 UTC m=+0.127307223 container remove 6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a868746-0c5d-4cb5-b569-e1ea427d7eaf, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.261 281956 DEBUG 
oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.262 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.288 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.367 281956 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.368 281956 INFO nova.compute.manager [-] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] VM Stopped (Lifecycle Event)#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.391 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.402 281956 DEBUG nova.compute.manager [None req-6ed1d632-97b7-40f4-819e-fe3170f6833f - - - - - -] [instance: 8f62292f-5719-4b19-9188-3715b94493a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:59:11 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:11.574 263258 INFO neutron.agent.dhcp.agent [None req-019f614a-0716-4c02-9a77-9686dcef5e01 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:59:11 localhost 
neutron_dhcp_agent[263254]: 2025-11-23 09:59:11.575 263258 INFO neutron.agent.dhcp.agent [None req-019f614a-0716-4c02-9a77-9686dcef5e01 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:59:11 localhost podman[240668]: time="2025-11-23T09:59:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:59:11 localhost podman[240668]: @ - - [23/Nov/2025:09:59:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1" Nov 23 04:59:11 localhost podman[240668]: @ - - [23/Nov/2025:09:59:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19234 "" "Go-http-client/1.1" Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.987 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Image rbd:vms/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. 
_try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.988 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.988 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Ensure instance console log exists: /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.988 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.989 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.989 281956 DEBUG oslo_concurrency.lockutils [None 
req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.991 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2025-11-23T09:58:37Z,direct_url=,disk_format='raw',id=7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-2005076685-shelved',owner='37a58b702f564a81ab5a59cf4201b4f0',properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=2025-11-23T09:59:06Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'encrypted': False, 'encryption_options': None, 'image_id': 'c5806483-57a8-4254-b41b-254b888c8606'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.995 281956 WARNING nova.virt.libvirt.driver [None 
req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.997 281956 DEBUG nova.virt.libvirt.host [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Searching host: 'np0005532585.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.997 281956 DEBUG nova.virt.libvirt.host [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.999 281956 DEBUG nova.virt.libvirt.host [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Searching host: 'np0005532585.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Nov 23 04:59:11 localhost nova_compute[281952]: 2025-11-23 09:59:11.999 281956 DEBUG nova.virt.libvirt.host [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CPU controller found on host. 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.000 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.000 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-23T09:56:44Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='43b374b4-75d9-47f9-aa6b-ddb1a45f7c04',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2025-11-23T09:58:37Z,direct_url=,disk_format='raw',id=7ff19ec8-b33a-4365-b738-cfaa2a3c2aeb,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-2005076685-shelved',owner='37a58b702f564a81ab5a59cf4201b4f0',properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=2025-11-23T09:59:06Z,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.000 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints 
/usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.001 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.001 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.001 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.001 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.002 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies 
/usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.002 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.002 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.003 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.003 281956 DEBUG nova.virt.hardware [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.003 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.016 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:59:12 localhost systemd[1]: var-lib-containers-storage-overlay-2e82ddd04bc2b03081d5a523028e9859714aedc10be59a8137ed30fe358e9a28-merged.mount: Deactivated successfully. Nov 23 04:59:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b0808485e0aa15477818fd4cafa6411b16594e1b425885803b79d013a3b5720-userdata-shm.mount: Deactivated successfully. Nov 23 04:59:12 localhost systemd[1]: run-netns-qdhcp\x2d0a868746\x2d0c5d\x2d4cb5\x2db569\x2de1ea427d7eaf.mount: Deactivated successfully. 
Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.216 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 04:59:12 localhost sshd[313412]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.316 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.317 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.318 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.318 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:59:12 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:12.398 2 INFO neutron.agent.securitygroups_rpc [req-a16da276-a12d-4c8d-9117-64c33f913ca9 req-9efe9caa-9e38-4b50-8b4e-539fa928addc 492e2909a77a4032ab6c29a26d12fb14 0497de4959b2494e8036eb39226430d6 - - default default] Security group member updated ['2da1104f-77c5-475e-b21f-e52710edc8b5']#033[00m Nov 23 04:59:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:12.420 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:59:12 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e99 e99: 6 total, 6 up, 6 in Nov 23 04:59:12 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 23 04:59:12 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/3540974265' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.601 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.585s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.636 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.645 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:59:12 localhost dnsmasq[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/addn_hosts - 1 addresses Nov 23 04:59:12 localhost podman[313449]: 2025-11-23 09:59:12.7063986 +0000 UTC m=+0.062795445 container kill 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 04:59:12 localhost dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/host Nov 23 04:59:12 localhost dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/opts Nov 23 04:59:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:59:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:59:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:59:12 localhost systemd[1]: tmp-crun.2v4RVR.mount: Deactivated successfully. Nov 23 04:59:12 localhost ovn_controller[154788]: 2025-11-23T09:59:12Z|00143|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.807 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:12 localhost podman[313466]: 2025-11-23 09:59:12.81999679 +0000 UTC m=+0.087305491 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, distribution-scope=public, release=1755695350, config_id=edpm, com.redhat.component=ubi9-minimal-container, config_data={'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41) Nov 23 04:59:12 localhost podman[313465]: 2025-11-23 09:59:12.830880105 +0000 UTC m=+0.099320071 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent) Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.834 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 
04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.851 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:59:12 localhost nova_compute[281952]: 2025-11-23 09:59:12.853 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 04:59:12 localhost podman[313465]: 2025-11-23 09:59:12.864215282 +0000 UTC m=+0.132655288 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Nov 23 04:59:12 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:59:12 localhost podman[313464]: 2025-11-23 09:59:12.875580302 +0000 UTC m=+0.144635767 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 04:59:12 localhost podman[313466]: 2025-11-23 09:59:12.903813882 +0000 UTC m=+0.171122583 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=edpm, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible) Nov 23 04:59:12 localhost podman[313464]: 2025-11-23 09:59:12.911476128 +0000 UTC m=+0.180531593 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, 
org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:59:12 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 04:59:12 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.097 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.100 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.126 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] End _get_guest_xml xml= [guest domain XML elided: markup tags were stripped in capture, leaving only text nodes interleaved with repeated syslog prefixes; recoverable fields: uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7, name instance-00000009, memory 131072, 1 vcpu, nova display name tempest-UnshelveToHostMultiNodesTest-server-2005076685, created 2025-11-23 09:59:11, flavor 128 MB / 1 vcpu / 0 swap / 0 ephemeral / 1 GB root, owner user tempest-UnshelveToHostMultiNodesTest-612486733-project-member, project tempest-UnshelveToHostMultiNodesTest-612486733, sysinfo RDO OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9, os type hvm, rng backend /dev/urandom] _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.181 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] No BDM found with device name vda, not building metadata.
_build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.182 281956 DEBUG nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.183 281956 INFO nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Using config drive#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.221 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.228 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:59:13 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.243 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 
8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.288 281956 DEBUG nova.objects.instance [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lazy-loading 'keypairs' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.366 281956 INFO nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Creating config drive at /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.378 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkac12wxz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.509 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V 
config-2 /tmp/tmpkac12wxz" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.554 281956 DEBUG nova.storage.rbd_utils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] rbd image 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.559 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.779 281956 DEBUG oslo_concurrency.processutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.220s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:59:13 localhost nova_compute[281952]: 2025-11-23 09:59:13.781 281956 INFO nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Deleting local config drive 
/var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7/disk.config because it was imported into RBD.#033[00m Nov 23 04:59:13 localhost systemd-machined[84275]: New machine qemu-5-instance-00000009. Nov 23 04:59:13 localhost systemd[1]: Started Virtual Machine qemu-5-instance-00000009. Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.209 281956 DEBUG nova.virt.libvirt.host [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Removed pending event for 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.210 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.211 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] VM Resumed (Lifecycle Event)#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.215 281956 DEBUG nova.compute.manager [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.216 281956 DEBUG 
nova.virt.libvirt.driver [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.217 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.217 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.222 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance spawned successfully.#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.241 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.246 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 23 04:59:14 localhost 
nova_compute[281952]: 2025-11-23 09:59:14.278 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.279 281956 DEBUG nova.virt.driver [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.279 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] VM Started (Lifecycle Event)#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.298 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.302 281956 DEBUG nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.320 281956 INFO nova.compute.manager [None req-14e7f832-bcaa-4cf1-8dc5-2dea6f56fed5 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] During sync_power_state the instance has a pending task (spawning). 
Skip.#033[00m Nov 23 04:59:14 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e100 e100: 6 total, 6 up, 6 in Nov 23 04:59:14 localhost nova_compute[281952]: 2025-11-23 09:59:14.723 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:15 localhost dnsmasq[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/addn_hosts - 0 addresses Nov 23 04:59:15 localhost dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/host Nov 23 04:59:15 localhost podman[313689]: 2025-11-23 09:59:15.007323041 +0000 UTC m=+0.065866810 container kill 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:59:15 localhost dnsmasq-dhcp[310864]: read /var/lib/neutron/dhcp/c5d88dfa-0db8-489e-a45a-e843e31a3b26/opts Nov 23 04:59:15 localhost nova_compute[281952]: 2025-11-23 09:59:15.111 281956 DEBUG nova.compute.manager [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:59:15 localhost kernel: device tapd6f3b7ff-1b left promiscuous mode Nov 23 04:59:15 localhost ovn_controller[154788]: 2025-11-23T09:59:15Z|00144|binding|INFO|Releasing lport d6f3b7ff-1bfe-4568-bcbd-2732186dba70 from this chassis 
(sb_readonly=0) Nov 23 04:59:15 localhost ovn_controller[154788]: 2025-11-23T09:59:15Z|00145|binding|INFO|Setting lport d6f3b7ff-1bfe-4568-bcbd-2732186dba70 down in Southbound Nov 23 04:59:15 localhost nova_compute[281952]: 2025-11-23 09:59:15.269 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:15 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:15.285 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-c5d88dfa-0db8-489e-a45a-e843e31a3b26', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c5d88dfa-0db8-489e-a45a-e843e31a3b26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0497de4959b2494e8036eb39226430d6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54e00d1b-ba48-40e5-8228-7e38f918fa79, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d6f3b7ff-1bfe-4568-bcbd-2732186dba70) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:59:15 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:15.290 160439 INFO neutron.agent.ovn.metadata.agent 
[-] Port d6f3b7ff-1bfe-4568-bcbd-2732186dba70 in datapath c5d88dfa-0db8-489e-a45a-e843e31a3b26 unbound from our chassis#033[00m Nov 23 04:59:15 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:15.293 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c5d88dfa-0db8-489e-a45a-e843e31a3b26, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 04:59:15 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:15.295 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ea8ededb-7ca5-4552-8e45-818dad82f0b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:59:15 localhost nova_compute[281952]: 2025-11-23 09:59:15.301 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:15 localhost nova_compute[281952]: 2025-11-23 09:59:15.351 281956 DEBUG oslo_concurrency.lockutils [None req-3184eeeb-c0a8-4c59-81d2-862077fbfbaa 25162d9a482a4cd38c99a8e94bc63cc3 e5a4b2286a4a475887ed51bb4020d980 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 6.799s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:59:16 localhost ovn_controller[154788]: 2025-11-23T09:59:16Z|00146|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23
09:59:16.251 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.251 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.252 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.252 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.253 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.307 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.392 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.621 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.621 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.622 281956 INFO nova.compute.manager [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Shelving#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.649 281956 DEBUG nova.virt.libvirt.driver [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m Nov 23 04:59:16 localhost ceph-mon[300199]: mon.np0005532585@1(peon)
e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:59:16 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3593812547' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.815 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.562s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.899 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.899 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000009 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.913 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 04:59:16 localhost nova_compute[281952]: 2025-11-23 09:59:16.913 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 
04:59:17 localhost nova_compute[281952]: 2025-11-23 09:59:17.246 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 04:59:17 localhost nova_compute[281952]: 2025-11-23 09:59:17.248 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11169MB free_disk=41.7004280090332GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 04:59:17 localhost nova_compute[281952]: 2025-11-23 09:59:17.249 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:59:17 localhost nova_compute[281952]: 2025-11-23 09:59:17.249 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:59:17 localhost systemd[1]: tmp-crun.5DSdTd.mount: Deactivated successfully. Nov 23 04:59:17 localhost nova_compute[281952]: 2025-11-23 09:59:17.343 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:59:17 localhost nova_compute[281952]: 2025-11-23 09:59:17.343 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 04:59:17 localhost nova_compute[281952]: 2025-11-23 09:59:17.344 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 04:59:17 localhost nova_compute[281952]: 2025-11-23 09:59:17.344 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1152MB phys_disk=41GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 04:59:17 localhost dnsmasq[310864]: exiting on receipt of SIGTERM Nov 23 04:59:17 localhost podman[313751]: 2025-11-23 09:59:17.346419028 +0000 UTC m=+0.088068274 container kill 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:59:17 localhost systemd[1]: libpod-930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf.scope: Deactivated successfully. Nov 23 04:59:17 localhost nova_compute[281952]: 2025-11-23 09:59:17.363 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 23 04:59:17 localhost nova_compute[281952]: 2025-11-23 09:59:17.387 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 23 04:59:17 localhost nova_compute[281952]: 2025-11-23 09:59:17.388 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} 
update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 23 04:59:17 localhost nova_compute[281952]: 2025-11-23 09:59:17.414 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 23 04:59:17 localhost podman[313764]: 2025-11-23 09:59:17.425451873 +0000 UTC m=+0.064680584 container died 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 04:59:17 localhost nova_compute[281952]: 2025-11-23 09:59:17.439 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: 
COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_SSSE3,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 23 04:59:17 localhost podman[313764]: 2025-11-23 09:59:17.462831474 +0000 UTC m=+0.102060195 container cleanup 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 23 04:59:17 localhost systemd[1]: libpod-conmon-930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf.scope: Deactivated successfully. Nov 23 04:59:17 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e101 e101: 6 total, 6 up, 6 in Nov 23 04:59:17 localhost podman[313766]: 2025-11-23 09:59:17.51171123 +0000 UTC m=+0.141066497 container remove 930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5d88dfa-0db8-489e-a45a-e843e31a3b26, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 04:59:17 localhost nova_compute[281952]: 2025-11-23 09:59:17.580 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:59:17 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:17.656 263258 INFO neutron.agent.dhcp.agent [None req-fc0489cb-10d2-4fc6-807c-0937bd6825af - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:59:17 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:17.867 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:59:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 
handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:59:18 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2514870888' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:59:18 localhost nova_compute[281952]: 2025-11-23 09:59:18.030 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:59:18 localhost nova_compute[281952]: 2025-11-23 09:59:18.035 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:59:18 localhost nova_compute[281952]: 2025-11-23 09:59:18.053 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:59:18 localhost nova_compute[281952]: 2025-11-23 09:59:18.091 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 04:59:18 localhost nova_compute[281952]: 2025-11-23 09:59:18.092 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.842s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:59:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:59:18 localhost systemd[1]: var-lib-containers-storage-overlay-7e36441351384cf4ce607d42d9d2a693f2618154ca741337c33c1071de55a0ca-merged.mount: Deactivated successfully. Nov 23 04:59:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-930e90cb3243f1c0c20a334e3f6a463d6af7b88db9b664f94edd9f7d9ed544cf-userdata-shm.mount: Deactivated successfully. Nov 23 04:59:18 localhost systemd[1]: run-netns-qdhcp\x2dc5d88dfa\x2d0db8\x2d489e\x2da45a\x2de843e31a3b26.mount: Deactivated successfully. 
Nov 23 04:59:18 localhost nova_compute[281952]: 2025-11-23 09:59:18.807 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:19 localhost nova_compute[281952]: 2025-11-23 09:59:19.093 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:59:19 localhost nova_compute[281952]: 2025-11-23 09:59:19.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:59:20 localhost nova_compute[281952]: 2025-11-23 09:59:20.208 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 04:59:20 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:20.230 2 INFO neutron.agent.securitygroups_rpc [req-e10db1c6-11f3-4ff7-8a20-47058bab960f req-be246029-4620-443a-8a27-dc66d74bf8a5 924b04d509744aa59ab90825723761be c1b218a298814f81811d06e4ddeeca2f - - default default] Security group rule updated ['df6d8f7b-74cc-4864-a7e2-24c32662f7e1']#033[00m Nov 23 04:59:20 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:20.724 2 INFO neutron.agent.securitygroups_rpc [req-b63466c9-444c-4747-806e-6e70f6ca8dbf req-1b97f9eb-afe9-470d-a373-457d04103769 924b04d509744aa59ab90825723761be c1b218a298814f81811d06e4ddeeca2f - - default default] Security group rule updated ['486481c0-58d7-474c-ac28-9109e6d75e3e']#033[00m Nov 23 04:59:21 localhost neutron_dhcp_agent[263254]: 2025-11-23 
09:59:21.162 263258 INFO neutron.agent.linux.ip_lib [None req-4e36a782-7349-49d3-83e9-cb06f4ff3168 - - - - - -] Device tapbaef0101-f3 cannot be used as it has no MAC address#033[00m Nov 23 04:59:21 localhost nova_compute[281952]: 2025-11-23 09:59:21.217 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:21 localhost kernel: device tapbaef0101-f3 entered promiscuous mode Nov 23 04:59:21 localhost nova_compute[281952]: 2025-11-23 09:59:21.225 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:21 localhost NetworkManager[5975]: [1763891961.2261] manager: (tapbaef0101-f3): new Generic device (/org/freedesktop/NetworkManager/Devices/28) Nov 23 04:59:21 localhost systemd-udevd[313826]: Network interface NamePolicy= disabled on kernel command line. Nov 23 04:59:21 localhost ovn_controller[154788]: 2025-11-23T09:59:21Z|00147|binding|INFO|Claiming lport baef0101-f381-4af9-b095-1f116c8d43cf for this chassis. 
Nov 23 04:59:21 localhost ovn_controller[154788]: 2025-11-23T09:59:21Z|00148|binding|INFO|baef0101-f381-4af9-b095-1f116c8d43cf: Claiming unknown Nov 23 04:59:21 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:21.238 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68f3d06d-7ad2-4e09-a769-ead39666c244, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=baef0101-f381-4af9-b095-1f116c8d43cf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:59:21 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:21.240 160439 INFO neutron.agent.ovn.metadata.agent [-] Port baef0101-f381-4af9-b095-1f116c8d43cf in datapath 1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7 bound to our chassis#033[00m Nov 23 04:59:21 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:21.242 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 69bb29d4-192f-481a-aefe-e710b536360e IP 
addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 04:59:21 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:21.243 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 04:59:21 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:21.244 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[577b8ee4-4aa1-482d-bd6f-393ea04d349c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:59:21 localhost journal[230249]: ethtool ioctl error on tapbaef0101-f3: No such device Nov 23 04:59:21 localhost ovn_controller[154788]: 2025-11-23T09:59:21Z|00149|binding|INFO|Setting lport baef0101-f381-4af9-b095-1f116c8d43cf ovn-installed in OVS Nov 23 04:59:21 localhost ovn_controller[154788]: 2025-11-23T09:59:21Z|00150|binding|INFO|Setting lport baef0101-f381-4af9-b095-1f116c8d43cf up in Southbound Nov 23 04:59:21 localhost journal[230249]: ethtool ioctl error on tapbaef0101-f3: No such device Nov 23 04:59:21 localhost nova_compute[281952]: 2025-11-23 09:59:21.274 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:21 localhost journal[230249]: ethtool ioctl error on tapbaef0101-f3: No such device Nov 23 04:59:21 localhost journal[230249]: ethtool ioctl error on tapbaef0101-f3: No such device Nov 23 04:59:21 localhost journal[230249]: ethtool ioctl error on tapbaef0101-f3: No such device Nov 23 04:59:21 localhost journal[230249]: ethtool ioctl error on tapbaef0101-f3: No such device Nov 23 04:59:21 localhost journal[230249]: ethtool ioctl error on tapbaef0101-f3: No such device Nov 23 
04:59:21 localhost nova_compute[281952]: 2025-11-23 09:59:21.293 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:21 localhost journal[230249]: ethtool ioctl error on tapbaef0101-f3: No such device Nov 23 04:59:21 localhost nova_compute[281952]: 2025-11-23 09:59:21.323 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:21 localhost nova_compute[281952]: 2025-11-23 09:59:21.358 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:21 localhost nova_compute[281952]: 2025-11-23 09:59:21.394 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:21 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:21.669 2 INFO neutron.agent.securitygroups_rpc [req-5b5e1aae-0130-44ef-b3c8-2b4a33b1f155 req-73054398-e6b7-4548-a319-a659c6c54985 924b04d509744aa59ab90825723761be c1b218a298814f81811d06e4ddeeca2f - - default default] Security group rule updated ['9f0e447c-560b-475e-bb8e-29f8dd459211']#033[00m Nov 23 04:59:22 localhost podman[313897]: Nov 23 04:59:22 localhost podman[313897]: 2025-11-23 09:59:22.276130991 +0000 UTC m=+0.102893840 container create 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS) Nov 23 04:59:22 localhost systemd[1]: Started libpod-conmon-79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96.scope. Nov 23 04:59:22 localhost podman[313897]: 2025-11-23 09:59:22.22873806 +0000 UTC m=+0.055500900 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 04:59:22 localhost systemd[1]: Started libcrun container. Nov 23 04:59:22 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d1bd68d3770234ddf8ac90b3c6d8de140a8a1263855d13554012024e6eb7216/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:59:22 localhost podman[313897]: 2025-11-23 09:59:22.356349502 +0000 UTC m=+0.183112361 container init 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:59:22 localhost podman[313897]: 2025-11-23 09:59:22.365002018 +0000 UTC m=+0.191764867 container start 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118) Nov 23 04:59:22 
localhost dnsmasq[313916]: started, version 2.85 cachesize 150 Nov 23 04:59:22 localhost dnsmasq[313916]: DNS service limited to local subnets Nov 23 04:59:22 localhost dnsmasq[313916]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 04:59:22 localhost dnsmasq[313916]: warning: no upstream servers configured Nov 23 04:59:22 localhost dnsmasq-dhcp[313916]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 04:59:22 localhost dnsmasq[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/addn_hosts - 0 addresses Nov 23 04:59:22 localhost dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/host Nov 23 04:59:22 localhost dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/opts Nov 23 04:59:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:22.428 263258 INFO neutron.agent.dhcp.agent [None req-6c086316-9d8e-4945-aca8-3830dd2bc335 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:21Z, description=, device_id=89783321-05cb-41ec-bfca-a08a32ddb0e0, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=31fc4967-c13f-4c2d-85f7-b02043b66946, ip_allocation=immediate, mac_address=fa:16:3e:e2:c2:8b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:17Z, description=, dns_domain=, id=1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-2006905300, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, 
provider:physical_network=None, provider:segmentation_id=19922, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=889, status=ACTIVE, subnets=['c95d6240-1bf0-485d-b67f-51fb71cd6afd'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T09:59:19Z, vlan_transparent=None, network_id=1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=957, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T09:59:21Z on network 1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7#033[00m Nov 23 04:59:22 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e102 e102: 6 total, 6 up, 6 in Nov 23 04:59:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:22.509 263258 INFO neutron.agent.dhcp.agent [None req-b83a353c-8acf-4179-8fe0-c32627ec534e - - - - - -] DHCP configuration for ports {'f40396a1-ffcc-4210-912d-fa3a2a29cc54'} is completed#033[00m Nov 23 04:59:22 localhost dnsmasq[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/addn_hosts - 1 addresses Nov 23 04:59:22 localhost dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/host Nov 23 04:59:22 localhost podman[313932]: 2025-11-23 09:59:22.688426922 +0000 UTC m=+0.063403904 container kill 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, 
org.label-schema.vendor=CentOS) Nov 23 04:59:22 localhost dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/opts Nov 23 04:59:22 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:22.697 2 INFO neutron.agent.securitygroups_rpc [req-2f39ed52-ad77-4aa6-9471-651d11ecbf13 req-f42d6273-96eb-4a84-b6a8-20685191fd4a 924b04d509744aa59ab90825723761be c1b218a298814f81811d06e4ddeeca2f - - default default] Security group rule updated ['9d3d4eb8-5be7-4867-b930-e62b16d22d58']#033[00m Nov 23 04:59:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:59:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:59:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:22.966 263258 INFO neutron.agent.dhcp.agent [None req-28ad30cc-102c-46f6-9e38-400cc098b6ec - - - - - -] DHCP configuration for ports {'31fc4967-c13f-4c2d-85f7-b02043b66946'} is completed#033[00m Nov 23 04:59:23 localhost podman[313953]: 2025-11-23 09:59:23.022233135 +0000 UTC m=+0.080360156 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', 
'--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 04:59:23 localhost podman[313953]: 2025-11-23 09:59:23.034155832 +0000 UTC m=+0.092282913 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, 
config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:59:23 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 04:59:23 localhost podman[313952]: 2025-11-23 09:59:23.100387983 +0000 UTC m=+0.159813814 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.build-date=20251118, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 04:59:23 localhost podman[313952]: 2025-11-23 09:59:23.112847726 +0000 UTC m=+0.172273537 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 04:59:23 localhost systemd[1]: 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:59:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:59:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:23.261 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:21Z, description=, device_id=89783321-05cb-41ec-bfca-a08a32ddb0e0, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=31fc4967-c13f-4c2d-85f7-b02043b66946, ip_allocation=immediate, mac_address=fa:16:3e:e2:c2:8b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:17Z, description=, dns_domain=, id=1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-2006905300, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=19922, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=889, status=ACTIVE, subnets=['c95d6240-1bf0-485d-b67f-51fb71cd6afd'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T09:59:19Z, vlan_transparent=None, network_id=1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=957, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T09:59:21Z 
on network 1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7#033[00m Nov 23 04:59:23 localhost systemd[1]: tmp-crun.K0MHoS.mount: Deactivated successfully. Nov 23 04:59:23 localhost dnsmasq[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/addn_hosts - 1 addresses Nov 23 04:59:23 localhost dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/host Nov 23 04:59:23 localhost dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/opts Nov 23 04:59:23 localhost podman[314011]: 2025-11-23 09:59:23.52132715 +0000 UTC m=+0.075976541 container kill 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 23 04:59:23 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:23.530 2 INFO neutron.agent.securitygroups_rpc [req-9fd26688-f794-4469-9fd8-a5b40d60592d req-5ebc8662-650f-469d-8c45-5ce5c30495b8 924b04d509744aa59ab90825723761be c1b218a298814f81811d06e4ddeeca2f - - default default] Security group rule updated ['b40df903-b9f3-4a1c-8419-71383dae71f9']#033[00m Nov 23 04:59:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:23.722 263258 INFO neutron.agent.dhcp.agent [None req-4551f10f-a18f-43fa-82a2-f6c777bb83cd - - - - - -] DHCP configuration for ports {'31fc4967-c13f-4c2d-85f7-b02043b66946'} is completed#033[00m Nov 23 04:59:23 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:23.997 2 INFO neutron.agent.securitygroups_rpc [req-e3ac8e09-5876-4e41-80c7-46043b4c6329 
req-ca05e120-41d0-4e85-be51-0d5858a51936 924b04d509744aa59ab90825723761be c1b218a298814f81811d06e4ddeeca2f - - default default] Security group rule updated ['b40df903-b9f3-4a1c-8419-71383dae71f9']#033[00m Nov 23 04:59:24 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:24.404 2 INFO neutron.agent.securitygroups_rpc [None req-55c9cfb2-f59a-42d6-ace6-61788e22f102 f30cb7ce3bac485ca16e284ef2514162 493833d8fb394637b29c3fb2052aca9c - - default default] Security group member updated ['6a5ca8fc-febe-492b-8ed6-1c2faceb11b7']#033[00m Nov 23 04:59:24 localhost sshd[314033]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:59:24 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:24.911 2 INFO neutron.agent.securitygroups_rpc [None req-dfbbbdd6-9764-4926-ad6f-603dbba55323 f30cb7ce3bac485ca16e284ef2514162 493833d8fb394637b29c3fb2052aca9c - - default default] Security group member updated ['6a5ca8fc-febe-492b-8ed6-1c2faceb11b7']#033[00m Nov 23 04:59:24 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:24.922 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:59:24 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:24.929 2 INFO neutron.agent.securitygroups_rpc [req-3634fbe3-812b-4789-a7d3-12a0d9366017 req-05979dae-61e1-4fc8-b138-901b668995d3 924b04d509744aa59ab90825723761be c1b218a298814f81811d06e4ddeeca2f - - default default] Security group rule updated ['b40df903-b9f3-4a1c-8419-71383dae71f9']#033[00m Nov 23 04:59:26 localhost nova_compute[281952]: 2025-11-23 09:59:26.296 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:26 localhost nova_compute[281952]: 2025-11-23 09:59:26.395 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:26 localhost nova_compute[281952]: 2025-11-23 09:59:26.399 
281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:26 localhost nova_compute[281952]: 2025-11-23 09:59:26.699 281956 DEBUG nova.virt.libvirt.driver [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m Nov 23 04:59:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:59:29 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:29.494 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:59:29 localhost dnsmasq[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/addn_hosts - 0 addresses Nov 23 04:59:29 localhost dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/host Nov 23 04:59:29 localhost podman[314051]: 2025-11-23 09:59:29.550973295 +0000 UTC m=+0.065996154 container kill 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 04:59:29 localhost dnsmasq-dhcp[313916]: read /var/lib/neutron/dhcp/1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7/opts Nov 23 04:59:29 
localhost systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Deactivated successfully. Nov 23 04:59:29 localhost systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Consumed 13.612s CPU time. Nov 23 04:59:29 localhost systemd-machined[84275]: Machine qemu-5-instance-00000009 terminated. Nov 23 04:59:29 localhost nova_compute[281952]: 2025-11-23 09:59:29.889 281956 INFO nova.virt.libvirt.driver [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance shutdown successfully after 13 seconds.#033[00m Nov 23 04:59:29 localhost nova_compute[281952]: 2025-11-23 09:59:29.900 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance destroyed successfully.#033[00m Nov 23 04:59:29 localhost nova_compute[281952]: 2025-11-23 09:59:29.901 281956 DEBUG nova.objects.instance [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:59:29 localhost ovn_controller[154788]: 2025-11-23T09:59:29Z|00151|binding|INFO|Releasing lport baef0101-f381-4af9-b095-1f116c8d43cf from this chassis (sb_readonly=0) Nov 23 04:59:29 localhost nova_compute[281952]: 2025-11-23 09:59:29.917 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:29 localhost ovn_controller[154788]: 2025-11-23T09:59:29Z|00152|binding|INFO|Setting lport baef0101-f381-4af9-b095-1f116c8d43cf down in Southbound Nov 23 04:59:29 localhost kernel: device tapbaef0101-f3 left promiscuous mode Nov 23 04:59:29 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:29.931 160439 DEBUG 
ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=68f3d06d-7ad2-4e09-a769-ead39666c244, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=baef0101-f381-4af9-b095-1f116c8d43cf) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:59:29 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:29.933 160439 INFO neutron.agent.ovn.metadata.agent [-] Port baef0101-f381-4af9-b095-1f116c8d43cf in datapath 1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7 unbound from our chassis#033[00m Nov 23 04:59:29 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:29.935 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m 
Nov 23 04:59:29 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:29.936 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[6ae44200-ad19-4b77-b8c3-465981eda5cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:59:29 localhost nova_compute[281952]: 2025-11-23 09:59:29.945 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:29 localhost openstack_network_exporter[242668]: ERROR 09:59:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:59:29 localhost openstack_network_exporter[242668]: ERROR 09:59:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:59:29 localhost openstack_network_exporter[242668]: Nov 23 04:59:29 localhost openstack_network_exporter[242668]: ERROR 09:59:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:59:29 localhost openstack_network_exporter[242668]: Nov 23 04:59:29 localhost nova_compute[281952]: 2025-11-23 09:59:29.991 281956 INFO nova.virt.libvirt.driver [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Beginning cold snapshot process#033[00m Nov 23 04:59:29 localhost openstack_network_exporter[242668]: ERROR 09:59:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:59:29 localhost openstack_network_exporter[242668]: ERROR 09:59:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:59:30 localhost nova_compute[281952]: 2025-11-23 09:59:30.170 281956 DEBUG nova.virt.libvirt.imagebackend [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 
37a58b702f564a81ab5a59cf4201b4f0 - - default default] No parent info for c5806483-57a8-4254-b41b-254b888c8606; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m Nov 23 04:59:30 localhost nova_compute[281952]: 2025-11-23 09:59:30.224 281956 DEBUG nova.storage.rbd_utils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] creating snapshot(d0131164664c4853ad3a327c704f4dc4) on rbd image(8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m Nov 23 04:59:30 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e103 e103: 6 total, 6 up, 6 in Nov 23 04:59:30 localhost nova_compute[281952]: 2025-11-23 09:59:30.586 281956 DEBUG nova.storage.rbd_utils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] cloning vms/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk@d0131164664c4853ad3a327c704f4dc4 to images/b6d724dc-26d8-4b53-bc02-990c8b280c9a clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m Nov 23 04:59:30 localhost nova_compute[281952]: 2025-11-23 09:59:30.764 281956 DEBUG nova.storage.rbd_utils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] flattening images/b6d724dc-26d8-4b53-bc02-990c8b280c9a flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m Nov 23 04:59:30 localhost nova_compute[281952]: 2025-11-23 09:59:30.841 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:31 localhost nova_compute[281952]: 2025-11-23 09:59:31.301 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:31 localhost nova_compute[281952]: 2025-11-23 09:59:31.397 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:31 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:31.640 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:59:31 localhost nova_compute[281952]: 2025-11-23 09:59:31.695 281956 DEBUG nova.storage.rbd_utils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] removing snapshot(d0131164664c4853ad3a327c704f4dc4) on rbd image(8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m Nov 23 04:59:31 localhost systemd[1]: tmp-crun.7CVRpJ.mount: Deactivated successfully. Nov 23 04:59:31 localhost dnsmasq[313916]: exiting on receipt of SIGTERM Nov 23 04:59:31 localhost systemd[1]: libpod-79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96.scope: Deactivated successfully. 
Nov 23 04:59:31 localhost podman[314218]: 2025-11-23 09:59:31.777031851 +0000 UTC m=+0.059892446 container kill 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:59:31 localhost podman[314230]: 2025-11-23 09:59:31.846670906 +0000 UTC m=+0.059127852 container died 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:59:31 localhost podman[314230]: 2025-11-23 09:59:31.886213574 +0000 UTC m=+0.098670490 container cleanup 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 
04:59:31 localhost systemd[1]: libpod-conmon-79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96.scope: Deactivated successfully. Nov 23 04:59:31 localhost podman[314234]: 2025-11-23 09:59:31.925988699 +0000 UTC m=+0.127336873 container remove 79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fc969d8-6d2f-49ab-83f5-e28a94f4b4e7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:59:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:32.155 263258 INFO neutron.agent.dhcp.agent [None req-253b9119-1abe-4cc0-951b-ba25bb348876 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:59:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 04:59:32 localhost podman[314260]: 2025-11-23 09:59:32.287061333 +0000 UTC m=+0.075626751 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 23 04:59:32 localhost podman[314260]: 2025-11-23 09:59:32.293858522 +0000 UTC m=+0.082423920 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:59:32 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 04:59:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:32.486 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:59:32 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e104 e104: 6 total, 6 up, 6 in Nov 23 04:59:32 localhost nova_compute[281952]: 2025-11-23 09:59:32.604 281956 DEBUG nova.storage.rbd_utils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] creating snapshot(snap) on rbd image(b6d724dc-26d8-4b53-bc02-990c8b280c9a) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m Nov 23 04:59:32 localhost ovn_controller[154788]: 2025-11-23T09:59:32Z|00153|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:59:32 localhost systemd[1]: tmp-crun.wySTIM.mount: Deactivated successfully. Nov 23 04:59:32 localhost systemd[1]: var-lib-containers-storage-overlay-4d1bd68d3770234ddf8ac90b3c6d8de140a8a1263855d13554012024e6eb7216-merged.mount: Deactivated successfully. Nov 23 04:59:32 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79fbf55ca01e917788d5a6c0e938394efa3e32cb317010eaf1f7b149ff3c4d96-userdata-shm.mount: Deactivated successfully. Nov 23 04:59:32 localhost systemd[1]: run-netns-qdhcp\x2d1fc969d8\x2d6d2f\x2d49ab\x2d83f5\x2de28a94f4b4e7.mount: Deactivated successfully. 
Nov 23 04:59:32 localhost nova_compute[281952]: 2025-11-23 09:59:32.856 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:59:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e105 e105: 6 total, 6 up, 6 in Nov 23 04:59:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 04:59:34 localhost systemd[1]: tmp-crun.PxFlut.mount: Deactivated successfully. Nov 23 04:59:34 localhost podman[314294]: 2025-11-23 09:59:34.041962763 +0000 UTC m=+0.097869986 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 04:59:34 localhost podman[314294]: 2025-11-23 09:59:34.054535011 +0000 UTC m=+0.110442214 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 04:59:34 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. 
Nov 23 04:59:34 localhost ovn_controller[154788]: 2025-11-23T09:59:34Z|00154|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:59:34 localhost nova_compute[281952]: 2025-11-23 09:59:34.245 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:34 localhost nova_compute[281952]: 2025-11-23 09:59:34.412 281956 INFO nova.virt.libvirt.driver [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Snapshot image upload complete#033[00m Nov 23 04:59:34 localhost nova_compute[281952]: 2025-11-23 09:59:34.412 281956 DEBUG nova.compute.manager [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:59:34 localhost nova_compute[281952]: 2025-11-23 09:59:34.483 281956 INFO nova.compute.manager [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Shelve offloading#033[00m Nov 23 04:59:34 localhost nova_compute[281952]: 2025-11-23 09:59:34.489 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance destroyed successfully.#033[00m Nov 23 04:59:34 localhost nova_compute[281952]: 2025-11-23 09:59:34.490 281956 DEBUG nova.compute.manager [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:59:34 localhost nova_compute[281952]: 2025-11-23 09:59:34.492 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 04:59:34 localhost nova_compute[281952]: 2025-11-23 09:59:34.492 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquired lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 04:59:34 localhost nova_compute[281952]: 2025-11-23 09:59:34.492 281956 DEBUG nova.network.neutron [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 23 04:59:34 localhost nova_compute[281952]: 2025-11-23 09:59:34.549 281956 DEBUG nova.network.neutron [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance cache missing network info. 
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.040 281956 DEBUG nova.network.neutron [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.059 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Releasing lock "refresh_cache-8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.068 281956 INFO nova.virt.libvirt.driver [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Instance destroyed successfully.#033[00m Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.069 281956 DEBUG nova.objects.instance [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lazy-loading 'resources' on Instance uuid 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 04:59:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:35.092 263258 INFO neutron.agent.linux.ip_lib [None req-6c47639e-df62-4c96-bb42-44d566d06402 - - - - - -] Device tap53af77be-60 cannot be used as it has no MAC address#033[00m Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.119 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Nov 23 04:59:35 localhost kernel: device tap53af77be-60 entered promiscuous mode Nov 23 04:59:35 localhost ovn_controller[154788]: 2025-11-23T09:59:35Z|00155|binding|INFO|Claiming lport 53af77be-60e9-4895-8dfb-1340d4308442 for this chassis. Nov 23 04:59:35 localhost ovn_controller[154788]: 2025-11-23T09:59:35Z|00156|binding|INFO|53af77be-60e9-4895-8dfb-1340d4308442: Claiming unknown Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.128 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:35 localhost NetworkManager[5975]: [1763891975.1339] manager: (tap53af77be-60): new Generic device (/org/freedesktop/NetworkManager/Devices/29) Nov 23 04:59:35 localhost systemd-udevd[314400]: Network interface NamePolicy= disabled on kernel command line. Nov 23 04:59:35 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:35.138 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-43d7b2ef-542d-499e-b6e4-ba4caed0547d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43d7b2ef-542d-499e-b6e4-ba4caed0547d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], 
mirror_rules=[], datapath=86837c26-d38c-4c74-814e-bcb8777abcca, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=53af77be-60e9-4895-8dfb-1340d4308442) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:59:35 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:35.139 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 53af77be-60e9-4895-8dfb-1340d4308442 in datapath 43d7b2ef-542d-499e-b6e4-ba4caed0547d bound to our chassis#033[00m Nov 23 04:59:35 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:35.140 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43d7b2ef-542d-499e-b6e4-ba4caed0547d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 04:59:35 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:35.142 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[230163d0-3ebd-41b3-8ee4-fb3af32ca5a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:59:35 localhost ovn_controller[154788]: 2025-11-23T09:59:35Z|00157|binding|INFO|Setting lport 53af77be-60e9-4895-8dfb-1340d4308442 ovn-installed in OVS Nov 23 04:59:35 localhost ovn_controller[154788]: 2025-11-23T09:59:35Z|00158|binding|INFO|Setting lport 53af77be-60e9-4895-8dfb-1340d4308442 up in Southbound Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.150 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.166 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 
09:59:35.199 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.218 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:35 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 04:59:35 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:59:35 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e106 e106: 6 total, 6 up, 6 in Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.788 281956 INFO nova.virt.libvirt.driver [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Deleting instance files /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_del#033[00m Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.789 281956 INFO nova.virt.libvirt.driver [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Deletion of /var/lib/nova/instances/8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7_del complete#033[00m Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.870 281956 INFO nova.scheduler.client.report [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Deleted allocations for instance 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7#033[00m Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.914 281956 DEBUG oslo_concurrency.lockutils [None 
req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.915 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 04:59:35 localhost nova_compute[281952]: 2025-11-23 09:59:35.967 281956 DEBUG oslo_concurrency.processutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 04:59:36 localhost podman[314488]: Nov 23 04:59:36 localhost ovn_controller[154788]: 2025-11-23T09:59:36Z|00159|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:59:36 localhost podman[314488]: 2025-11-23 09:59:36.125062054 +0000 UTC m=+0.122282238 container create 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 23 04:59:36 localhost nova_compute[281952]: 2025-11-23 09:59:36.148 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:36 localhost systemd[1]: Started libpod-conmon-9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6.scope. Nov 23 04:59:36 localhost systemd[1]: tmp-crun.t5m8YR.mount: Deactivated successfully. Nov 23 04:59:36 localhost systemd[1]: Started libcrun container. Nov 23 04:59:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd8876e7ddd7dafa11908fbfb871118f8df1d821738197ffd26d3d2606b14a04/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:59:36 localhost podman[314488]: 2025-11-23 09:59:36.095654598 +0000 UTC m=+0.092874822 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 04:59:36 localhost podman[314488]: 2025-11-23 09:59:36.202643374 +0000 UTC m=+0.199863598 container init 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 23 04:59:36 localhost podman[314488]: 2025-11-23 09:59:36.212127566 +0000 UTC m=+0.209347780 container start 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, maintainer=OpenStack 
Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 23 04:59:36 localhost dnsmasq[314524]: started, version 2.85 cachesize 150 Nov 23 04:59:36 localhost dnsmasq[314524]: DNS service limited to local subnets Nov 23 04:59:36 localhost dnsmasq[314524]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 04:59:36 localhost dnsmasq[314524]: warning: no upstream servers configured Nov 23 04:59:36 localhost dnsmasq-dhcp[314524]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 04:59:36 localhost dnsmasq[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/addn_hosts - 0 addresses Nov 23 04:59:36 localhost dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/host Nov 23 04:59:36 localhost dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/opts Nov 23 04:59:36 localhost nova_compute[281952]: 2025-11-23 09:59:36.304 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:36.350 263258 INFO neutron.agent.dhcp.agent [None req-5ff18459-bf3c-415c-a5fe-548681ab746d - - - - - -] DHCP configuration for ports {'79df25e1-9d21-4fd1-bbcc-556f66257d24'} is completed#033[00m Nov 23 04:59:36 localhost nova_compute[281952]: 2025-11-23 09:59:36.399 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:36 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command 
mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 04:59:36 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1805846685' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 04:59:36 localhost nova_compute[281952]: 2025-11-23 09:59:36.517 281956 DEBUG oslo_concurrency.processutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.550s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 04:59:36 localhost nova_compute[281952]: 2025-11-23 09:59:36.524 281956 DEBUG nova.compute.provider_tree [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 04:59:36 localhost nova_compute[281952]: 2025-11-23 09:59:36.540 281956 DEBUG nova.scheduler.client.report [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 04:59:36 localhost nova_compute[281952]: 2025-11-23 09:59:36.559 281956 DEBUG 
oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:59:36 localhost nova_compute[281952]: 2025-11-23 09:59:36.597 281956 DEBUG oslo_concurrency.lockutils [None req-7c99ddbc-c540-42c7-bb1d-cb4180e93b5d 55581f20ed8d4bd8a61a81c525ca8141 37a58b702f564a81ab5a59cf4201b4f0 - - default default] Lock "8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7" "released" by "nova.compute.manager.ComputeManager.shelve_instance..do_shelve_instance" :: held 19.975s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 04:59:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e107 e107: 6 total, 6 up, 6 in Nov 23 04:59:38 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:38.238 2 INFO neutron.agent.securitygroups_rpc [None req-0fc01b46-65ca-4975-99ff-e6e4d0974af8 32512604c08f4fa48e6e985a3f6cd6d1 79509bc833494f3598e01347dc55dea9 - - default default] Security group member updated ['cfab2162-6afe-48a0-9f05-cee7f160244c']#033[00m Nov 23 04:59:38 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:59:38 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:38.268 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:37Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=83c25346-d3af-4486-a725-9089387b3a54, ip_allocation=immediate, 
mac_address=fa:16:3e:f6:34:69, name=tempest-RoutersTest-1314517952, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:32Z, description=, dns_domain=, id=43d7b2ef-542d-499e-b6e4-ba4caed0547d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-484078650, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=19673, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1021, status=ACTIVE, subnets=['d042e155-fe30-481b-9077-8053dec275b7'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T09:59:34Z, vlan_transparent=None, network_id=43d7b2ef-542d-499e-b6e4-ba4caed0547d, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['cfab2162-6afe-48a0-9f05-cee7f160244c'], standard_attr_id=1045, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T09:59:37Z on network 43d7b2ef-542d-499e-b6e4-ba4caed0547d#033[00m Nov 23 04:59:38 localhost dnsmasq[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/addn_hosts - 1 addresses Nov 23 04:59:38 localhost dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/host Nov 23 04:59:38 localhost podman[314544]: 2025-11-23 09:59:38.54269937 +0000 UTC m=+0.058040259 container kill 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:59:38 localhost dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/opts Nov 23 04:59:38 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 04:59:39 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:39.110 263258 INFO neutron.agent.dhcp.agent [None req-6b9e32d1-9fec-4ff5-9e3b-4f5e27f68e4f - - - - - -] DHCP configuration for ports {'83c25346-d3af-4486-a725-9089387b3a54'} is completed#033[00m Nov 23 04:59:40 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e108 e108: 6 total, 6 up, 6 in Nov 23 04:59:40 localhost nova_compute[281952]: 2025-11-23 09:59:40.847 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:41 localhost nova_compute[281952]: 2025-11-23 09:59:41.307 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:41 localhost nova_compute[281952]: 2025-11-23 09:59:41.402 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:41 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:41.505 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:37Z, description=, device_id=ec85bb4c-cbc6-4764-81ae-e806625613bc, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=83c25346-d3af-4486-a725-9089387b3a54, 
ip_allocation=immediate, mac_address=fa:16:3e:f6:34:69, name=tempest-RoutersTest-1314517952, network_id=43d7b2ef-542d-499e-b6e4-ba4caed0547d, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['cfab2162-6afe-48a0-9f05-cee7f160244c'], standard_attr_id=1045, status=ACTIVE, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T09:59:40Z on network 43d7b2ef-542d-499e-b6e4-ba4caed0547d#033[00m Nov 23 04:59:41 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e109 e109: 6 total, 6 up, 6 in Nov 23 04:59:41 localhost dnsmasq[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/addn_hosts - 1 addresses Nov 23 04:59:41 localhost dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/host Nov 23 04:59:41 localhost podman[314579]: 2025-11-23 09:59:41.708628478 +0000 UTC m=+0.056909074 container kill 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 04:59:41 localhost dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/opts Nov 23 04:59:41 localhost podman[240668]: time="2025-11-23T09:59:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 04:59:41 localhost podman[240668]: @ - - [23/Nov/2025:09:59:41 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1" Nov 23 04:59:41 localhost podman[240668]: @ - - [23/Nov/2025:09:59:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19244 "" "Go-http-client/1.1" Nov 23 04:59:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:42.044 263258 INFO neutron.agent.dhcp.agent [None req-40f0cdb7-bf0d-4e20-9098-f3703e5f37e8 - - - - - -] DHCP configuration for ports {'83c25346-d3af-4486-a725-9089387b3a54'} is completed#033[00m Nov 23 04:59:42 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e110 e110: 6 total, 6 up, 6 in Nov 23 04:59:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 04:59:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 04:59:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 04:59:43 localhost systemd[1]: tmp-crun.HyIUQy.mount: Deactivated successfully. 
Nov 23 04:59:43 localhost podman[314603]: 2025-11-23 09:59:43.034032397 +0000 UTC m=+0.080020495 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251118) Nov 23 04:59:43 localhost podman[314602]: 2025-11-23 09:59:43.132927414 +0000 UTC m=+0.183069431 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, version=9.6, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm) Nov 23 04:59:43 localhost podman[314603]: 2025-11-23 09:59:43.153465577 +0000 UTC m=+0.199453625 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible) Nov 23 04:59:43 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated 
successfully. Nov 23 04:59:43 localhost podman[314602]: 2025-11-23 09:59:43.168260152 +0000 UTC m=+0.218402079 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 23 04:59:43 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 04:59:43 localhost podman[314601]: 2025-11-23 09:59:43.240174468 +0000 UTC m=+0.292341947 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, io.buildah.version=1.41.3) Nov 23 04:59:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e110 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:59:43 localhost podman[314601]: 2025-11-23 09:59:43.276424035 +0000 UTC m=+0.328591554 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 
23 04:59:43 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 04:59:43 localhost nova_compute[281952]: 2025-11-23 09:59:43.315 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:43 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:43.338 2 INFO neutron.agent.securitygroups_rpc [None req-6713e12c-736e-4c48-95b8-a64782f68ffc 32512604c08f4fa48e6e985a3f6cd6d1 79509bc833494f3598e01347dc55dea9 - - default default] Security group member updated ['cfab2162-6afe-48a0-9f05-cee7f160244c']#033[00m Nov 23 04:59:43 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:43.518 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:59:43 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:43.519 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 04:59:43 localhost nova_compute[281952]: 2025-11-23 09:59:43.519 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:43 localhost dnsmasq[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/addn_hosts - 0 addresses Nov 23 04:59:43 
localhost dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/host Nov 23 04:59:43 localhost dnsmasq-dhcp[314524]: read /var/lib/neutron/dhcp/43d7b2ef-542d-499e-b6e4-ba4caed0547d/opts Nov 23 04:59:43 localhost podman[314679]: 2025-11-23 09:59:43.635862007 +0000 UTC m=+0.050560318 container kill 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 04:59:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e111 e111: 6 total, 6 up, 6 in Nov 23 04:59:43 localhost kernel: device tap53af77be-60 left promiscuous mode Nov 23 04:59:43 localhost nova_compute[281952]: 2025-11-23 09:59:43.814 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:43 localhost ovn_controller[154788]: 2025-11-23T09:59:43Z|00160|binding|INFO|Releasing lport 53af77be-60e9-4895-8dfb-1340d4308442 from this chassis (sb_readonly=0) Nov 23 04:59:43 localhost ovn_controller[154788]: 2025-11-23T09:59:43Z|00161|binding|INFO|Setting lport 53af77be-60e9-4895-8dfb-1340d4308442 down in Southbound Nov 23 04:59:43 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:43.826 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], 
options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-43d7b2ef-542d-499e-b6e4-ba4caed0547d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43d7b2ef-542d-499e-b6e4-ba4caed0547d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86837c26-d38c-4c74-814e-bcb8777abcca, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=53af77be-60e9-4895-8dfb-1340d4308442) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:59:43 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:43.828 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 53af77be-60e9-4895-8dfb-1340d4308442 in datapath 43d7b2ef-542d-499e-b6e4-ba4caed0547d unbound from our chassis#033[00m Nov 23 04:59:43 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:43.830 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43d7b2ef-542d-499e-b6e4-ba4caed0547d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 04:59:43 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:43.832 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[90b19db1-4349-4782-b683-3241ab06f033]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:59:43 
localhost nova_compute[281952]: 2025-11-23 09:59:43.836 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:43 localhost nova_compute[281952]: 2025-11-23 09:59:43.838 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:44 localhost ovn_controller[154788]: 2025-11-23T09:59:44Z|00162|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:59:44 localhost nova_compute[281952]: 2025-11-23 09:59:44.429 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:44 localhost nova_compute[281952]: 2025-11-23 09:59:44.889 281956 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 23 04:59:44 localhost nova_compute[281952]: 2025-11-23 09:59:44.890 281956 INFO nova.compute.manager [-] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] VM Stopped (Lifecycle Event)#033[00m Nov 23 04:59:44 localhost nova_compute[281952]: 2025-11-23 09:59:44.907 281956 DEBUG nova.compute.manager [None req-afe46cb7-374a-4fa0-a5a9-fefa134b5545 - - - - - -] [instance: 8a8ddd35-7dc7-40a2-9524-7a0b8fec8ef7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 23 04:59:45 localhost ovn_controller[154788]: 2025-11-23T09:59:45Z|00163|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:59:45 localhost nova_compute[281952]: 2025-11-23 09:59:45.564 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:45 localhost dnsmasq[314524]: exiting on receipt of SIGTERM Nov 23 04:59:45 localhost 
podman[314717]: 2025-11-23 09:59:45.650533459 +0000 UTC m=+0.067222811 container kill 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:59:45 localhost systemd[1]: libpod-9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6.scope: Deactivated successfully. Nov 23 04:59:45 localhost podman[314731]: 2025-11-23 09:59:45.730956037 +0000 UTC m=+0.062653260 container died 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 04:59:45 localhost podman[314731]: 2025-11-23 09:59:45.768859974 +0000 UTC m=+0.100557127 container cleanup 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, 
org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 23 04:59:45 localhost systemd[1]: libpod-conmon-9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6.scope: Deactivated successfully. Nov 23 04:59:45 localhost podman[314732]: 2025-11-23 09:59:45.807687481 +0000 UTC m=+0.131324817 container remove 9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43d7b2ef-542d-499e-b6e4-ba4caed0547d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 23 04:59:46 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:46.038 263258 INFO neutron.agent.dhcp.agent [None req-b49c767d-f68a-4ced-bc43-09b6b17f7157 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:59:46 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:46.040 263258 INFO neutron.agent.dhcp.agent [None req-b49c767d-f68a-4ced-bc43-09b6b17f7157 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:59:46 localhost nova_compute[281952]: 2025-11-23 09:59:46.310 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:46 localhost nova_compute[281952]: 2025-11-23 09:59:46.404 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:46 localhost nova_compute[281952]: 2025-11-23 09:59:46.521 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:46 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:46.633 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:59:46 localhost systemd[1]: var-lib-containers-storage-overlay-fd8876e7ddd7dafa11908fbfb871118f8df1d821738197ffd26d3d2606b14a04-merged.mount: Deactivated successfully. Nov 23 04:59:46 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9be81573652af5630ccb68d9b7d3444c9b99241a0b2029d036d6e524730beac6-userdata-shm.mount: Deactivated successfully. Nov 23 04:59:46 localhost systemd[1]: run-netns-qdhcp\x2d43d7b2ef\x2d542d\x2d499e\x2db6e4\x2dba4caed0547d.mount: Deactivated successfully. Nov 23 04:59:47 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 e112: 6 total, 6 up, 6 in Nov 23 04:59:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:59:48 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:48.868 263258 INFO neutron.agent.linux.ip_lib [None req-088f6986-4544-4c9b-a0ae-3cd17c55f952 - - - - - -] Device tap2f7157df-cb cannot be used as it has no MAC address#033[00m Nov 23 04:59:48 localhost nova_compute[281952]: 2025-11-23 09:59:48.889 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:48 localhost kernel: device tap2f7157df-cb entered promiscuous mode Nov 23 04:59:48 localhost ovn_controller[154788]: 2025-11-23T09:59:48Z|00164|binding|INFO|Claiming lport 2f7157df-cb90-4428-a3df-6f057985d5d6 for this chassis. 
Nov 23 04:59:48 localhost NetworkManager[5975]: [1763891988.8975] manager: (tap2f7157df-cb): new Generic device (/org/freedesktop/NetworkManager/Devices/30) Nov 23 04:59:48 localhost ovn_controller[154788]: 2025-11-23T09:59:48Z|00165|binding|INFO|2f7157df-cb90-4428-a3df-6f057985d5d6: Claiming unknown Nov 23 04:59:48 localhost nova_compute[281952]: 2025-11-23 09:59:48.897 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:48 localhost systemd-udevd[314770]: Network interface NamePolicy= disabled on kernel command line. Nov 23 04:59:48 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:48.911 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2f7157df-cb90-4428-a3df-6f057985d5d6) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:59:48 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:48.913 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 2f7157df-cb90-4428-a3df-6f057985d5d6 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 04:59:48 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:48.916 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 04:59:48 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:48.917 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2662e08b-6888-4279-bf50-157caa3294a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:59:48 localhost journal[230249]: ethtool ioctl error on tap2f7157df-cb: No such device Nov 23 04:59:48 localhost ovn_controller[154788]: 2025-11-23T09:59:48Z|00166|binding|INFO|Setting lport 2f7157df-cb90-4428-a3df-6f057985d5d6 ovn-installed in OVS Nov 23 04:59:48 localhost ovn_controller[154788]: 2025-11-23T09:59:48Z|00167|binding|INFO|Setting lport 2f7157df-cb90-4428-a3df-6f057985d5d6 up in Southbound Nov 23 04:59:48 localhost nova_compute[281952]: 2025-11-23 09:59:48.936 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:48 localhost journal[230249]: ethtool ioctl error on tap2f7157df-cb: No such device Nov 23 04:59:48 localhost journal[230249]: ethtool ioctl error on tap2f7157df-cb: No such device Nov 23 04:59:48 localhost journal[230249]: ethtool ioctl error on tap2f7157df-cb: No such device Nov 23 04:59:48 localhost journal[230249]: ethtool ioctl error on tap2f7157df-cb: No such device Nov 23 
04:59:48 localhost journal[230249]: ethtool ioctl error on tap2f7157df-cb: No such device Nov 23 04:59:48 localhost journal[230249]: ethtool ioctl error on tap2f7157df-cb: No such device Nov 23 04:59:48 localhost journal[230249]: ethtool ioctl error on tap2f7157df-cb: No such device Nov 23 04:59:48 localhost nova_compute[281952]: 2025-11-23 09:59:48.976 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:49 localhost nova_compute[281952]: 2025-11-23 09:59:49.003 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:49 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:49.578 2 INFO neutron.agent.securitygroups_rpc [None req-061cdcce-87b3-4fab-8b64-8613c3b5bd77 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 04:59:49 localhost podman[314841]: Nov 23 04:59:49 localhost podman[314841]: 2025-11-23 09:59:49.818170896 +0000 UTC m=+0.094208734 container create 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 04:59:49 localhost systemd[1]: Started libpod-conmon-868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1.scope. 
Nov 23 04:59:49 localhost podman[314841]: 2025-11-23 09:59:49.77252743 +0000 UTC m=+0.048565288 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 04:59:49 localhost systemd[1]: tmp-crun.dThqOZ.mount: Deactivated successfully. Nov 23 04:59:49 localhost systemd[1]: Started libcrun container. Nov 23 04:59:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef3addd800f914b793d519bed727ded0e4b934dd69c77876f8acbef5e0fac26a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:59:49 localhost podman[314841]: 2025-11-23 09:59:49.89362004 +0000 UTC m=+0.169657878 container init 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:59:49 localhost podman[314841]: 2025-11-23 09:59:49.903790803 +0000 UTC m=+0.179828641 container start 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 23 04:59:49 localhost dnsmasq[314859]: started, version 2.85 cachesize 150 Nov 23 04:59:49 
localhost dnsmasq[314859]: DNS service limited to local subnets Nov 23 04:59:49 localhost dnsmasq[314859]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 04:59:49 localhost dnsmasq[314859]: warning: no upstream servers configured Nov 23 04:59:49 localhost dnsmasq-dhcp[314859]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 04:59:49 localhost dnsmasq[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 04:59:49 localhost dnsmasq-dhcp[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 04:59:49 localhost dnsmasq-dhcp[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 04:59:49 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:49.958 263258 INFO neutron.agent.dhcp.agent [None req-088f6986-4544-4c9b-a0ae-3cd17c55f952 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:48Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4172c62e-88e1-41e0-99ba-63eaec4dfdc4, ip_allocation=immediate, mac_address=fa:16:3e:be:35:69, name=tempest-NetworksTestDHCPv6-1192500850, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, 
revision_number=2, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['bac168bc-6669-4cb8-b775-3a9746bd36ef'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T09:59:47Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1114, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T09:59:49Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 04:59:50 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:50.058 263258 INFO neutron.agent.dhcp.agent [None req-ec6d5fd7-efb9-4eac-8775-4c7e8371b66c - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 04:59:50 localhost dnsmasq[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 04:59:50 localhost dnsmasq-dhcp[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 04:59:50 localhost podman[314878]: 2025-11-23 09:59:50.156251021 +0000 UTC m=+0.055097799 container kill 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 04:59:50 localhost dnsmasq-dhcp[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 
04:59:50 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:50.411 263258 INFO neutron.agent.dhcp.agent [None req-ebc4ae50-499d-47a7-b809-f0574156c4cb - - - - - -] DHCP configuration for ports {'4172c62e-88e1-41e0-99ba-63eaec4dfdc4'} is completed#033[00m Nov 23 04:59:50 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:50.520 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 04:59:50 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:50.742 2 INFO neutron.agent.securitygroups_rpc [None req-5782673d-3ad2-4525-b0b7-33b67eb33956 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 04:59:51 localhost dnsmasq[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 04:59:51 localhost dnsmasq-dhcp[314859]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 04:59:51 localhost podman[314914]: 2025-11-23 09:59:51.306916956 +0000 UTC m=+0.059124212 container kill 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 04:59:51 localhost dnsmasq-dhcp[314859]: read 
/var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 04:59:51 localhost nova_compute[281952]: 2025-11-23 09:59:51.313 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:51 localhost nova_compute[281952]: 2025-11-23 09:59:51.406 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:52 localhost dnsmasq[314859]: exiting on receipt of SIGTERM Nov 23 04:59:52 localhost systemd[1]: libpod-868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1.scope: Deactivated successfully. Nov 23 04:59:52 localhost podman[314953]: 2025-11-23 09:59:52.138396109 +0000 UTC m=+0.058989258 container kill 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 23 04:59:52 localhost podman[314967]: 2025-11-23 09:59:52.208999764 +0000 UTC m=+0.055205201 container died 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team) Nov 23 04:59:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1-userdata-shm.mount: Deactivated successfully. Nov 23 04:59:52 localhost podman[314967]: 2025-11-23 09:59:52.243716704 +0000 UTC m=+0.089922091 container cleanup 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:59:52 localhost systemd[1]: libpod-conmon-868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1.scope: Deactivated successfully. 
Nov 23 04:59:52 localhost podman[314969]: 2025-11-23 09:59:52.294025384 +0000 UTC m=+0.133762522 container remove 868b5681bebf40ebe6254e9eade3b5237f733764f146d0578998b06d656afdc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team) Nov 23 04:59:52 localhost ovn_controller[154788]: 2025-11-23T09:59:52Z|00168|binding|INFO|Releasing lport 2f7157df-cb90-4428-a3df-6f057985d5d6 from this chassis (sb_readonly=0) Nov 23 04:59:52 localhost nova_compute[281952]: 2025-11-23 09:59:52.348 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:52 localhost ovn_controller[154788]: 2025-11-23T09:59:52Z|00169|binding|INFO|Setting lport 2f7157df-cb90-4428-a3df-6f057985d5d6 down in Southbound Nov 23 04:59:52 localhost kernel: device tap2f7157df-cb left promiscuous mode Nov 23 04:59:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:52.357 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2f7157df-cb90-4428-a3df-6f057985d5d6) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:59:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:52.359 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 2f7157df-cb90-4428-a3df-6f057985d5d6 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 04:59:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:52.360 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 04:59:52 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:52.361 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[643148b5-c0dc-4802-a22e-9a9f1fb61ea2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:59:52 localhost nova_compute[281952]: 2025-11-23 09:59:52.372 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:52 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:52.641 2 INFO 
neutron.agent.securitygroups_rpc [None req-633bd2af-73f5-42be-a8e1-16475aa1b324 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 04:59:53 localhost sshd[314999]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:59:53 localhost systemd[1]: var-lib-containers-storage-overlay-ef3addd800f914b793d519bed727ded0e4b934dd69c77876f8acbef5e0fac26a-merged.mount: Deactivated successfully. Nov 23 04:59:53 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. Nov 23 04:59:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 04:59:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 04:59:53 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:53.178 2 INFO neutron.agent.securitygroups_rpc [None req-33fb7598-4629-4242-b064-9d05bdc1e723 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 04:59:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:59:53 localhost podman[315002]: 2025-11-23 09:59:53.256694688 +0000 UTC m=+0.087652571 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 
'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 04:59:53 localhost podman[315002]: 2025-11-23 09:59:53.275235669 +0000 UTC m=+0.106193552 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', 
'--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 04:59:53 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 04:59:53 localhost podman[315001]: 2025-11-23 09:59:53.373456555 +0000 UTC m=+0.206956966 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible) Nov 23 04:59:53 localhost podman[315001]: 2025-11-23 09:59:53.417363508 +0000 UTC m=+0.250863939 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd) Nov 23 04:59:53 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:53.417 263258 INFO neutron.agent.linux.ip_lib [None req-cabf86b4-993a-41b4-af7e-6ccd00ca4a4e - - - - - -] Device tapddb7852b-7c cannot be used as it has no MAC address#033[00m Nov 23 04:59:53 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 04:59:53 localhost nova_compute[281952]: 2025-11-23 09:59:53.476 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:53 localhost kernel: device tapddb7852b-7c entered promiscuous mode Nov 23 04:59:53 localhost NetworkManager[5975]: [1763891993.4838] manager: (tapddb7852b-7c): new Generic device (/org/freedesktop/NetworkManager/Devices/31) Nov 23 04:59:53 localhost nova_compute[281952]: 2025-11-23 09:59:53.484 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:53 localhost ovn_controller[154788]: 2025-11-23T09:59:53Z|00170|binding|INFO|Claiming lport ddb7852b-7c36-4aa5-8295-8df19c4d8b4a for this chassis. Nov 23 04:59:53 localhost ovn_controller[154788]: 2025-11-23T09:59:53Z|00171|binding|INFO|ddb7852b-7c36-4aa5-8295-8df19c4d8b4a: Claiming unknown Nov 23 04:59:53 localhost systemd-udevd[315052]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 04:59:53 localhost ovn_controller[154788]: 2025-11-23T09:59:53Z|00172|binding|INFO|Setting lport ddb7852b-7c36-4aa5-8295-8df19c4d8b4a ovn-installed in OVS Nov 23 04:59:53 localhost nova_compute[281952]: 2025-11-23 09:59:53.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:53 localhost ovn_controller[154788]: 2025-11-23T09:59:53Z|00173|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 04:59:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:53.502 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ddb7852b-7c36-4aa5-8295-8df19c4d8b4a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:59:53 
localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:53.505 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ddb7852b-7c36-4aa5-8295-8df19c4d8b4a in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 04:59:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:53.507 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 04:59:53 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:53.508 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[3de220b8-09c9-4932-a40a-a53f25ff5ace]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:59:53 localhost journal[230249]: ethtool ioctl error on tapddb7852b-7c: No such device Nov 23 04:59:53 localhost nova_compute[281952]: 2025-11-23 09:59:53.519 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:53 localhost ovn_controller[154788]: 2025-11-23T09:59:53Z|00174|binding|INFO|Setting lport ddb7852b-7c36-4aa5-8295-8df19c4d8b4a up in Southbound Nov 23 04:59:53 localhost journal[230249]: ethtool ioctl error on tapddb7852b-7c: No such device Nov 23 04:59:53 localhost journal[230249]: ethtool ioctl error on tapddb7852b-7c: No such device Nov 23 04:59:53 localhost journal[230249]: ethtool ioctl error on tapddb7852b-7c: No such device Nov 23 04:59:53 localhost nova_compute[281952]: 2025-11-23 09:59:53.545 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:53 localhost journal[230249]: ethtool ioctl error on tapddb7852b-7c: No such device Nov 23 04:59:53 localhost journal[230249]: ethtool ioctl 
error on tapddb7852b-7c: No such device Nov 23 04:59:53 localhost journal[230249]: ethtool ioctl error on tapddb7852b-7c: No such device Nov 23 04:59:53 localhost nova_compute[281952]: 2025-11-23 09:59:53.560 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:53 localhost journal[230249]: ethtool ioctl error on tapddb7852b-7c: No such device Nov 23 04:59:53 localhost nova_compute[281952]: 2025-11-23 09:59:53.590 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:54 localhost podman[315124]: Nov 23 04:59:54 localhost podman[315124]: 2025-11-23 09:59:54.414278439 +0000 UTC m=+0.089922861 container create 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 04:59:54 localhost systemd[1]: Started libpod-conmon-59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533.scope. Nov 23 04:59:54 localhost podman[315124]: 2025-11-23 09:59:54.37183137 +0000 UTC m=+0.047475812 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 04:59:54 localhost systemd[1]: Started libcrun container. 
Nov 23 04:59:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4637a3d824d286df4a1b1fcf765afe447a8e7356de1248847cdf31f132b25d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 04:59:54 localhost podman[315124]: 2025-11-23 09:59:54.489721603 +0000 UTC m=+0.165366025 container init 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true) Nov 23 04:59:54 localhost podman[315124]: 2025-11-23 09:59:54.500302208 +0000 UTC m=+0.175946620 container start 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 04:59:54 localhost dnsmasq[315143]: started, version 2.85 cachesize 150 Nov 23 04:59:54 localhost dnsmasq[315143]: DNS service limited to local subnets Nov 23 04:59:54 localhost dnsmasq[315143]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 04:59:54 localhost dnsmasq[315143]: warning: no upstream servers 
configured Nov 23 04:59:54 localhost dnsmasq[315143]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 04:59:54 localhost snmpd[67457]: empty variable list in _query Nov 23 04:59:54 localhost snmpd[67457]: empty variable list in _query Nov 23 04:59:54 localhost snmpd[67457]: empty variable list in _query Nov 23 04:59:54 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:54.737 263258 INFO neutron.agent.dhcp.agent [None req-b5c71915-d143-4e32-b3d1-cfe8e7c6f766 - - - - - -] DHCP configuration for ports {'f4a3c9d5-826f-4752-9fc8-6210e097bf26', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 04:59:54 localhost sshd[315169]: main: sshd: ssh-rsa algorithm is disabled Nov 23 04:59:54 localhost dnsmasq[315143]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 04:59:54 localhost podman[315159]: 2025-11-23 09:59:54.871875565 +0000 UTC m=+0.063275330 container kill 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 04:59:55 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:55.008 263258 INFO neutron.agent.dhcp.agent [None req-fc438f45-a2d7-4875-80ac-5c05f6b61b09 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:52Z, description=, device_id=, device_owner=, dns_assignment=[], 
dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f4a3c9d5-826f-4752-9fc8-6210e097bf26, ip_allocation=immediate, mac_address=fa:16:3e:00:d6:91, name=tempest-NetworksTestDHCPv6-954743884, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=4, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['2ac68f23-53c8-4938-b68e-b9f69a6a44c5'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T09:59:52Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1142, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T09:59:52Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 04:59:55 localhost dnsmasq[315143]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 04:59:55 localhost podman[315199]: 2025-11-23 09:59:55.174768716 +0000 UTC m=+0.058615747 container kill 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 23 04:59:55 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:55.193 263258 INFO neutron.agent.dhcp.agent [None req-3a18f5df-3aea-4672-b539-9e53d1b559a1 - - - - - -] DHCP configuration for ports {'ddb7852b-7c36-4aa5-8295-8df19c4d8b4a', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 04:59:55 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:55.418 263258 INFO neutron.agent.dhcp.agent [None req-c425ca3f-6b9b-418b-9451-0fa7302be87d - - - - - -] DHCP configuration for ports {'f4a3c9d5-826f-4752-9fc8-6210e097bf26'} is completed#033[00m Nov 23 04:59:55 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:55.432 2 INFO neutron.agent.securitygroups_rpc [None req-4a1b8c07-dec2-4711-92de-c07233183ccc 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 04:59:55 localhost dnsmasq[315143]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 04:59:55 localhost podman[315237]: 2025-11-23 09:59:55.501727858 +0000 UTC m=+0.061657781 container kill 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 04:59:56 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:56.166 263258 INFO 
neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T09:59:54Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=38a31dac-5d6f-422a-b220-42a6c32d9bbc, ip_allocation=immediate, mac_address=fa:16:3e:9d:c2:b1, name=tempest-NetworksTestDHCPv6-934745237, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=6, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['5d113a57-a631-4e71-848c-68cfe31da68d'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T09:59:54Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1152, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T09:59:54Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 04:59:56 localhost nova_compute[281952]: 2025-11-23 09:59:56.315 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:56 localhost dnsmasq[315143]: read 
/var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 04:59:56 localhost podman[315276]: 2025-11-23 09:59:56.340834707 +0000 UTC m=+0.062282400 container kill 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 23 04:59:56 localhost nova_compute[281952]: 2025-11-23 09:59:56.407 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:56 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:56.580 263258 INFO neutron.agent.dhcp.agent [None req-8610995d-fcc2-4559-b6ae-ad54e6c521ee - - - - - -] DHCP configuration for ports {'38a31dac-5d6f-422a-b220-42a6c32d9bbc'} is completed#033[00m Nov 23 04:59:57 localhost neutron_sriov_agent[256124]: 2025-11-23 09:59:57.988 2 INFO neutron.agent.securitygroups_rpc [None req-d239cbef-5e7b-4e18-8195-2f02667a16df 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 04:59:58 localhost nova_compute[281952]: 2025-11-23 09:59:58.129 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:58 localhost dnsmasq[315143]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 04:59:58 localhost podman[315314]: 2025-11-23 09:59:58.189254929 +0000 UTC m=+0.064041234 
container kill 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 04:59:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 04:59:58 localhost dnsmasq[315143]: exiting on receipt of SIGTERM Nov 23 04:59:58 localhost systemd[1]: tmp-crun.NJhgVy.mount: Deactivated successfully. Nov 23 04:59:58 localhost systemd[1]: libpod-59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533.scope: Deactivated successfully. 
Nov 23 04:59:58 localhost podman[315353]: 2025-11-23 09:59:58.836323642 +0000 UTC m=+0.069451730 container kill 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:59:58 localhost podman[315365]: 2025-11-23 09:59:58.913780429 +0000 UTC m=+0.065203201 container died 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 23 04:59:58 localhost podman[315365]: 2025-11-23 09:59:58.945815515 +0000 UTC m=+0.097238267 container cleanup 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 23 
04:59:58 localhost systemd[1]: libpod-conmon-59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533.scope: Deactivated successfully. Nov 23 04:59:59 localhost podman[315372]: 2025-11-23 09:59:59.000385437 +0000 UTC m=+0.138069655 container remove 59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 04:59:59 localhost ovn_controller[154788]: 2025-11-23T09:59:59Z|00175|binding|INFO|Releasing lport ddb7852b-7c36-4aa5-8295-8df19c4d8b4a from this chassis (sb_readonly=0) Nov 23 04:59:59 localhost kernel: device tapddb7852b-7c left promiscuous mode Nov 23 04:59:59 localhost ovn_controller[154788]: 2025-11-23T09:59:59Z|00176|binding|INFO|Setting lport ddb7852b-7c36-4aa5-8295-8df19c4d8b4a down in Southbound Nov 23 04:59:59 localhost nova_compute[281952]: 2025-11-23 09:59:59.057 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:59 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:59.074 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 
'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ddb7852b-7c36-4aa5-8295-8df19c4d8b4a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:59:59 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:59.075 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ddb7852b-7c36-4aa5-8295-8df19c4d8b4a in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 04:59:59 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:59.076 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 04:59:59 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:59.077 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[82370e26-4b2c-4dd4-9376-e6340f32cfdc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:59:59 localhost nova_compute[281952]: 2025-11-23 09:59:59.082 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:59 localhost systemd[1]: var-lib-containers-storage-overlay-c4637a3d824d286df4a1b1fcf765afe447a8e7356de1248847cdf31f132b25d9-merged.mount: Deactivated successfully. Nov 23 04:59:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59b2e253a6d82067486c9cf0860138ce509dd979a1140ba262d1d18f6b7db533-userdata-shm.mount: Deactivated successfully. Nov 23 04:59:59 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:59.496 263258 INFO neutron.agent.dhcp.agent [None req-5d1e6f94-57c1-4b29-97f5-d5053daeb570 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 04:59:59 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. Nov 23 04:59:59 localhost neutron_dhcp_agent[263254]: 2025-11-23 09:59:59.902 263258 INFO neutron.agent.linux.ip_lib [None req-168fae75-3402-4a90-b826-664c5b0144bc - - - - - -] Device tape328d85c-9f cannot be used as it has no MAC address#033[00m Nov 23 04:59:59 localhost nova_compute[281952]: 2025-11-23 09:59:59.925 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:59 localhost kernel: device tape328d85c-9f entered promiscuous mode Nov 23 04:59:59 localhost NetworkManager[5975]: [1763891999.9337] manager: (tape328d85c-9f): new Generic device (/org/freedesktop/NetworkManager/Devices/32) Nov 23 04:59:59 localhost ovn_controller[154788]: 2025-11-23T09:59:59Z|00177|binding|INFO|Claiming lport e328d85c-9f4d-4a9c-9609-f789abfbba67 for this chassis. Nov 23 04:59:59 localhost ovn_controller[154788]: 2025-11-23T09:59:59Z|00178|binding|INFO|e328d85c-9f4d-4a9c-9609-f789abfbba67: Claiming unknown Nov 23 04:59:59 localhost systemd-udevd[315404]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 04:59:59 localhost nova_compute[281952]: 2025-11-23 09:59:59.942 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:59 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:59.949 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-7292f404-64ea-4ef3-b81e-f698709e4eec', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7292f404-64ea-4ef3-b81e-f698709e4eec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd38528c3feb64b31add54cee7508cb83', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=02c6b597-012a-4c61-8f3c-0e9a58e1964d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e328d85c-9f4d-4a9c-9609-f789abfbba67) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 04:59:59 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:59.951 160439 INFO neutron.agent.ovn.metadata.agent [-] Port e328d85c-9f4d-4a9c-9609-f789abfbba67 in datapath 7292f404-64ea-4ef3-b81e-f698709e4eec bound to our chassis#033[00m Nov 23 04:59:59 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:59.953 160439 DEBUG 
neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7292f404-64ea-4ef3-b81e-f698709e4eec or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 04:59:59 localhost ovn_metadata_agent[160434]: 2025-11-23 09:59:59.954 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cb35318d-b8a3-4a87-992c-74c6a610421e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 04:59:59 localhost journal[230249]: ethtool ioctl error on tape328d85c-9f: No such device Nov 23 04:59:59 localhost journal[230249]: ethtool ioctl error on tape328d85c-9f: No such device Nov 23 04:59:59 localhost ovn_controller[154788]: 2025-11-23T09:59:59Z|00179|binding|INFO|Setting lport e328d85c-9f4d-4a9c-9609-f789abfbba67 ovn-installed in OVS Nov 23 04:59:59 localhost ovn_controller[154788]: 2025-11-23T09:59:59Z|00180|binding|INFO|Setting lport e328d85c-9f4d-4a9c-9609-f789abfbba67 up in Southbound Nov 23 04:59:59 localhost nova_compute[281952]: 2025-11-23 09:59:59.972 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 04:59:59 localhost journal[230249]: ethtool ioctl error on tape328d85c-9f: No such device Nov 23 04:59:59 localhost journal[230249]: ethtool ioctl error on tape328d85c-9f: No such device Nov 23 04:59:59 localhost openstack_network_exporter[242668]: ERROR 09:59:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 04:59:59 localhost openstack_network_exporter[242668]: ERROR 09:59:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 04:59:59 localhost openstack_network_exporter[242668]: ERROR 09:59:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for 
ovn-northd Nov 23 04:59:59 localhost journal[230249]: ethtool ioctl error on tape328d85c-9f: No such device Nov 23 04:59:59 localhost openstack_network_exporter[242668]: ERROR 09:59:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 04:59:59 localhost openstack_network_exporter[242668]: Nov 23 04:59:59 localhost openstack_network_exporter[242668]: ERROR 09:59:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 04:59:59 localhost openstack_network_exporter[242668]: Nov 23 04:59:59 localhost journal[230249]: ethtool ioctl error on tape328d85c-9f: No such device Nov 23 05:00:00 localhost journal[230249]: ethtool ioctl error on tape328d85c-9f: No such device Nov 23 05:00:00 localhost journal[230249]: ethtool ioctl error on tape328d85c-9f: No such device Nov 23 05:00:00 localhost nova_compute[281952]: 2025-11-23 10:00:00.013 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:00 localhost nova_compute[281952]: 2025-11-23 10:00:00.048 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:00 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:00.277 2 INFO neutron.agent.securitygroups_rpc [None req-07e0e453-973e-4bb8-a740-58232b03adf2 e78ebdfe612745638abad47217c77d70 a40d996843764f32a4281f01703f5aee - - default default] Security group member updated ['e81e3952-d0ad-411e-a904-c021d2ed129c']#033[00m Nov 23 05:00:00 localhost ceph-mon[300199]: overall HEALTH_OK Nov 23 05:00:00 localhost podman[315475]: Nov 23 05:00:00 localhost podman[315475]: 2025-11-23 10:00:00.852064279 +0000 UTC m=+0.084086292 container create 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:00 localhost systemd[1]: Started libpod-conmon-74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537.scope. Nov 23 05:00:00 localhost podman[315475]: 2025-11-23 10:00:00.810135277 +0000 UTC m=+0.042157290 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:00:00 localhost systemd[1]: Started libcrun container. Nov 23 05:00:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc1d3c6a45bfeaa6a92f58a5f600b24b7eb89e79f1a1294714d9a566d06522ea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:00:00 localhost podman[315475]: 2025-11-23 10:00:00.942970919 +0000 UTC m=+0.174992902 container init 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 05:00:00 localhost podman[315475]: 2025-11-23 10:00:00.951930534 +0000 UTC m=+0.183952507 container start 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 23 05:00:00 localhost dnsmasq[315494]: started, version 2.85 cachesize 150 Nov 23 05:00:00 localhost dnsmasq[315494]: DNS service limited to local subnets Nov 23 05:00:00 localhost dnsmasq[315494]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:00:00 localhost dnsmasq[315494]: warning: no upstream servers configured Nov 23 05:00:00 localhost dnsmasq-dhcp[315494]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:00:00 localhost dnsmasq[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/addn_hosts - 0 addresses Nov 23 05:00:00 localhost dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/host Nov 23 05:00:00 localhost dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/opts Nov 23 05:00:01 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:01.072 263258 INFO neutron.agent.linux.ip_lib [None req-5d7b1cd6-bebe-493a-a524-d593c0794e40 - - - - - -] Device tap52f1f0cb-3e cannot be used as it has no MAC address#033[00m Nov 23 05:00:01 localhost nova_compute[281952]: 2025-11-23 10:00:01.096 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:01 localhost kernel: device tap52f1f0cb-3e entered promiscuous mode Nov 23 05:00:01 localhost NetworkManager[5975]: [1763892001.1032] manager: (tap52f1f0cb-3e): new Generic device 
(/org/freedesktop/NetworkManager/Devices/33) Nov 23 05:00:01 localhost ovn_controller[154788]: 2025-11-23T10:00:01Z|00181|binding|INFO|Claiming lport 52f1f0cb-3ec5-4ffc-9a9d-29176c63170c for this chassis. Nov 23 05:00:01 localhost ovn_controller[154788]: 2025-11-23T10:00:01Z|00182|binding|INFO|52f1f0cb-3ec5-4ffc-9a9d-29176c63170c: Claiming unknown Nov 23 05:00:01 localhost nova_compute[281952]: 2025-11-23 10:00:01.103 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:01 localhost ovn_controller[154788]: 2025-11-23T10:00:01Z|00183|binding|INFO|Setting lport 52f1f0cb-3ec5-4ffc-9a9d-29176c63170c ovn-installed in OVS Nov 23 05:00:01 localhost ovn_controller[154788]: 2025-11-23T10:00:01Z|00184|binding|INFO|Setting lport 52f1f0cb-3ec5-4ffc-9a9d-29176c63170c up in Southbound Nov 23 05:00:01 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:01.119 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], 
additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=52f1f0cb-3ec5-4ffc-9a9d-29176c63170c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:01 localhost nova_compute[281952]: 2025-11-23 10:00:01.121 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:01 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:01.123 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 52f1f0cb-3ec5-4ffc-9a9d-29176c63170c in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 05:00:01 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:01.125 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:01 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:01.125 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e4953a1c-3986-478d-b6b3-ee287c275edd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:01 localhost nova_compute[281952]: 2025-11-23 10:00:01.141 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:01 localhost nova_compute[281952]: 2025-11-23 10:00:01.148 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:01 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:01.173 263258 INFO neutron.agent.dhcp.agent [None 
req-a7bd10e3-34c5-4365-8cc0-2551d9d95195 - - - - - -] DHCP configuration for ports {'8be851d6-7884-4472-9a21-8b074f4b4419'} is completed#033[00m Nov 23 05:00:01 localhost nova_compute[281952]: 2025-11-23 10:00:01.211 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:01 localhost nova_compute[281952]: 2025-11-23 10:00:01.320 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:01 localhost nova_compute[281952]: 2025-11-23 10:00:01.409 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:01 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:01.668 2 INFO neutron.agent.securitygroups_rpc [None req-68228748-d1d1-4e93-958e-faf2dd4d659b 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:00:01 localhost podman[315558]: Nov 23 05:00:01 localhost podman[315558]: 2025-11-23 10:00:01.998723792 +0000 UTC m=+0.089783197 container create 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 23 05:00:02 localhost systemd[1]: Started libpod-conmon-4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6.scope. 
Nov 23 05:00:02 localhost systemd[1]: Started libcrun container. Nov 23 05:00:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e345bc7b1352ed7e306c74f9fd2b5a68f452619862709da970e829614cf1996d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:00:02 localhost podman[315558]: 2025-11-23 10:00:01.953522369 +0000 UTC m=+0.044581874 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:00:02 localhost podman[315558]: 2025-11-23 10:00:02.059254947 +0000 UTC m=+0.150314352 container init 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 05:00:02 localhost podman[315558]: 2025-11-23 10:00:02.0674649 +0000 UTC m=+0.158524325 container start 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 23 05:00:02 localhost dnsmasq[315576]: started, version 2.85 cachesize 150 Nov 23 05:00:02 localhost dnsmasq[315576]: DNS service limited to local subnets Nov 23 05:00:02 localhost 
dnsmasq[315576]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:00:02 localhost dnsmasq[315576]: warning: no upstream servers configured Nov 23 05:00:02 localhost dnsmasq-dhcp[315576]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:00:02 localhost dnsmasq[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:02 localhost dnsmasq-dhcp[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:02 localhost dnsmasq-dhcp[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:02 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:02.107 2 INFO neutron.agent.securitygroups_rpc [None req-ba51cecc-c6ac-46c9-99a7-bae53245da97 e78ebdfe612745638abad47217c77d70 a40d996843764f32a4281f01703f5aee - - default default] Security group member updated ['e81e3952-d0ad-411e-a904-c021d2ed129c']#033[00m Nov 23 05:00:02 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:02.123 263258 INFO neutron.agent.dhcp.agent [None req-5d7b1cd6-bebe-493a-a524-d593c0794e40 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:00Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b9f1c825-1100-4db2-a98a-7d9238cb1fc7, ip_allocation=immediate, mac_address=fa:16:3e:46:c3:1d, name=tempest-NetworksTestDHCPv6-99455620, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, 
name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=8, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['2a51fc6e-7d45-4225-8d84-a35a19a90f22'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T09:59:59Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1189, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:01Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:00:02 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:02.144 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:02 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:02.225 263258 INFO neutron.agent.dhcp.agent [None req-13352e43-82c6-489f-bfa7-5e9abe7e047f - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:00:02 localhost podman[315593]: 2025-11-23 10:00:02.324964501 +0000 UTC m=+0.064122755 container kill 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:00:02 localhost dnsmasq[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 05:00:02 localhost dnsmasq-dhcp[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:02 localhost dnsmasq-dhcp[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:02 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:02.571 263258 INFO neutron.agent.dhcp.agent [None req-397dccf5-f60e-49c7-b245-9a8c97791b23 - - - - - -] DHCP configuration for ports {'b9f1c825-1100-4db2-a98a-7d9238cb1fc7'} is completed#033[00m Nov 23 05:00:02 localhost ovn_controller[154788]: 2025-11-23T10:00:02Z|00185|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:00:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 05:00:02 localhost nova_compute[281952]: 2025-11-23 10:00:02.744 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:02 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:02.745 2 INFO neutron.agent.securitygroups_rpc [None req-be23f1ad-34a8-40d2-b634-8fb333cbd4a1 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:00:02 localhost podman[315615]: 2025-11-23 10:00:02.807652451 +0000 UTC m=+0.106887143 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:00:02 localhost podman[315615]: 2025-11-23 10:00:02.823375945 +0000 UTC m=+0.122610617 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 23 05:00:02 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 05:00:02 localhost dnsmasq[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:02 localhost dnsmasq-dhcp[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:02 localhost dnsmasq-dhcp[315576]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:02 localhost podman[315649]: 2025-11-23 10:00:02.94752511 +0000 UTC m=+0.064558770 container kill 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2) Nov 23 05:00:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:00:03 localhost podman[315688]: 2025-11-23 10:00:03.829930573 +0000 UTC m=+0.057929496 container kill 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:00:03 localhost dnsmasq[315576]: exiting on receipt of SIGTERM Nov 23 05:00:03 localhost systemd[1]: libpod-4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6.scope: Deactivated successfully. Nov 23 05:00:03 localhost podman[315702]: 2025-11-23 10:00:03.888723824 +0000 UTC m=+0.048135835 container died 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 23 05:00:03 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6-userdata-shm.mount: Deactivated successfully. Nov 23 05:00:03 localhost systemd[1]: var-lib-containers-storage-overlay-e345bc7b1352ed7e306c74f9fd2b5a68f452619862709da970e829614cf1996d-merged.mount: Deactivated successfully. 
Nov 23 05:00:03 localhost podman[315702]: 2025-11-23 10:00:03.925510787 +0000 UTC m=+0.084922748 container cleanup 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 05:00:03 localhost systemd[1]: libpod-conmon-4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6.scope: Deactivated successfully. Nov 23 05:00:03 localhost podman[315704]: 2025-11-23 10:00:03.970442791 +0000 UTC m=+0.121947818 container remove 4a02abd307fc4661f125e37e7fb7236172d96e1fe83b8401e96c3d7afd5038a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 05:00:04 localhost kernel: device tap52f1f0cb-3e left promiscuous mode Nov 23 05:00:04 localhost nova_compute[281952]: 2025-11-23 10:00:04.017 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:04 localhost ovn_controller[154788]: 2025-11-23T10:00:04Z|00186|binding|INFO|Releasing lport 52f1f0cb-3ec5-4ffc-9a9d-29176c63170c from this chassis (sb_readonly=0) Nov 23 05:00:04 localhost ovn_controller[154788]: 
2025-11-23T10:00:04Z|00187|binding|INFO|Setting lport 52f1f0cb-3ec5-4ffc-9a9d-29176c63170c down in Southbound Nov 23 05:00:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:04.034 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=52f1f0cb-3ec5-4ffc-9a9d-29176c63170c) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:04.036 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 52f1f0cb-3ec5-4ffc-9a9d-29176c63170c in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 05:00:04 localhost nova_compute[281952]: 2025-11-23 10:00:04.036 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:04.038 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:04.039 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[490d65d2-0839-464f-9946-62df02652edb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:04.306 263258 INFO neutron.agent.dhcp.agent [None req-1e2c7cbf-448b-437e-99cc-11c814803bad - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 05:00:04 localhost podman[315731]: 2025-11-23 10:00:04.407797694 +0000 UTC m=+0.079854232 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 05:00:04 localhost podman[315731]: 2025-11-23 10:00:04.422413464 +0000 UTC m=+0.094469992 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 05:00:04 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:00:04 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. Nov 23 05:00:05 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:05.740 263258 INFO neutron.agent.linux.ip_lib [None req-6377a058-1dee-49ef-a9ac-2ba6ba735af3 - - - - - -] Device tap1361841c-06 cannot be used as it has no MAC address#033[00m Nov 23 05:00:05 localhost nova_compute[281952]: 2025-11-23 10:00:05.763 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:05 localhost kernel: device tap1361841c-06 entered promiscuous mode Nov 23 05:00:05 localhost NetworkManager[5975]: [1763892005.7720] manager: (tap1361841c-06): new Generic device (/org/freedesktop/NetworkManager/Devices/34) Nov 23 05:00:05 localhost ovn_controller[154788]: 2025-11-23T10:00:05Z|00188|binding|INFO|Claiming lport 1361841c-06ad-4d69-bc3f-025652255be1 for this chassis. Nov 23 05:00:05 localhost ovn_controller[154788]: 2025-11-23T10:00:05Z|00189|binding|INFO|1361841c-06ad-4d69-bc3f-025652255be1: Claiming unknown Nov 23 05:00:05 localhost nova_compute[281952]: 2025-11-23 10:00:05.774 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:05 localhost systemd-udevd[315762]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:00:05 localhost ovn_controller[154788]: 2025-11-23T10:00:05Z|00190|binding|INFO|Setting lport 1361841c-06ad-4d69-bc3f-025652255be1 ovn-installed in OVS Nov 23 05:00:05 localhost ovn_controller[154788]: 2025-11-23T10:00:05Z|00191|binding|INFO|Setting lport 1361841c-06ad-4d69-bc3f-025652255be1 up in Southbound Nov 23 05:00:05 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:05.783 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1361841c-06ad-4d69-bc3f-025652255be1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:05 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:05.785 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1361841c-06ad-4d69-bc3f-025652255be1 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m 
Nov 23 05:00:05 localhost nova_compute[281952]: 2025-11-23 10:00:05.785 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:05 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:05.786 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:05 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:05.787 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2741a8aa-8841-49c8-a4a5-960ee92e1173]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:05 localhost nova_compute[281952]: 2025-11-23 10:00:05.809 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:05 localhost nova_compute[281952]: 2025-11-23 10:00:05.816 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:05 localhost nova_compute[281952]: 2025-11-23 10:00:05.845 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:05 localhost nova_compute[281952]: 2025-11-23 10:00:05.871 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:06 localhost nova_compute[281952]: 2025-11-23 10:00:06.343 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:06 localhost nova_compute[281952]: 2025-11-23 10:00:06.410 281956 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:06 localhost podman[315818]: Nov 23 05:00:06 localhost podman[315818]: 2025-11-23 10:00:06.680265168 +0000 UTC m=+0.077048684 container create a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:00:06 localhost systemd[1]: Started libpod-conmon-a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9.scope. Nov 23 05:00:06 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:06.734 2 INFO neutron.agent.securitygroups_rpc [None req-43a1a170-ce3b-4775-9849-731eb3e4f92f 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:00:06 localhost systemd[1]: tmp-crun.JgwRav.mount: Deactivated successfully. Nov 23 05:00:06 localhost systemd[1]: Started libcrun container. 
Nov 23 05:00:06 localhost podman[315818]: 2025-11-23 10:00:06.647198069 +0000 UTC m=+0.043981625 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:00:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8e400ebade1d5745f819f7e784ec959ce2672dd5f201c317d18abdd769f4f96/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:00:06 localhost podman[315818]: 2025-11-23 10:00:06.762203752 +0000 UTC m=+0.158987298 container init a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 05:00:06 localhost podman[315818]: 2025-11-23 10:00:06.773526941 +0000 UTC m=+0.170310467 container start a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 05:00:06 localhost dnsmasq[315837]: started, version 2.85 cachesize 150 Nov 23 05:00:06 localhost dnsmasq[315837]: DNS service limited to local subnets Nov 23 05:00:06 localhost dnsmasq[315837]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:00:06 localhost dnsmasq[315837]: warning: no upstream servers configured Nov 23 05:00:06 localhost dnsmasq[315837]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:06 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:06.864 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:05Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=748d6a3f-f1c6-434f-b00a-7c0ef0e7aba9, ip_allocation=immediate, mac_address=fa:16:3e:a7:25:4d, name=tempest-NetworksTestDHCPv6-1735144909, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=10, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['ef90d8b7-7256-412e-b516-d1aad7604a50'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:04Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1203, status=DOWN, tags=[], 
tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:06Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:00:06 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:06.913 263258 INFO neutron.agent.dhcp.agent [None req-e60af20a-8387-4929-a046-275fef5541b1 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:00:07 localhost dnsmasq[315837]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 05:00:07 localhost podman[315854]: 2025-11-23 10:00:07.053613329 +0000 UTC m=+0.062879758 container kill a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:00:07 localhost nova_compute[281952]: 2025-11-23 10:00:07.216 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:07 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:07.280 263258 INFO neutron.agent.dhcp.agent [None req-614445ea-dfe2-427b-b0a8-794db535cf75 - - - - - -] DHCP configuration for ports {'748d6a3f-f1c6-434f-b00a-7c0ef0e7aba9'} is completed#033[00m Nov 23 05:00:08 localhost nova_compute[281952]: 2025-11-23 10:00:08.084 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 
inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:00:08 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:08.411 2 INFO neutron.agent.securitygroups_rpc [None req-c2842295-e0a9-464d-87c0-40db2e234814 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:00:08 localhost dnsmasq[315837]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:08 localhost podman[315894]: 2025-11-23 10:00:08.651261045 +0000 UTC m=+0.051500478 container kill a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 23 05:00:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:08.911 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:08Z, description=, device_id=6c959180-536e-4cbb-a6e5-3082c340988b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4a56a8b3-653a-448c-b174-df8ff4b0cddd, ip_allocation=immediate, mac_address=fa:16:3e:2c:29:e9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:56Z, description=, dns_domain=, id=7292f404-64ea-4ef3-b81e-f698709e4eec, 
ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1432555221-network, port_security_enabled=True, project_id=d38528c3feb64b31add54cee7508cb83, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62314, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1166, status=ACTIVE, subnets=['6dac9eaf-88ad-4550-8301-df230ecdba75'], tags=[], tenant_id=d38528c3feb64b31add54cee7508cb83, updated_at=2025-11-23T09:59:58Z, vlan_transparent=None, network_id=7292f404-64ea-4ef3-b81e-f698709e4eec, port_security_enabled=False, project_id=d38528c3feb64b31add54cee7508cb83, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1211, status=DOWN, tags=[], tenant_id=d38528c3feb64b31add54cee7508cb83, updated_at=2025-11-23T10:00:08Z on network 7292f404-64ea-4ef3-b81e-f698709e4eec#033[00m Nov 23 05:00:09 localhost dnsmasq[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/addn_hosts - 1 addresses Nov 23 05:00:09 localhost podman[315932]: 2025-11-23 10:00:09.139562587 +0000 UTC m=+0.056765159 container kill 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS) Nov 23 05:00:09 localhost dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/host Nov 23 05:00:09 localhost dnsmasq-dhcp[315494]: read 
/var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/opts Nov 23 05:00:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:09.299 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:00:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:09.299 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:00:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:09.300 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:00:09 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:09.378 263258 INFO neutron.agent.dhcp.agent [None req-976ea8c9-fe78-4fd1-a780-eaea2a2a7b47 - - - - - -] DHCP configuration for ports {'4a56a8b3-653a-448c-b174-df8ff4b0cddd'} is completed#033[00m Nov 23 05:00:09 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:09.383 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:09 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:09.841 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:08Z, description=, device_id=6c959180-536e-4cbb-a6e5-3082c340988b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, 
dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4a56a8b3-653a-448c-b174-df8ff4b0cddd, ip_allocation=immediate, mac_address=fa:16:3e:2c:29:e9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:56Z, description=, dns_domain=, id=7292f404-64ea-4ef3-b81e-f698709e4eec, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1432555221-network, port_security_enabled=True, project_id=d38528c3feb64b31add54cee7508cb83, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62314, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1166, status=ACTIVE, subnets=['6dac9eaf-88ad-4550-8301-df230ecdba75'], tags=[], tenant_id=d38528c3feb64b31add54cee7508cb83, updated_at=2025-11-23T09:59:58Z, vlan_transparent=None, network_id=7292f404-64ea-4ef3-b81e-f698709e4eec, port_security_enabled=False, project_id=d38528c3feb64b31add54cee7508cb83, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1211, status=DOWN, tags=[], tenant_id=d38528c3feb64b31add54cee7508cb83, updated_at=2025-11-23T10:00:08Z on network 7292f404-64ea-4ef3-b81e-f698709e4eec#033[00m Nov 23 05:00:10 localhost systemd[1]: tmp-crun.Iw6nSX.mount: Deactivated successfully. 
Nov 23 05:00:10 localhost dnsmasq[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/addn_hosts - 1 addresses Nov 23 05:00:10 localhost dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/host Nov 23 05:00:10 localhost dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/opts Nov 23 05:00:10 localhost podman[315981]: 2025-11-23 10:00:10.128738979 +0000 UTC m=+0.066321205 container kill 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118) Nov 23 05:00:10 localhost dnsmasq[315837]: exiting on receipt of SIGTERM Nov 23 05:00:10 localhost podman[315998]: 2025-11-23 10:00:10.184473506 +0000 UTC m=+0.051890130 container kill a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 05:00:10 localhost systemd[1]: libpod-a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9.scope: Deactivated successfully. 
Nov 23 05:00:10 localhost podman[316015]: 2025-11-23 10:00:10.259243259 +0000 UTC m=+0.054759568 container died a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9-userdata-shm.mount: Deactivated successfully. Nov 23 05:00:10 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:10.350 263258 INFO neutron.agent.dhcp.agent [None req-1470576d-d2ca-4f1e-8345-83ef73dfaf8e - - - - - -] DHCP configuration for ports {'4a56a8b3-653a-448c-b174-df8ff4b0cddd'} is completed#033[00m Nov 23 05:00:10 localhost podman[316015]: 2025-11-23 10:00:10.366831153 +0000 UTC m=+0.162347452 container remove a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 23 05:00:10 localhost systemd[1]: libpod-conmon-a50676e1296699a3ed38b6212ac5c6d4a6d2503a70b1252fed721ca96db7b1b9.scope: Deactivated successfully. 
Nov 23 05:00:10 localhost ovn_controller[154788]: 2025-11-23T10:00:10Z|00192|binding|INFO|Releasing lport 1361841c-06ad-4d69-bc3f-025652255be1 from this chassis (sb_readonly=0) Nov 23 05:00:10 localhost ovn_controller[154788]: 2025-11-23T10:00:10Z|00193|binding|INFO|Setting lport 1361841c-06ad-4d69-bc3f-025652255be1 down in Southbound Nov 23 05:00:10 localhost kernel: device tap1361841c-06 left promiscuous mode Nov 23 05:00:10 localhost nova_compute[281952]: 2025-11-23 10:00:10.382 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:10 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:10.397 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1361841c-06ad-4d69-bc3f-025652255be1) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:10 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:10.398 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1361841c-06ad-4d69-bc3f-025652255be1 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 05:00:10 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:10.399 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:10 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:10.400 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc12948-2404-42e8-b030-bfb1f3359faa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:10 localhost nova_compute[281952]: 2025-11-23 10:00:10.401 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:10 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:10.728 263258 INFO neutron.agent.dhcp.agent [None req-01df078b-a456-43c1-8df4-ab8e3d09ebe5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:11 localhost systemd[1]: var-lib-containers-storage-overlay-a8e400ebade1d5745f819f7e784ec959ce2672dd5f201c317d18abdd769f4f96-merged.mount: Deactivated successfully. Nov 23 05:00:11 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. 
Nov 23 05:00:11 localhost nova_compute[281952]: 2025-11-23 10:00:11.389 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:11 localhost nova_compute[281952]: 2025-11-23 10:00:11.412 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:11 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:11.439 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:11 localhost podman[240668]: time="2025-11-23T10:00:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:00:11 localhost podman[240668]: @ - - [23/Nov/2025:10:00:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1" Nov 23 05:00:11 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:11.912 263258 INFO neutron.agent.linux.ip_lib [None req-9abd4672-52f3-4ee0-bc5e-3d41ce6ff1f9 - - - - - -] Device tape1ef4443-64 cannot be used as it has no MAC address#033[00m Nov 23 05:00:11 localhost nova_compute[281952]: 2025-11-23 10:00:11.940 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:11 localhost kernel: device tape1ef4443-64 entered promiscuous mode Nov 23 05:00:11 localhost ovn_controller[154788]: 2025-11-23T10:00:11Z|00194|binding|INFO|Claiming lport e1ef4443-64c5-4e20-a52c-3034596e7966 for this chassis. 
Nov 23 05:00:11 localhost NetworkManager[5975]: [1763892011.9488] manager: (tape1ef4443-64): new Generic device (/org/freedesktop/NetworkManager/Devices/35) Nov 23 05:00:11 localhost ovn_controller[154788]: 2025-11-23T10:00:11Z|00195|binding|INFO|e1ef4443-64c5-4e20-a52c-3034596e7966: Claiming unknown Nov 23 05:00:11 localhost podman[240668]: @ - - [23/Nov/2025:10:00:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19238 "" "Go-http-client/1.1" Nov 23 05:00:11 localhost nova_compute[281952]: 2025-11-23 10:00:11.950 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:11 localhost systemd-udevd[316058]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:00:11 localhost ovn_controller[154788]: 2025-11-23T10:00:11Z|00196|binding|INFO|Setting lport e1ef4443-64c5-4e20-a52c-3034596e7966 ovn-installed in OVS Nov 23 05:00:11 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:11.960 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e1ef4443-64c5-4e20-a52c-3034596e7966) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:11 localhost ovn_controller[154788]: 2025-11-23T10:00:11Z|00197|binding|INFO|Setting lport e1ef4443-64c5-4e20-a52c-3034596e7966 up in Southbound Nov 23 05:00:11 localhost nova_compute[281952]: 2025-11-23 10:00:11.962 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:11 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:11.964 160439 INFO neutron.agent.ovn.metadata.agent [-] Port e1ef4443-64c5-4e20-a52c-3034596e7966 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 05:00:11 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:11.965 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:11 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:11.966 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[efe1fad7-83b9-4162-9a15-a9b4c93ca24f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:11 localhost journal[230249]: ethtool ioctl error on tape1ef4443-64: No such device Nov 23 05:00:11 localhost nova_compute[281952]: 2025-11-23 10:00:11.993 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:11 localhost journal[230249]: ethtool ioctl error on tape1ef4443-64: No such device Nov 23 05:00:11 localhost journal[230249]: ethtool ioctl error on tape1ef4443-64: No such device Nov 23 05:00:12 localhost journal[230249]: ethtool ioctl error on tape1ef4443-64: No such device Nov 23 05:00:12 localhost journal[230249]: ethtool ioctl error on tape1ef4443-64: No such device Nov 23 05:00:12 localhost journal[230249]: ethtool ioctl error on tape1ef4443-64: No such device Nov 23 05:00:12 localhost journal[230249]: ethtool ioctl error on tape1ef4443-64: No such device Nov 23 05:00:12 localhost journal[230249]: ethtool ioctl error on tape1ef4443-64: No such device Nov 23 05:00:12 localhost nova_compute[281952]: 2025-11-23 10:00:12.036 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:12 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:12.053 2 INFO neutron.agent.securitygroups_rpc [None req-311a0dd8-e6e7-491c-ad02-2876db83aabe 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:00:12 localhost nova_compute[281952]: 2025-11-23 10:00:12.064 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:12 localhost podman[316129]: Nov 23 05:00:12 localhost podman[316129]: 2025-11-23 10:00:12.876435323 +0000 UTC m=+0.081943745 container create 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 23 05:00:12 localhost systemd[1]: Started libpod-conmon-326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522.scope. Nov 23 05:00:12 localhost podman[316129]: 2025-11-23 10:00:12.83835188 +0000 UTC m=+0.043860302 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:00:12 localhost systemd[1]: Started libcrun container. Nov 23 05:00:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9db4e9f89b2d873d6f9e5ee936e95cdb8a3e819e23918b397cc465f572f0c23e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:00:12 localhost podman[316129]: 2025-11-23 10:00:12.957206602 +0000 UTC m=+0.162715024 container init 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:12 localhost podman[316129]: 2025-11-23 10:00:12.966307552 +0000 UTC m=+0.171815974 container start 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:00:12 localhost dnsmasq[316147]: started, version 2.85 cachesize 150 Nov 23 05:00:12 localhost dnsmasq[316147]: DNS service limited to local subnets Nov 23 05:00:12 localhost dnsmasq[316147]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:00:12 localhost dnsmasq[316147]: warning: no upstream servers configured Nov 23 05:00:12 localhost dnsmasq-dhcp[316147]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:00:12 localhost dnsmasq[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:12 localhost dnsmasq-dhcp[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:12 localhost dnsmasq-dhcp[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:13.025 263258 INFO neutron.agent.dhcp.agent [None req-9abd4672-52f3-4ee0-bc5e-3d41ce6ff1f9 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:11Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8ff2de60-ed9d-46a1-a09b-0f3c4a9e36af, ip_allocation=immediate, mac_address=fa:16:3e:d5:50:83, name=tempest-NetworksTestDHCPv6-1256092217, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, 
mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=12, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['02326955-b246-4007-9c83-62554b760510'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:10Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1232, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:11Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:00:13 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:13.109 2 INFO neutron.agent.securitygroups_rpc [None req-e5debdc8-a6b2-4239-b0b7-2d251ec66c55 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:00:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:13.124 263258 INFO neutron.agent.dhcp.agent [None req-79dc8a8d-cdb6-4c30-8c66-e0cb306034b7 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:00:13 localhost dnsmasq[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 05:00:13 localhost dnsmasq-dhcp[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:13 localhost podman[316166]: 2025-11-23 10:00:13.203274402 +0000 UTC m=+0.064112146 container kill 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:00:13 localhost dnsmasq-dhcp[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:13 localhost nova_compute[281952]: 2025-11-23 10:00:13.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:00:13 localhost nova_compute[281952]: 2025-11-23 10:00:13.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:00:13 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:00:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:13.441 263258 INFO neutron.agent.dhcp.agent [None req-b4e79595-aeb3-4100-becc-38a590576024 - - - - - -] DHCP configuration for ports {'8ff2de60-ed9d-46a1-a09b-0f3c4a9e36af'} is completed#033[00m Nov 23 05:00:13 localhost dnsmasq[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:13 localhost dnsmasq-dhcp[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:13 localhost 
dnsmasq-dhcp[316147]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:13 localhost podman[316204]: 2025-11-23 10:00:13.509768143 +0000 UTC m=+0.048671690 container kill 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:00:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:00:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:00:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 05:00:13 localhost podman[316224]: 2025-11-23 10:00:13.78526571 +0000 UTC m=+0.087767465 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible) Nov 23 05:00:13 localhost podman[316226]: 2025-11-23 10:00:13.850381026 +0000 UTC m=+0.148940580 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, 
managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Nov 23 05:00:13 localhost podman[316226]: 2025-11-23 10:00:13.862235151 +0000 UTC m=+0.160794695 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc.) 
Nov 23 05:00:13 localhost podman[316224]: 2025-11-23 10:00:13.872724595 +0000 UTC m=+0.175226390 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 23 05:00:13 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:00:13 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:00:13 localhost podman[316225]: 2025-11-23 10:00:13.762247501 +0000 UTC m=+0.066462179 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 23 05:00:13 localhost podman[316225]: 2025-11-23 10:00:13.945168966 +0000 UTC 
m=+0.249383674 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:00:13 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 05:00:14 localhost nova_compute[281952]: 2025-11-23 10:00:14.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:00:14 localhost nova_compute[281952]: 2025-11-23 10:00:14.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 05:00:14 localhost dnsmasq[316147]: exiting on receipt of SIGTERM Nov 23 05:00:14 localhost podman[316302]: 2025-11-23 10:00:14.233737435 +0000 UTC m=+0.061556887 container kill 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 05:00:14 localhost systemd[1]: libpod-326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522.scope: Deactivated successfully. Nov 23 05:00:14 localhost nova_compute[281952]: 2025-11-23 10:00:14.297 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m Nov 23 05:00:14 localhost podman[316316]: 2025-11-23 10:00:14.304752153 +0000 UTC m=+0.056377847 container died 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:00:14 localhost systemd[1]: tmp-crun.9af0m0.mount: Deactivated successfully. Nov 23 05:00:14 localhost podman[316316]: 2025-11-23 10:00:14.339947707 +0000 UTC m=+0.091573371 container cleanup 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 05:00:14 localhost systemd[1]: libpod-conmon-326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522.scope: Deactivated successfully. 
Nov 23 05:00:14 localhost podman[316318]: 2025-11-23 10:00:14.391556487 +0000 UTC m=+0.135227246 container remove 326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:14 localhost nova_compute[281952]: 2025-11-23 10:00:14.436 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:14 localhost ovn_controller[154788]: 2025-11-23T10:00:14Z|00198|binding|INFO|Releasing lport e1ef4443-64c5-4e20-a52c-3034596e7966 from this chassis (sb_readonly=0) Nov 23 05:00:14 localhost ovn_controller[154788]: 2025-11-23T10:00:14Z|00199|binding|INFO|Setting lport e1ef4443-64c5-4e20-a52c-3034596e7966 down in Southbound Nov 23 05:00:14 localhost kernel: device tape1ef4443-64 left promiscuous mode Nov 23 05:00:14 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:14.448 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e1ef4443-64c5-4e20-a52c-3034596e7966) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:14 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:14.450 160439 INFO neutron.agent.ovn.metadata.agent [-] Port e1ef4443-64c5-4e20-a52c-3034596e7966 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 05:00:14 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:14.452 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:14 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:14.453 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[74025d70-b89d-4382-8ebd-7c68163d87b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:14 localhost nova_compute[281952]: 2025-11-23 10:00:14.461 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:14 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:14.756 263258 INFO neutron.agent.dhcp.agent [None 
req-3ba73744-a7bb-4a1f-9f69-b6a0b3c24a92 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:14 localhost dnsmasq[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/addn_hosts - 0 addresses Nov 23 05:00:14 localhost dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/host Nov 23 05:00:14 localhost podman[316364]: 2025-11-23 10:00:14.875769274 +0000 UTC m=+0.067338166 container kill 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:00:14 localhost dnsmasq-dhcp[315494]: read /var/lib/neutron/dhcp/7292f404-64ea-4ef3-b81e-f698709e4eec/opts Nov 23 05:00:14 localhost systemd[1]: var-lib-containers-storage-overlay-9db4e9f89b2d873d6f9e5ee936e95cdb8a3e819e23918b397cc465f572f0c23e-merged.mount: Deactivated successfully. Nov 23 05:00:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-326e027202e2b323bebe6fcb1db7c02c92ea696198145e3af84231eaa2445522-userdata-shm.mount: Deactivated successfully. Nov 23 05:00:14 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. 
Nov 23 05:00:15 localhost nova_compute[281952]: 2025-11-23 10:00:15.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:00:15 localhost nova_compute[281952]: 2025-11-23 10:00:15.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:00:15 localhost nova_compute[281952]: 2025-11-23 10:00:15.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 05:00:15 localhost nova_compute[281952]: 2025-11-23 10:00:15.552 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:15 localhost ovn_controller[154788]: 2025-11-23T10:00:15Z|00200|binding|INFO|Releasing lport e328d85c-9f4d-4a9c-9609-f789abfbba67 from this chassis (sb_readonly=0) Nov 23 05:00:15 localhost kernel: device tape328d85c-9f left promiscuous mode Nov 23 05:00:15 localhost ovn_controller[154788]: 2025-11-23T10:00:15Z|00201|binding|INFO|Setting lport e328d85c-9f4d-4a9c-9609-f789abfbba67 down in Southbound Nov 23 05:00:15 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:15.566 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 
'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-7292f404-64ea-4ef3-b81e-f698709e4eec', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7292f404-64ea-4ef3-b81e-f698709e4eec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd38528c3feb64b31add54cee7508cb83', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=02c6b597-012a-4c61-8f3c-0e9a58e1964d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e328d85c-9f4d-4a9c-9609-f789abfbba67) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:15 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:15.568 160439 INFO neutron.agent.ovn.metadata.agent [-] Port e328d85c-9f4d-4a9c-9609-f789abfbba67 in datapath 7292f404-64ea-4ef3-b81e-f698709e4eec unbound from our chassis#033[00m Nov 23 05:00:15 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:15.571 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7292f404-64ea-4ef3-b81e-f698709e4eec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:00:15 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:15.572 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[15329bf2-b781-4e7c-a538-7507f500eabb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:15 localhost nova_compute[281952]: 
2025-11-23 10:00:15.575 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.238 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.238 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:00:16 localhost nova_compute[281952]: 
2025-11-23 10:00:16.238 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.239 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.391 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.413 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:16 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:16.502 263258 INFO neutron.agent.linux.ip_lib [None req-42d2d7f7-1adb-4eaa-9dcb-87d671731a21 - - - - - -] Device tap95b29c7c-cf cannot be used as it has no MAC address#033[00m Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.528 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:16 localhost kernel: device tap95b29c7c-cf entered promiscuous mode Nov 23 05:00:16 localhost NetworkManager[5975]: [1763892016.5362] manager: (tap95b29c7c-cf): new Generic device (/org/freedesktop/NetworkManager/Devices/36) Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.537 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:16 localhost ovn_controller[154788]: 2025-11-23T10:00:16Z|00202|binding|INFO|Claiming lport 95b29c7c-cf48-409a-9165-0c602a1e631c for this chassis. Nov 23 05:00:16 localhost ovn_controller[154788]: 2025-11-23T10:00:16Z|00203|binding|INFO|95b29c7c-cf48-409a-9165-0c602a1e631c: Claiming unknown Nov 23 05:00:16 localhost systemd-udevd[316417]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:00:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:16.546 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=95b29c7c-cf48-409a-9165-0c602a1e631c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:16 localhost ovn_controller[154788]: 
2025-11-23T10:00:16Z|00204|binding|INFO|Setting lport 95b29c7c-cf48-409a-9165-0c602a1e631c ovn-installed in OVS Nov 23 05:00:16 localhost ovn_controller[154788]: 2025-11-23T10:00:16Z|00205|binding|INFO|Setting lport 95b29c7c-cf48-409a-9165-0c602a1e631c up in Southbound Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.548 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:16.550 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 95b29c7c-cf48-409a-9165-0c602a1e631c in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 05:00:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:16.551 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:16.552 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4b3fcb38-1b8c-44bd-939c-e0e0163eeb29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:16 localhost journal[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device Nov 23 05:00:16 localhost journal[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device Nov 23 05:00:16 localhost journal[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device Nov 23 05:00:16 localhost journal[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device Nov 23 05:00:16 localhost journal[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device Nov 23 05:00:16 localhost journal[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device Nov 23 05:00:16 localhost journal[230249]: 
ethtool ioctl error on tap95b29c7c-cf: No such device Nov 23 05:00:16 localhost journal[230249]: ethtool ioctl error on tap95b29c7c-cf: No such device Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.606 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.615 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:16 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:00:16 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1277874800' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.694 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.777 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:00:16 localhost nova_compute[281952]: 2025-11-23 10:00:16.778 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:00:16 localhost sshd[316458]: main: sshd: 
ssh-rsa algorithm is disabled Nov 23 05:00:17 localhost nova_compute[281952]: 2025-11-23 10:00:17.027 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 05:00:17 localhost nova_compute[281952]: 2025-11-23 10:00:17.029 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11204MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 05:00:17 localhost nova_compute[281952]: 2025-11-23 10:00:17.029 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:00:17 localhost nova_compute[281952]: 2025-11-23 10:00:17.030 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:00:17 localhost nova_compute[281952]: 2025-11-23 10:00:17.094 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 05:00:17 localhost nova_compute[281952]: 2025-11-23 10:00:17.094 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 05:00:17 localhost nova_compute[281952]: 2025-11-23 10:00:17.095 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 05:00:17 localhost nova_compute[281952]: 2025-11-23 10:00:17.134 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:00:17 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:17.160 2 INFO neutron.agent.securitygroups_rpc [None req-eec5559f-af9d-40c0-b1ad-486b576202fe 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:00:17 localhost podman[316513]: Nov 23 05:00:17 localhost podman[316513]: 2025-11-23 10:00:17.421493995 +0000 UTC m=+0.092437288 container create ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 23 05:00:17 localhost systemd[1]: Started libpod-conmon-ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d.scope. Nov 23 05:00:17 localhost systemd[1]: Started libcrun container. Nov 23 05:00:17 localhost podman[316513]: 2025-11-23 10:00:17.375349924 +0000 UTC m=+0.046293287 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:00:17 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34fe63b57edfb9e1b264cb2cb286bcf45ccef5cee20574124af5a76e14a96835/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:00:17 localhost podman[316513]: 2025-11-23 10:00:17.487486868 +0000 UTC m=+0.158430171 container init ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:00:17 localhost podman[316513]: 2025-11-23 10:00:17.49629143 +0000 UTC m=+0.167234733 container start ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:00:17 localhost dnsmasq[316531]: started, version 2.85 cachesize 150 Nov 23 05:00:17 localhost dnsmasq[316531]: DNS service limited to local subnets Nov 23 05:00:17 localhost dnsmasq[316531]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:00:17 localhost dnsmasq[316531]: warning: no upstream servers configured Nov 23 05:00:17 localhost dnsmasq-dhcp[316531]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:00:17 localhost dnsmasq[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:17 localhost dnsmasq-dhcp[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:17 localhost dnsmasq-dhcp[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:17 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:17.559 263258 INFO neutron.agent.dhcp.agent [None req-42d2d7f7-1adb-4eaa-9dcb-87d671731a21 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:16Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=dc44aaa3-710b-4f8f-a729-299c89225fa7, ip_allocation=immediate, mac_address=fa:16:3e:9c:57:f5, name=tempest-NetworksTestDHCPv6-1200518540, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, 
id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=14, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['479278d7-0738-4be2-9ede-7f789b35986a'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:14Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1249, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:16Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:00:17 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:00:17 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/458760552' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:00:17 localhost nova_compute[281952]: 2025-11-23 10:00:17.582 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:00:17 localhost nova_compute[281952]: 2025-11-23 10:00:17.586 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 05:00:17 localhost nova_compute[281952]: 2025-11-23 10:00:17.603 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 05:00:17 localhost nova_compute[281952]: 2025-11-23 10:00:17.625 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 05:00:17 localhost nova_compute[281952]: 2025-11-23 10:00:17.625 281956 DEBUG 
oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:00:17 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:17.626 263258 INFO neutron.agent.dhcp.agent [None req-d6c0039d-cf8b-4514-ab61-ee8f413fb5e8 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:00:17 localhost podman[316551]: 2025-11-23 10:00:17.72128411 +0000 UTC m=+0.039846148 container kill ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 23 05:00:17 localhost dnsmasq[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 05:00:17 localhost dnsmasq-dhcp[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:17 localhost dnsmasq-dhcp[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:18 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:18.212 263258 INFO neutron.agent.dhcp.agent [None req-8a1c29c8-3381-4308-a43f-4e71fce393e7 - - - - - -] DHCP configuration for ports {'dc44aaa3-710b-4f8f-a729-299c89225fa7'} is completed#033[00m Nov 23 05:00:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 
full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:00:20 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:20.060 2 INFO neutron.agent.securitygroups_rpc [None req-0b580d26-bcb7-428b-9b48-acb40f408d24 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:00:20 localhost dnsmasq[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:20 localhost podman[316588]: 2025-11-23 10:00:20.25103484 +0000 UTC m=+0.046637367 container kill ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 23 05:00:20 localhost dnsmasq-dhcp[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:20 localhost dnsmasq-dhcp[316531]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:20 localhost nova_compute[281952]: 2025-11-23 10:00:20.624 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:00:20 localhost nova_compute[281952]: 2025-11-23 10:00:20.625 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:00:21 localhost nova_compute[281952]: 2025-11-23 10:00:21.393 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:21 localhost nova_compute[281952]: 2025-11-23 10:00:21.415 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:21 localhost ovn_controller[154788]: 2025-11-23T10:00:21Z|00206|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:00:21 localhost podman[316625]: 2025-11-23 10:00:21.483585768 +0000 UTC m=+0.063201497 container kill ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:00:21 localhost dnsmasq[316531]: exiting on receipt of SIGTERM Nov 23 05:00:21 localhost systemd[1]: libpod-ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d.scope: Deactivated successfully. 
Nov 23 05:00:21 localhost nova_compute[281952]: 2025-11-23 10:00:21.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:21 localhost podman[316640]: 2025-11-23 10:00:21.547559289 +0000 UTC m=+0.044941015 container died ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118) Nov 23 05:00:21 localhost systemd[1]: tmp-crun.A74vZz.mount: Deactivated successfully. Nov 23 05:00:21 localhost podman[316640]: 2025-11-23 10:00:21.602360038 +0000 UTC m=+0.099741704 container remove ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 05:00:21 localhost systemd[1]: libpod-conmon-ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d.scope: Deactivated successfully. 
Nov 23 05:00:21 localhost nova_compute[281952]: 2025-11-23 10:00:21.617 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:21 localhost kernel: device tap95b29c7c-cf left promiscuous mode Nov 23 05:00:21 localhost ovn_controller[154788]: 2025-11-23T10:00:21Z|00207|binding|INFO|Releasing lport 95b29c7c-cf48-409a-9165-0c602a1e631c from this chassis (sb_readonly=0) Nov 23 05:00:21 localhost ovn_controller[154788]: 2025-11-23T10:00:21Z|00208|binding|INFO|Setting lport 95b29c7c-cf48-409a-9165-0c602a1e631c down in Southbound Nov 23 05:00:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:21.639 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=95b29c7c-cf48-409a-9165-0c602a1e631c) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:21 localhost nova_compute[281952]: 2025-11-23 10:00:21.640 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:21.642 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 95b29c7c-cf48-409a-9165-0c602a1e631c in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 05:00:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:21.643 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:21.644 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[94039926-b881-455c-a027-10e0be496add]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:21 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:21.946 263258 INFO neutron.agent.dhcp.agent [None req-4e4a752d-f6ad-424f-8706-c4fa15a972ea - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:22 localhost ovn_controller[154788]: 2025-11-23T10:00:22Z|00209|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:00:22 localhost nova_compute[281952]: 2025-11-23 10:00:22.120 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:22 localhost dnsmasq[315494]: exiting on receipt of SIGTERM Nov 23 05:00:22 localhost podman[316684]: 2025-11-23 
10:00:22.317318122 +0000 UTC m=+0.058906825 container kill 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:00:22 localhost systemd[1]: libpod-74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537.scope: Deactivated successfully. Nov 23 05:00:22 localhost podman[316698]: 2025-11-23 10:00:22.38994865 +0000 UTC m=+0.059272777 container died 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 23 05:00:22 localhost podman[316698]: 2025-11-23 10:00:22.421649706 +0000 UTC m=+0.090973783 container cleanup 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 23 05:00:22 localhost systemd[1]: libpod-conmon-74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537.scope: Deactivated successfully. Nov 23 05:00:22 localhost podman[316705]: 2025-11-23 10:00:22.477048193 +0000 UTC m=+0.133242646 container remove 74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7292f404-64ea-4ef3-b81e-f698709e4eec, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:00:22 localhost systemd[1]: var-lib-containers-storage-overlay-34fe63b57edfb9e1b264cb2cb286bcf45ccef5cee20574124af5a76e14a96835-merged.mount: Deactivated successfully. Nov 23 05:00:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee6206865db82c17dccb2bab107c343f069952adb157d823b4e3069b32d4135d-userdata-shm.mount: Deactivated successfully. Nov 23 05:00:22 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. Nov 23 05:00:22 localhost systemd[1]: var-lib-containers-storage-overlay-dc1d3c6a45bfeaa6a92f58a5f600b24b7eb89e79f1a1294714d9a566d06522ea-merged.mount: Deactivated successfully. Nov 23 05:00:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-74d41848ec7d92fa37c08a2d081e1da293a23618137d8918da560875b423e537-userdata-shm.mount: Deactivated successfully. Nov 23 05:00:23 localhost systemd[1]: run-netns-qdhcp\x2d7292f404\x2d64ea\x2d4ef3\x2db81e\x2df698709e4eec.mount: Deactivated successfully. 
Nov 23 05:00:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:23.014 263258 INFO neutron.agent.dhcp.agent [None req-f3721c64-3324-4463-b4c0-bae26c902645 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:23.015 263258 INFO neutron.agent.dhcp.agent [None req-f3721c64-3324-4463-b4c0-bae26c902645 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:23.055 263258 INFO neutron.agent.linux.ip_lib [None req-2b84a56f-eea8-4921-9446-c7ce5ebcdec5 - - - - - -] Device tap5f699be7-b1 cannot be used as it has no MAC address#033[00m Nov 23 05:00:23 localhost nova_compute[281952]: 2025-11-23 10:00:23.078 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:23 localhost kernel: device tap5f699be7-b1 entered promiscuous mode Nov 23 05:00:23 localhost NetworkManager[5975]: [1763892023.0851] manager: (tap5f699be7-b1): new Generic device (/org/freedesktop/NetworkManager/Devices/37) Nov 23 05:00:23 localhost systemd-udevd[316736]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:00:23 localhost ovn_controller[154788]: 2025-11-23T10:00:23Z|00210|binding|INFO|Claiming lport 5f699be7-b1a4-4b0a-9b05-aa2b5159827f for this chassis. 
Nov 23 05:00:23 localhost ovn_controller[154788]: 2025-11-23T10:00:23Z|00211|binding|INFO|5f699be7-b1a4-4b0a-9b05-aa2b5159827f: Claiming unknown Nov 23 05:00:23 localhost nova_compute[281952]: 2025-11-23 10:00:23.089 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:23 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:23.116 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5f699be7-b1a4-4b0a-9b05-aa2b5159827f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:23 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:23.117 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 5f699be7-b1a4-4b0a-9b05-aa2b5159827f in datapath 
6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 05:00:23 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:23.118 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:23 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:23.119 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cb3c0f35-ca88-453a-8bc3-cc2bdfccb15c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:23 localhost journal[230249]: ethtool ioctl error on tap5f699be7-b1: No such device Nov 23 05:00:23 localhost journal[230249]: ethtool ioctl error on tap5f699be7-b1: No such device Nov 23 05:00:23 localhost journal[230249]: ethtool ioctl error on tap5f699be7-b1: No such device Nov 23 05:00:23 localhost ovn_controller[154788]: 2025-11-23T10:00:23Z|00212|binding|INFO|Setting lport 5f699be7-b1a4-4b0a-9b05-aa2b5159827f ovn-installed in OVS Nov 23 05:00:23 localhost ovn_controller[154788]: 2025-11-23T10:00:23Z|00213|binding|INFO|Setting lport 5f699be7-b1a4-4b0a-9b05-aa2b5159827f up in Southbound Nov 23 05:00:23 localhost nova_compute[281952]: 2025-11-23 10:00:23.135 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:23 localhost journal[230249]: ethtool ioctl error on tap5f699be7-b1: No such device Nov 23 05:00:23 localhost journal[230249]: ethtool ioctl error on tap5f699be7-b1: No such device Nov 23 05:00:23 localhost journal[230249]: ethtool ioctl error on tap5f699be7-b1: No such device Nov 23 05:00:23 localhost journal[230249]: ethtool ioctl error on tap5f699be7-b1: No such device Nov 23 05:00:23 localhost journal[230249]: ethtool ioctl error on 
tap5f699be7-b1: No such device Nov 23 05:00:23 localhost nova_compute[281952]: 2025-11-23 10:00:23.171 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:23 localhost nova_compute[281952]: 2025-11-23 10:00:23.194 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:00:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:23.472 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:00:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:00:24 localhost podman[316808]: 2025-11-23 10:00:24.044961793 +0000 UTC m=+0.095884385 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2) Nov 23 05:00:24 localhost nova_compute[281952]: 2025-11-23 10:00:24.078 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:24 localhost podman[316808]: 2025-11-23 10:00:24.083317785 +0000 UTC m=+0.134240337 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 23 05:00:24 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated 
successfully. Nov 23 05:00:24 localhost podman[316807]: Nov 23 05:00:24 localhost podman[316807]: 2025-11-23 10:00:24.114768854 +0000 UTC m=+0.173928069 container create 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:24 localhost podman[316814]: 2025-11-23 10:00:24.138521225 +0000 UTC m=+0.186705423 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': 
'/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 05:00:24 localhost podman[316814]: 2025-11-23 10:00:24.145210292 +0000 UTC m=+0.193394490 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 05:00:24 localhost podman[316807]: 2025-11-23 10:00:24.051737461 +0000 UTC m=+0.110896676 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:00:24 localhost systemd[1]: Started 
libpod-conmon-5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339.scope. Nov 23 05:00:24 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 05:00:24 localhost systemd[1]: Started libcrun container. Nov 23 05:00:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/038226a67fa7d74c36919df9cd9edb6225722632a91f4877537f142348f28939/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:00:24 localhost podman[316807]: 2025-11-23 10:00:24.19259137 +0000 UTC m=+0.251750595 container init 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:24 localhost podman[316807]: 2025-11-23 10:00:24.204033383 +0000 UTC m=+0.263192648 container start 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:24 localhost dnsmasq[316868]: started, version 2.85 cachesize 150 Nov 23 05:00:24 localhost dnsmasq[316868]: DNS service limited to local 
subnets Nov 23 05:00:24 localhost dnsmasq[316868]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:00:24 localhost dnsmasq[316868]: warning: no upstream servers configured Nov 23 05:00:24 localhost dnsmasq-dhcp[316868]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:00:24 localhost dnsmasq[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:24 localhost dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:24 localhost dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:24 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:24.402 263258 INFO neutron.agent.dhcp.agent [None req-e4604f50-da11-412e-9d24-d5397ebfd058 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:00:24 localhost dnsmasq[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:24 localhost podman[316887]: 2025-11-23 10:00:24.53667651 +0000 UTC m=+0.059242946 container kill 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:00:24 localhost dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:24 localhost dnsmasq-dhcp[316868]: read 
/var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:24 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:24.837 263258 INFO neutron.agent.dhcp.agent [None req-f455726f-5d44-445b-81f1-fb5d4bd8cce1 - - - - - -] DHCP configuration for ports {'5f699be7-b1a4-4b0a-9b05-aa2b5159827f', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:00:25 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:25.376 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:25Z, description=, device_id=146cb74b-0e80-472c-b385-97b8554397ad, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b1a93d84-9bcf-438f-92ff-bbd336b9d0db, ip_allocation=immediate, mac_address=fa:16:3e:b2:74:77, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['53ae3a3e-05c5-40f2-93cb-3cdbe8c6e451'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:23Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=False, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], 
standard_attr_id=1289, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:25Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:00:25 localhost dnsmasq[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 05:00:25 localhost dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:25 localhost systemd[1]: tmp-crun.62q2SZ.mount: Deactivated successfully. Nov 23 05:00:25 localhost dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:25 localhost podman[316927]: 2025-11-23 10:00:25.556923681 +0000 UTC m=+0.048905618 container kill 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 05:00:25 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:25.805 263258 INFO neutron.agent.dhcp.agent [None req-14b67171-52bf-45e3-8b04-4010502dfe11 - - - - - -] DHCP configuration for ports {'b1a93d84-9bcf-438f-92ff-bbd336b9d0db'} is completed#033[00m Nov 23 05:00:26 localhost nova_compute[281952]: 2025-11-23 10:00:26.397 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:26 localhost nova_compute[281952]: 2025-11-23 10:00:26.416 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:26 
localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:26.518 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:25Z, description=, device_id=146cb74b-0e80-472c-b385-97b8554397ad, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b1a93d84-9bcf-438f-92ff-bbd336b9d0db, ip_allocation=immediate, mac_address=fa:16:3e:b2:74:77, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['53ae3a3e-05c5-40f2-93cb-3cdbe8c6e451'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:23Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=False, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1289, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:25Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:00:26 localhost nova_compute[281952]: 2025-11-23 10:00:26.631 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 
05:00:26 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:26.655 263258 INFO neutron.agent.linux.ip_lib [None req-76d4543b-402d-4d69-89ce-e1dc8ae635c4 - - - - - -] Device tap10e1f965-56 cannot be used as it has no MAC address#033[00m Nov 23 05:00:26 localhost nova_compute[281952]: 2025-11-23 10:00:26.677 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:26 localhost kernel: device tap10e1f965-56 entered promiscuous mode Nov 23 05:00:26 localhost nova_compute[281952]: 2025-11-23 10:00:26.685 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:26 localhost NetworkManager[5975]: [1763892026.6867] manager: (tap10e1f965-56): new Generic device (/org/freedesktop/NetworkManager/Devices/38) Nov 23 05:00:26 localhost ovn_controller[154788]: 2025-11-23T10:00:26Z|00214|binding|INFO|Claiming lport 10e1f965-5681-42dc-916c-83e697ea474c for this chassis. Nov 23 05:00:26 localhost ovn_controller[154788]: 2025-11-23T10:00:26Z|00215|binding|INFO|10e1f965-5681-42dc-916c-83e697ea474c: Claiming unknown Nov 23 05:00:26 localhost systemd-udevd[316986]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:00:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:26.706 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-30192eb7-6210-4b4d-956f-dbc64d7c0b7c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30192eb7-6210-4b4d-956f-dbc64d7c0b7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10754024-8e92-4669-8b3d-be0210470d0a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=10e1f965-5681-42dc-916c-83e697ea474c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:26.709 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 10e1f965-5681-42dc-916c-83e697ea474c in datapath 30192eb7-6210-4b4d-956f-dbc64d7c0b7c bound to our chassis#033[00m Nov 23 05:00:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:26.710 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 30192eb7-6210-4b4d-956f-dbc64d7c0b7c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:26.711 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c3a802c4-b012-46ac-a240-73181f698402]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:26 localhost systemd[1]: tmp-crun.12YjOm.mount: Deactivated successfully. Nov 23 05:00:26 localhost podman[316971]: 2025-11-23 10:00:26.728634816 +0000 UTC m=+0.082001988 container kill 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:26 localhost dnsmasq[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 05:00:26 localhost dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:26 localhost dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:26 localhost ovn_controller[154788]: 2025-11-23T10:00:26Z|00216|binding|INFO|Setting lport 10e1f965-5681-42dc-916c-83e697ea474c ovn-installed in OVS Nov 23 05:00:26 localhost ovn_controller[154788]: 2025-11-23T10:00:26Z|00217|binding|INFO|Setting lport 10e1f965-5681-42dc-916c-83e697ea474c up in Southbound Nov 23 05:00:26 localhost nova_compute[281952]: 2025-11-23 10:00:26.740 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 
05:00:26 localhost nova_compute[281952]: 2025-11-23 10:00:26.774 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:26 localhost nova_compute[281952]: 2025-11-23 10:00:26.803 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:26 localhost nova_compute[281952]: 2025-11-23 10:00:26.926 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:26 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:26.947 263258 INFO neutron.agent.dhcp.agent [None req-1d6e31aa-1ed7-411e-8e51-3408088f4bd1 - - - - - -] DHCP configuration for ports {'b1a93d84-9bcf-438f-92ff-bbd336b9d0db'} is completed#033[00m Nov 23 05:00:27 localhost podman[317050]: Nov 23 05:00:27 localhost podman[317050]: 2025-11-23 10:00:27.677836937 +0000 UTC m=+0.094113600 container create b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 05:00:27 localhost systemd[1]: Started libpod-conmon-b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f.scope. Nov 23 05:00:27 localhost podman[317050]: 2025-11-23 10:00:27.631768338 +0000 UTC m=+0.048045071 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:00:27 localhost systemd[1]: Started libcrun container. 
Nov 23 05:00:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0701b612469235a6b7433f4e52ad0b8dcaf36306964dddada708c650d1295ed6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:00:27 localhost podman[317050]: 2025-11-23 10:00:27.751788325 +0000 UTC m=+0.168064998 container init b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS) Nov 23 05:00:27 localhost podman[317050]: 2025-11-23 10:00:27.765975072 +0000 UTC m=+0.182251745 container start b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 23 05:00:27 localhost dnsmasq[317068]: started, version 2.85 cachesize 150 Nov 23 05:00:27 localhost dnsmasq[317068]: DNS service limited to local subnets Nov 23 05:00:27 localhost dnsmasq[317068]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:00:27 localhost dnsmasq[317068]: warning: no upstream servers 
configured Nov 23 05:00:27 localhost dnsmasq-dhcp[317068]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:00:27 localhost dnsmasq[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/addn_hosts - 0 addresses Nov 23 05:00:27 localhost dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/host Nov 23 05:00:27 localhost dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/opts Nov 23 05:00:27 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:27.895 263258 INFO neutron.agent.dhcp.agent [None req-5460cacc-060a-4e29-b443-a26ca8b60dac - - - - - -] DHCP configuration for ports {'89a49f8b-be13-4044-a6f3-e04e5a35b524'} is completed#033[00m Nov 23 05:00:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:00:28 localhost systemd[1]: tmp-crun.kFOWr3.mount: Deactivated successfully. 
Nov 23 05:00:29 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:29.000 2 INFO neutron.agent.securitygroups_rpc [None req-7e066fee-86c7-4254-86ea-1e6408304d23 e02e3a6650d04cc49a5785719f724cce cb02ad4f92a44de895fb1e96459d4a8d - - default default] Security group member updated ['19c917c4-9016-402d-a3b3-6d8ae59f7b74']#033[00m Nov 23 05:00:29 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:29.176 263258 INFO neutron.agent.linux.ip_lib [None req-32b62cb9-a873-4c75-8a5b-2155ea477ade - - - - - -] Device tap40554683-ca cannot be used as it has no MAC address#033[00m Nov 23 05:00:29 localhost ovn_controller[154788]: 2025-11-23T10:00:29Z|00218|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:00:29 localhost nova_compute[281952]: 2025-11-23 10:00:29.243 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:29 localhost kernel: device tap40554683-ca entered promiscuous mode Nov 23 05:00:29 localhost NetworkManager[5975]: [1763892029.2525] manager: (tap40554683-ca): new Generic device (/org/freedesktop/NetworkManager/Devices/39) Nov 23 05:00:29 localhost ovn_controller[154788]: 2025-11-23T10:00:29Z|00219|binding|INFO|Claiming lport 40554683-cae1-4516-8207-1a403f64812d for this chassis. Nov 23 05:00:29 localhost systemd-udevd[316988]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:00:29 localhost ovn_controller[154788]: 2025-11-23T10:00:29Z|00220|binding|INFO|40554683-cae1-4516-8207-1a403f64812d: Claiming unknown Nov 23 05:00:29 localhost nova_compute[281952]: 2025-11-23 10:00:29.252 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:29 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:29.273 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-de8825f6-e682-4aa7-96e0-b4e3b7280d13', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de8825f6-e682-4aa7-96e0-b4e3b7280d13', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb02ad4f92a44de895fb1e96459d4a8d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=646e4187-302b-4a7e-8182-fdc259bed00e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=40554683-cae1-4516-8207-1a403f64812d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:29 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:29.276 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 40554683-cae1-4516-8207-1a403f64812d in datapath 
de8825f6-e682-4aa7-96e0-b4e3b7280d13 bound to our chassis#033[00m Nov 23 05:00:29 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:29.277 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network de8825f6-e682-4aa7-96e0-b4e3b7280d13 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:29 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:29.278 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8a1ca5f5-c3cd-4be3-bd67-a47a76813bef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:29 localhost nova_compute[281952]: 2025-11-23 10:00:29.294 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:29 localhost ovn_controller[154788]: 2025-11-23T10:00:29Z|00221|binding|INFO|Setting lport 40554683-cae1-4516-8207-1a403f64812d ovn-installed in OVS Nov 23 05:00:29 localhost ovn_controller[154788]: 2025-11-23T10:00:29Z|00222|binding|INFO|Setting lport 40554683-cae1-4516-8207-1a403f64812d up in Southbound Nov 23 05:00:29 localhost nova_compute[281952]: 2025-11-23 10:00:29.298 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:29 localhost nova_compute[281952]: 2025-11-23 10:00:29.329 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:29 localhost nova_compute[281952]: 2025-11-23 10:00:29.355 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:29 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:29.694 263258 INFO neutron.agent.dhcp.agent [-] 
Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:28Z, description=, device_id=eaab74f7-5e3e-4996-bbac-c869375065e2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1ff81b8b-de5b-48b0-b2f2-303ad670977c, ip_allocation=immediate, mac_address=fa:16:3e:81:2c:64, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:00:24Z, description=, dns_domain=, id=30192eb7-6210-4b4d-956f-dbc64d7c0b7c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-585029312, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4326, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1285, status=ACTIVE, subnets=['6432ddf9-5778-4055-86ee-5aa10ffd470f'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:25Z, vlan_transparent=None, network_id=30192eb7-6210-4b4d-956f-dbc64d7c0b7c, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1330, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:29Z on network 30192eb7-6210-4b4d-956f-dbc64d7c0b7c#033[00m Nov 23 05:00:29 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:29.800 2 INFO neutron.agent.securitygroups_rpc [None req-28d507eb-defd-436d-9561-198db6b19aa5 e02e3a6650d04cc49a5785719f724cce cb02ad4f92a44de895fb1e96459d4a8d - - default default] Security group member updated ['19c917c4-9016-402d-a3b3-6d8ae59f7b74']#033[00m Nov 23 
05:00:29 localhost dnsmasq[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:29 localhost dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:29 localhost dnsmasq-dhcp[316868]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:29 localhost podman[317125]: 2025-11-23 10:00:29.807718418 +0000 UTC m=+0.064254271 container kill 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 05:00:29 localhost systemd[1]: tmp-crun.INd9hk.mount: Deactivated successfully. 
Nov 23 05:00:29 localhost dnsmasq[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/addn_hosts - 1 addresses Nov 23 05:00:29 localhost dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/host Nov 23 05:00:29 localhost podman[317161]: 2025-11-23 10:00:29.958472402 +0000 UTC m=+0.071800663 container kill b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:00:29 localhost dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/opts Nov 23 05:00:29 localhost openstack_network_exporter[242668]: ERROR 10:00:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:00:29 localhost openstack_network_exporter[242668]: ERROR 10:00:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:00:29 localhost openstack_network_exporter[242668]: ERROR 10:00:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:00:29 localhost openstack_network_exporter[242668]: ERROR 10:00:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:00:29 localhost openstack_network_exporter[242668]: Nov 23 05:00:29 localhost openstack_network_exporter[242668]: ERROR 10:00:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:00:29 localhost openstack_network_exporter[242668]: 
Nov 23 05:00:30 localhost podman[317205]: Nov 23 05:00:30 localhost podman[317205]: 2025-11-23 10:00:30.16131862 +0000 UTC m=+0.085437482 container create d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 23 05:00:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.165 263258 INFO neutron.agent.dhcp.agent [None req-e9c3f30c-1390-4658-9425-0ac2a8c7f35b - - - - - -] DHCP configuration for ports {'1ff81b8b-de5b-48b0-b2f2-303ad670977c'} is completed#033[00m Nov 23 05:00:30 localhost systemd[1]: Started libpod-conmon-d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883.scope. Nov 23 05:00:30 localhost systemd[1]: Started libcrun container. 
Nov 23 05:00:30 localhost podman[317205]: 2025-11-23 10:00:30.116621184 +0000 UTC m=+0.040740076 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:00:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c90e5066c28a5f27a8a00727318e8ddaad558b249ed18b61f33f5561371dec20/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:00:30 localhost podman[317205]: 2025-11-23 10:00:30.22784263 +0000 UTC m=+0.151961472 container init d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 23 05:00:30 localhost podman[317205]: 2025-11-23 10:00:30.237520618 +0000 UTC m=+0.161639450 container start d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:30 localhost dnsmasq[317225]: started, version 2.85 cachesize 150 Nov 23 05:00:30 localhost dnsmasq[317225]: DNS service limited to local subnets Nov 23 05:00:30 localhost dnsmasq[317225]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:00:30 localhost dnsmasq[317225]: warning: no upstream servers configured Nov 23 05:00:30 localhost dnsmasq-dhcp[317225]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:00:30 localhost dnsmasq[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/addn_hosts - 0 addresses Nov 23 05:00:30 localhost dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/host Nov 23 05:00:30 localhost dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/opts Nov 23 05:00:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.300 263258 INFO neutron.agent.dhcp.agent [None req-32b62cb9-a873-4c75-8a5b-2155ea477ade - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:28Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=540485dd-5323-420d-bb5e-137503a34ac6, ip_allocation=immediate, mac_address=fa:16:3e:67:34:8c, name=tempest-ExtraDHCPOptionsIpV6TestJSON-1268709717, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:00:26Z, description=, dns_domain=, id=de8825f6-e682-4aa7-96e0-b4e3b7280d13, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-1098943766, port_security_enabled=True, project_id=cb02ad4f92a44de895fb1e96459d4a8d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=6425, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1297, status=ACTIVE, subnets=['ea3934cc-3b39-4b5f-922b-bfb1b349d5c3'], tags=[], 
tenant_id=cb02ad4f92a44de895fb1e96459d4a8d, updated_at=2025-11-23T10:00:27Z, vlan_transparent=None, network_id=de8825f6-e682-4aa7-96e0-b4e3b7280d13, port_security_enabled=True, project_id=cb02ad4f92a44de895fb1e96459d4a8d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['19c917c4-9016-402d-a3b3-6d8ae59f7b74'], standard_attr_id=1328, status=DOWN, tags=[], tenant_id=cb02ad4f92a44de895fb1e96459d4a8d, updated_at=2025-11-23T10:00:28Z on network de8825f6-e682-4aa7-96e0-b4e3b7280d13#033[00m Nov 23 05:00:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.405 263258 INFO neutron.agent.dhcp.agent [None req-8fd16439-bd5b-4cd7-952a-2d9208a9f1c0 - - - - - -] DHCP configuration for ports {'9b03d173-5707-4af9-badd-14643f8c8076'} is completed#033[00m Nov 23 05:00:30 localhost dnsmasq[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/addn_hosts - 1 addresses Nov 23 05:00:30 localhost dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/host Nov 23 05:00:30 localhost dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/opts Nov 23 05:00:30 localhost podman[317244]: 2025-11-23 10:00:30.480493703 +0000 UTC m=+0.078780208 container kill d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:00:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.599 263258 INFO neutron.agent.dhcp.agent [None 
req-32b62cb9-a873-4c75-8a5b-2155ea477ade - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:29Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=c210641c-12c4-499a-9e67-1f37df4d3cc6, ip_allocation=immediate, mac_address=fa:16:3e:30:12:10, name=tempest-ExtraDHCPOptionsIpV6TestJSON-2007863999, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:00:26Z, description=, dns_domain=, id=de8825f6-e682-4aa7-96e0-b4e3b7280d13, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-1098943766, port_security_enabled=True, project_id=cb02ad4f92a44de895fb1e96459d4a8d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=6425, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1297, status=ACTIVE, subnets=['ea3934cc-3b39-4b5f-922b-bfb1b349d5c3'], tags=[], tenant_id=cb02ad4f92a44de895fb1e96459d4a8d, updated_at=2025-11-23T10:00:27Z, vlan_transparent=None, network_id=de8825f6-e682-4aa7-96e0-b4e3b7280d13, port_security_enabled=True, project_id=cb02ad4f92a44de895fb1e96459d4a8d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['19c917c4-9016-402d-a3b3-6d8ae59f7b74'], standard_attr_id=1333, status=DOWN, tags=[], tenant_id=cb02ad4f92a44de895fb1e96459d4a8d, updated_at=2025-11-23T10:00:29Z on network de8825f6-e682-4aa7-96e0-b4e3b7280d13#033[00m Nov 23 05:00:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.616 263258 INFO neutron.agent.linux.dhcp [None req-32b62cb9-a873-4c75-8a5b-2155ea477ade - - - - - -] Cannot apply dhcp option bootfile-name because it's 
ip_version 4 is not in port's address IP versions#033[00m Nov 23 05:00:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.616 263258 INFO neutron.agent.linux.dhcp [None req-32b62cb9-a873-4c75-8a5b-2155ea477ade - - - - - -] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions#033[00m Nov 23 05:00:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.617 263258 INFO neutron.agent.linux.dhcp [None req-32b62cb9-a873-4c75-8a5b-2155ea477ade - - - - - -] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions#033[00m Nov 23 05:00:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.646 263258 INFO neutron.agent.dhcp.agent [None req-a2a271ba-36e4-45ab-9c06-5efc2e7ada79 - - - - - -] DHCP configuration for ports {'540485dd-5323-420d-bb5e-137503a34ac6'} is completed#033[00m Nov 23 05:00:30 localhost dnsmasq[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/addn_hosts - 2 addresses Nov 23 05:00:30 localhost dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/host Nov 23 05:00:30 localhost podman[317280]: 2025-11-23 10:00:30.772647962 +0000 UTC m=+0.059319638 container kill d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 23 05:00:30 localhost dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/opts Nov 23 05:00:30 localhost ovn_controller[154788]: 
2025-11-23T10:00:30Z|00223|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:00:30 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:30.867 2 INFO neutron.agent.securitygroups_rpc [None req-d4d902e4-381c-4584-8a85-837df381e7eb e02e3a6650d04cc49a5785719f724cce cb02ad4f92a44de895fb1e96459d4a8d - - default default] Security group member updated ['19c917c4-9016-402d-a3b3-6d8ae59f7b74']#033[00m Nov 23 05:00:30 localhost nova_compute[281952]: 2025-11-23 10:00:30.879 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:30.966 263258 INFO neutron.agent.dhcp.agent [None req-861641be-c57a-48f6-9827-eaad0158157f - - - - - -] DHCP configuration for ports {'c210641c-12c4-499a-9e67-1f37df4d3cc6'} is completed#033[00m Nov 23 05:00:31 localhost dnsmasq[316868]: exiting on receipt of SIGTERM Nov 23 05:00:31 localhost podman[317320]: 2025-11-23 10:00:31.012224973 +0000 UTC m=+0.047561196 container kill 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:00:31 localhost systemd[1]: libpod-5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339.scope: Deactivated successfully. 
Nov 23 05:00:31 localhost podman[317347]: 2025-11-23 10:00:31.074856912 +0000 UTC m=+0.046481262 container died 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 05:00:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339-userdata-shm.mount: Deactivated successfully. Nov 23 05:00:31 localhost systemd[1]: var-lib-containers-storage-overlay-038226a67fa7d74c36919df9cd9edb6225722632a91f4877537f142348f28939-merged.mount: Deactivated successfully. 
Nov 23 05:00:31 localhost podman[317347]: 2025-11-23 10:00:31.115478264 +0000 UTC m=+0.087102614 container remove 5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:00:31 localhost ovn_controller[154788]: 2025-11-23T10:00:31Z|00224|binding|INFO|Releasing lport 5f699be7-b1a4-4b0a-9b05-aa2b5159827f from this chassis (sb_readonly=0) Nov 23 05:00:31 localhost ovn_controller[154788]: 2025-11-23T10:00:31Z|00225|binding|INFO|Setting lport 5f699be7-b1a4-4b0a-9b05-aa2b5159827f down in Southbound Nov 23 05:00:31 localhost kernel: device tap5f699be7-b1 left promiscuous mode Nov 23 05:00:31 localhost nova_compute[281952]: 2025-11-23 10:00:31.132 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:31 localhost systemd[1]: libpod-conmon-5d2d888a55537e918c2c9f3637bc4bf4698f0c1af82c652ca78671663c0bf339.scope: Deactivated successfully. 
Nov 23 05:00:31 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:31.147 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5f699be7-b1a4-4b0a-9b05-aa2b5159827f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:31 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:31.149 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 5f699be7-b1a4-4b0a-9b05-aa2b5159827f in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 05:00:31 localhost nova_compute[281952]: 2025-11-23 10:00:31.151 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:31 localhost ovn_metadata_agent[160434]: 
2025-11-23 10:00:31.151 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:31 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:31.152 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ba6e0631-a481-46d3-a6fc-a01bf2847ce3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:31 localhost dnsmasq[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/addn_hosts - 1 addresses Nov 23 05:00:31 localhost dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/host Nov 23 05:00:31 localhost podman[317369]: 2025-11-23 10:00:31.18416833 +0000 UTC m=+0.099420813 container kill d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:00:31 localhost dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/opts Nov 23 05:00:31 localhost nova_compute[281952]: 2025-11-23 10:00:31.400 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:31 localhost nova_compute[281952]: 2025-11-23 10:00:31.416 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:31 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:31.426 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:28Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=540485dd-5323-420d-bb5e-137503a34ac6, ip_allocation=immediate, mac_address=fa:16:3e:67:34:8c, name=tempest-new-port-name-1833756636, network_id=de8825f6-e682-4aa7-96e0-b4e3b7280d13, port_security_enabled=True, project_id=cb02ad4f92a44de895fb1e96459d4a8d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['19c917c4-9016-402d-a3b3-6d8ae59f7b74'], standard_attr_id=1328, status=DOWN, tags=[], tenant_id=cb02ad4f92a44de895fb1e96459d4a8d, updated_at=2025-11-23T10:00:31Z on network de8825f6-e682-4aa7-96e0-b4e3b7280d13#033[00m Nov 23 05:00:31 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:31.443 263258 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions#033[00m Nov 23 05:00:31 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:31.443 263258 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions#033[00m Nov 23 05:00:31 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:31.444 263258 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions#033[00m Nov 23 05:00:31 localhost dnsmasq[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/addn_hosts - 1 addresses Nov 23 05:00:31 localhost dnsmasq-dhcp[317225]: read 
/var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/host Nov 23 05:00:31 localhost dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/opts Nov 23 05:00:31 localhost podman[317413]: 2025-11-23 10:00:31.610566645 +0000 UTC m=+0.056849362 container kill d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 23 05:00:31 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:31.722 263258 INFO neutron.agent.dhcp.agent [None req-5fc02e9a-e7d0-4fcc-8829-1fe7178b80d3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:31 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:31.801 263258 INFO neutron.agent.dhcp.agent [None req-8ccb48da-010f-42b8-a029-20f96fd88d86 - - - - - -] DHCP configuration for ports {'540485dd-5323-420d-bb5e-137503a34ac6'} is completed#033[00m Nov 23 05:00:31 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. Nov 23 05:00:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 05:00:33 localhost podman[317433]: 2025-11-23 10:00:33.016930659 +0000 UTC m=+0.072251016 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute) Nov 23 05:00:33 localhost podman[317433]: 2025-11-23 10:00:33.031475668 +0000 UTC m=+0.086796105 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3) Nov 23 05:00:33 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 05:00:33 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:33.119 2 INFO neutron.agent.securitygroups_rpc [None req-ddca7662-4ee1-4886-8e69-87187e050157 e02e3a6650d04cc49a5785719f724cce cb02ad4f92a44de895fb1e96459d4a8d - - default default] Security group member updated ['19c917c4-9016-402d-a3b3-6d8ae59f7b74']#033[00m Nov 23 05:00:33 localhost nova_compute[281952]: 2025-11-23 10:00:33.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:33 localhost dnsmasq[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/addn_hosts - 0 addresses Nov 23 05:00:33 localhost dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/host Nov 23 05:00:33 localhost dnsmasq-dhcp[317225]: read /var/lib/neutron/dhcp/de8825f6-e682-4aa7-96e0-b4e3b7280d13/opts Nov 23 05:00:33 localhost podman[317468]: 2025-11-23 10:00:33.342995295 +0000 UTC m=+0.048272718 container kill d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 05:00:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:00:33 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:33.516 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, 
binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:28Z, description=, device_id=eaab74f7-5e3e-4996-bbac-c869375065e2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1ff81b8b-de5b-48b0-b2f2-303ad670977c, ip_allocation=immediate, mac_address=fa:16:3e:81:2c:64, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:00:24Z, description=, dns_domain=, id=30192eb7-6210-4b4d-956f-dbc64d7c0b7c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-585029312, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4326, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1285, status=ACTIVE, subnets=['6432ddf9-5778-4055-86ee-5aa10ffd470f'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:25Z, vlan_transparent=None, network_id=30192eb7-6210-4b4d-956f-dbc64d7c0b7c, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1330, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:29Z on network 30192eb7-6210-4b4d-956f-dbc64d7c0b7c#033[00m Nov 23 05:00:33 localhost podman[317506]: 2025-11-23 10:00:33.757135092 +0000 UTC m=+0.061401662 container kill b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 23 05:00:33 localhost dnsmasq[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/addn_hosts - 1 addresses Nov 23 05:00:33 localhost dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/host Nov 23 05:00:33 localhost dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/opts Nov 23 05:00:34 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:34.005 263258 INFO neutron.agent.dhcp.agent [None req-ff2d3634-b698-4615-8b78-141045b1502b - - - - - -] DHCP configuration for ports {'1ff81b8b-de5b-48b0-b2f2-303ad670977c'} is completed#033[00m Nov 23 05:00:34 localhost nova_compute[281952]: 2025-11-23 10:00:34.131 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:34 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:34.136 263258 INFO neutron.agent.linux.ip_lib [None req-2bfd3498-e812-4d4c-858f-0433cc608f09 - - - - - -] Device tap41e3cb72-6d cannot be used as it has no MAC address#033[00m Nov 23 05:00:34 localhost nova_compute[281952]: 2025-11-23 10:00:34.158 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:34 localhost kernel: device tap41e3cb72-6d entered promiscuous mode Nov 23 05:00:34 localhost NetworkManager[5975]: [1763892034.1657] manager: (tap41e3cb72-6d): new Generic device (/org/freedesktop/NetworkManager/Devices/40) Nov 23 05:00:34 localhost ovn_controller[154788]: 2025-11-23T10:00:34Z|00226|binding|INFO|Claiming lport 41e3cb72-6db6-4670-834e-e198cab9488d for this chassis. 
Nov 23 05:00:34 localhost ovn_controller[154788]: 2025-11-23T10:00:34Z|00227|binding|INFO|41e3cb72-6db6-4670-834e-e198cab9488d: Claiming unknown Nov 23 05:00:34 localhost nova_compute[281952]: 2025-11-23 10:00:34.165 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:34 localhost systemd-udevd[317538]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:00:34 localhost ovn_controller[154788]: 2025-11-23T10:00:34Z|00228|binding|INFO|Setting lport 41e3cb72-6db6-4670-834e-e198cab9488d up in Southbound Nov 23 05:00:34 localhost ovn_controller[154788]: 2025-11-23T10:00:34Z|00229|binding|INFO|Setting lport 41e3cb72-6db6-4670-834e-e198cab9488d ovn-installed in OVS Nov 23 05:00:34 localhost nova_compute[281952]: 2025-11-23 10:00:34.174 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:34.174 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': 
'', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=41e3cb72-6db6-4670-834e-e198cab9488d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:34.176 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 41e3cb72-6db6-4670-834e-e198cab9488d in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 05:00:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:34.178 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:34 localhost nova_compute[281952]: 2025-11-23 10:00:34.180 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:34.180 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[efbdee0e-35b4-4b54-8fc5-f29bf4061ca3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:34 localhost nova_compute[281952]: 2025-11-23 10:00:34.182 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:34 localhost nova_compute[281952]: 2025-11-23 10:00:34.218 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:34 localhost 
nova_compute[281952]: 2025-11-23 10:00:34.255 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:34 localhost nova_compute[281952]: 2025-11-23 10:00:34.284 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:34 localhost podman[317568]: 2025-11-23 10:00:34.470246859 +0000 UTC m=+0.053272422 container kill d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 05:00:34 localhost dnsmasq[317225]: exiting on receipt of SIGTERM Nov 23 05:00:34 localhost systemd[1]: libpod-d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883.scope: Deactivated successfully. Nov 23 05:00:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 05:00:34 localhost podman[317584]: 2025-11-23 10:00:34.53811428 +0000 UTC m=+0.053162439 container died d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:00:34 localhost podman[317593]: 2025-11-23 10:00:34.592881167 +0000 UTC m=+0.090542080 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 05:00:34 localhost podman[317593]: 2025-11-23 10:00:34.604310429 +0000 UTC m=+0.101971372 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 05:00:34 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:00:34 localhost podman[317584]: 2025-11-23 10:00:34.621494649 +0000 UTC m=+0.136542788 container cleanup d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:00:34 localhost systemd[1]: libpod-conmon-d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883.scope: Deactivated successfully. 
Nov 23 05:00:34 localhost podman[317586]: 2025-11-23 10:00:34.670140178 +0000 UTC m=+0.177937223 container remove d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de8825f6-e682-4aa7-96e0-b4e3b7280d13, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:34 localhost ovn_controller[154788]: 2025-11-23T10:00:34Z|00230|binding|INFO|Releasing lport 40554683-cae1-4516-8207-1a403f64812d from this chassis (sb_readonly=0) Nov 23 05:00:34 localhost ovn_controller[154788]: 2025-11-23T10:00:34Z|00231|binding|INFO|Setting lport 40554683-cae1-4516-8207-1a403f64812d down in Southbound Nov 23 05:00:34 localhost kernel: device tap40554683-ca left promiscuous mode Nov 23 05:00:34 localhost nova_compute[281952]: 2025-11-23 10:00:34.681 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:34.689 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-de8825f6-e682-4aa7-96e0-b4e3b7280d13', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-de8825f6-e682-4aa7-96e0-b4e3b7280d13', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb02ad4f92a44de895fb1e96459d4a8d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=646e4187-302b-4a7e-8182-fdc259bed00e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=40554683-cae1-4516-8207-1a403f64812d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:34.690 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 40554683-cae1-4516-8207-1a403f64812d in datapath de8825f6-e682-4aa7-96e0-b4e3b7280d13 unbound from our chassis#033[00m Nov 23 05:00:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:34.691 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network de8825f6-e682-4aa7-96e0-b4e3b7280d13 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:34.692 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4abf6be1-73b5-4385-a0c2-d90eca52a02e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:34 localhost nova_compute[281952]: 2025-11-23 10:00:34.702 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:34 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:34.915 263258 INFO 
neutron.agent.dhcp.agent [None req-22c76af9-6271-43a7-bbdf-9fee24ca917d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:35 localhost systemd[1]: var-lib-containers-storage-overlay-c90e5066c28a5f27a8a00727318e8ddaad558b249ed18b61f33f5561371dec20-merged.mount: Deactivated successfully. Nov 23 05:00:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d057d9eb35207dbbde354ecba73e313578d5ad41bb47073f511b13d902a0c883-userdata-shm.mount: Deactivated successfully. Nov 23 05:00:35 localhost systemd[1]: run-netns-qdhcp\x2dde8825f6\x2de682\x2d4aa7\x2d96e0\x2db4e3b7280d13.mount: Deactivated successfully. Nov 23 05:00:35 localhost podman[317671]: Nov 23 05:00:35 localhost podman[317671]: 2025-11-23 10:00:35.082102398 +0000 UTC m=+0.076241640 container create 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:00:35 localhost systemd[1]: Started libpod-conmon-02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11.scope. Nov 23 05:00:35 localhost systemd[1]: Started libcrun container. 
Nov 23 05:00:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2c0ddcbf3ea6dc0d7c80eec26c782e2f344a162226e3650f4c2b5cc4fa9ba34/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:00:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:35.148 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:35 localhost podman[317671]: 2025-11-23 10:00:35.149823774 +0000 UTC m=+0.143963116 container init 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:35 localhost podman[317671]: 2025-11-23 10:00:35.053801746 +0000 UTC m=+0.047940978 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:00:35 localhost podman[317671]: 2025-11-23 10:00:35.160432491 +0000 UTC m=+0.154571733 container start 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 05:00:35 localhost dnsmasq[317689]: started, version 
2.85 cachesize 150 Nov 23 05:00:35 localhost dnsmasq[317689]: DNS service limited to local subnets Nov 23 05:00:35 localhost dnsmasq[317689]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:00:35 localhost dnsmasq[317689]: warning: no upstream servers configured Nov 23 05:00:35 localhost dnsmasq[317689]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:35.369 263258 INFO neutron.agent.dhcp.agent [None req-ddb9fd59-5e2e-4a2a-a2d2-c9374d4d8974 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:00:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:35.550 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:34Z, description=, device_id=5835bf8d-282b-4320-8c00-799baca6538f, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=034be82f-816e-4b81-b415-e36e7e90dc5c, ip_allocation=immediate, mac_address=fa:16:3e:e9:53:86, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1104, 
status=ACTIVE, subnets=['b2e5987d-a159-4cfc-adef-6fac29f80001'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:32Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=False, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1359, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:34Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:00:35 localhost dnsmasq[317689]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 05:00:35 localhost podman[317709]: 2025-11-23 10:00:35.713995024 +0000 UTC m=+0.056301136 container kill 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:00:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:35.928 263258 INFO neutron.agent.dhcp.agent [None req-ceca03b5-dec5-4220-9670-62a5f6da3a5b - - - - - -] DHCP configuration for ports {'034be82f-816e-4b81-b415-e36e7e90dc5c'} is completed#033[00m Nov 23 05:00:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:36.192 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:36 localhost nova_compute[281952]: 2025-11-23 10:00:36.402 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:36 localhost nova_compute[281952]: 2025-11-23 10:00:36.418 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:36 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 05:00:36 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:00:36 localhost ovn_controller[154788]: 2025-11-23T10:00:36Z|00232|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:00:36 localhost nova_compute[281952]: 2025-11-23 10:00:36.866 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:36.890 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:34Z, description=, device_id=5835bf8d-282b-4320-8c00-799baca6538f, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=034be82f-816e-4b81-b415-e36e7e90dc5c, ip_allocation=immediate, mac_address=fa:16:3e:e9:53:86, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, 
provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['b2e5987d-a159-4cfc-adef-6fac29f80001'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:32Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=False, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1359, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:34Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:00:37 localhost dnsmasq[317689]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 05:00:37 localhost podman[317837]: 2025-11-23 10:00:37.099173475 +0000 UTC m=+0.067721438 container kill 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 23 05:00:37 localhost nova_compute[281952]: 2025-11-23 10:00:37.165 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:37 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:37.376 263258 INFO neutron.agent.dhcp.agent [None req-03a3e59d-8c52-4a5a-8577-139971d06a05 - - - - - -] DHCP configuration for ports 
{'034be82f-816e-4b81-b415-e36e7e90dc5c'} is completed#033[00m Nov 23 05:00:37 localhost sshd[317858]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:00:38 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:00:38 localhost nova_compute[281952]: 2025-11-23 10:00:38.603 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:38 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:38.881 263258 INFO neutron.agent.linux.ip_lib [None req-d9cb8e3a-f670-49c1-82c8-8f0ecfbbdd8b - - - - - -] Device tapb74e35ad-94 cannot be used as it has no MAC address#033[00m Nov 23 05:00:38 localhost nova_compute[281952]: 2025-11-23 10:00:38.904 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:38 localhost kernel: device tapb74e35ad-94 entered promiscuous mode Nov 23 05:00:38 localhost NetworkManager[5975]: [1763892038.9138] manager: (tapb74e35ad-94): new Generic device (/org/freedesktop/NetworkManager/Devices/41) Nov 23 05:00:38 localhost systemd-udevd[317870]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:00:38 localhost ovn_controller[154788]: 2025-11-23T10:00:38Z|00233|binding|INFO|Claiming lport b74e35ad-94a2-4d4d-af80-3b2024099e6d for this chassis. 
Nov 23 05:00:38 localhost ovn_controller[154788]: 2025-11-23T10:00:38Z|00234|binding|INFO|b74e35ad-94a2-4d4d-af80-3b2024099e6d: Claiming unknown Nov 23 05:00:38 localhost nova_compute[281952]: 2025-11-23 10:00:38.916 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:38 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:38.936 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-bcc66174-371f-4faf-83f1-5de56d4886ad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcc66174-371f-4faf-83f1-5de56d4886ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a45cfb38-a270-4929-a93b-8d89273d60d1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b74e35ad-94a2-4d4d-af80-3b2024099e6d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:38 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:38.938 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b74e35ad-94a2-4d4d-af80-3b2024099e6d in datapath 
bcc66174-371f-4faf-83f1-5de56d4886ad bound to our chassis#033[00m Nov 23 05:00:38 localhost journal[230249]: ethtool ioctl error on tapb74e35ad-94: No such device Nov 23 05:00:38 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:38.940 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network bcc66174-371f-4faf-83f1-5de56d4886ad or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:38 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:38.941 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8ea07802-45dd-4794-8f20-cb35bee0b15c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:38 localhost journal[230249]: ethtool ioctl error on tapb74e35ad-94: No such device Nov 23 05:00:38 localhost ovn_controller[154788]: 2025-11-23T10:00:38Z|00235|binding|INFO|Setting lport b74e35ad-94a2-4d4d-af80-3b2024099e6d ovn-installed in OVS Nov 23 05:00:38 localhost ovn_controller[154788]: 2025-11-23T10:00:38Z|00236|binding|INFO|Setting lport b74e35ad-94a2-4d4d-af80-3b2024099e6d up in Southbound Nov 23 05:00:38 localhost nova_compute[281952]: 2025-11-23 10:00:38.953 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:38 localhost journal[230249]: ethtool ioctl error on tapb74e35ad-94: No such device Nov 23 05:00:38 localhost journal[230249]: ethtool ioctl error on tapb74e35ad-94: No such device Nov 23 05:00:38 localhost journal[230249]: ethtool ioctl error on tapb74e35ad-94: No such device Nov 23 05:00:38 localhost journal[230249]: ethtool ioctl error on tapb74e35ad-94: No such device Nov 23 05:00:38 localhost journal[230249]: ethtool ioctl error on tapb74e35ad-94: No such device Nov 23 05:00:38 localhost journal[230249]: ethtool ioctl error on 
tapb74e35ad-94: No such device Nov 23 05:00:38 localhost nova_compute[281952]: 2025-11-23 10:00:38.981 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:39 localhost nova_compute[281952]: 2025-11-23 10:00:39.014 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:39 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:00:40 localhost podman[317941]: Nov 23 05:00:40 localhost podman[317941]: 2025-11-23 10:00:40.507995745 +0000 UTC m=+0.102615611 container create c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:40 localhost podman[317941]: 2025-11-23 10:00:40.457549071 +0000 UTC m=+0.052168957 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:00:40 localhost systemd[1]: Started libpod-conmon-c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736.scope. Nov 23 05:00:40 localhost systemd[1]: Started libcrun container. 
Nov 23 05:00:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c21502b2347b802ca27076444fd406a39385c92b799f42a0977ebddef8e75100/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:00:40 localhost podman[317941]: 2025-11-23 10:00:40.605640303 +0000 UTC m=+0.200260159 container init c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 05:00:40 localhost podman[317941]: 2025-11-23 10:00:40.615442816 +0000 UTC m=+0.210062672 container start c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 23 05:00:40 localhost dnsmasq[317960]: started, version 2.85 cachesize 150 Nov 23 05:00:40 localhost dnsmasq[317960]: DNS service limited to local subnets Nov 23 05:00:40 localhost dnsmasq[317960]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:00:40 localhost dnsmasq[317960]: warning: no upstream servers 
configured Nov 23 05:00:40 localhost dnsmasq-dhcp[317960]: DHCP, static leases only on 10.101.0.0, lease time 1d Nov 23 05:00:40 localhost dnsmasq[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/addn_hosts - 0 addresses Nov 23 05:00:40 localhost dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/host Nov 23 05:00:40 localhost dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/opts Nov 23 05:00:40 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:40.740 263258 INFO neutron.agent.dhcp.agent [None req-d83f04ea-9b4d-4a4d-851b-9e329226e750 - - - - - -] DHCP configuration for ports {'e5dc106a-af41-420f-ba83-3fdcb4306cd4'} is completed#033[00m Nov 23 05:00:40 localhost nova_compute[281952]: 2025-11-23 10:00:40.990 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:41 localhost nova_compute[281952]: 2025-11-23 10:00:41.405 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:41 localhost nova_compute[281952]: 2025-11-23 10:00:41.420 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:41 localhost dnsmasq[317689]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:41 localhost podman[317980]: 2025-11-23 10:00:41.860451949 +0000 UTC m=+0.069597955 container kill 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:00:41 localhost systemd[1]: tmp-crun.lC5SNP.mount: Deactivated successfully. Nov 23 05:00:41 localhost podman[240668]: time="2025-11-23T10:00:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:00:41 localhost podman[240668]: @ - - [23/Nov/2025:10:00:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159228 "" "Go-http-client/1.1" Nov 23 05:00:41 localhost podman[240668]: @ - - [23/Nov/2025:10:00:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20175 "" "Go-http-client/1.1" Nov 23 05:00:42 localhost nova_compute[281952]: 2025-11-23 10:00:42.034 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:42 localhost kernel: device tap41e3cb72-6d left promiscuous mode Nov 23 05:00:42 localhost ovn_controller[154788]: 2025-11-23T10:00:42Z|00237|binding|INFO|Releasing lport 41e3cb72-6db6-4670-834e-e198cab9488d from this chassis (sb_readonly=0) Nov 23 05:00:42 localhost ovn_controller[154788]: 2025-11-23T10:00:42Z|00238|binding|INFO|Setting lport 41e3cb72-6db6-4670-834e-e198cab9488d down in Southbound Nov 23 05:00:42 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:42.042 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': 
'2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=41e3cb72-6db6-4670-834e-e198cab9488d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:42 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:42.044 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 41e3cb72-6db6-4670-834e-e198cab9488d in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 05:00:42 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:42.046 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:00:42 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:42.047 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ff3ab9f0-1102-4e44-8df5-922ee22a89ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:42 localhost nova_compute[281952]: 2025-11-23 10:00:42.055 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:43 localhost podman[318018]: 2025-11-23 10:00:43.197001602 +0000 UTC m=+0.063832576 container kill 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 05:00:43 localhost dnsmasq[317689]: exiting on receipt of SIGTERM Nov 23 05:00:43 localhost systemd[1]: libpod-02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11.scope: Deactivated successfully. Nov 23 05:00:43 localhost ovn_controller[154788]: 2025-11-23T10:00:43Z|00239|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:00:43 localhost nova_compute[281952]: 2025-11-23 10:00:43.249 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:43 localhost podman[318031]: 2025-11-23 10:00:43.262276364 +0000 UTC m=+0.046380591 container died 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 
05:00:43 localhost podman[318031]: 2025-11-23 10:00:43.300400617 +0000 UTC m=+0.084504814 container cleanup 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:43 localhost systemd[1]: libpod-conmon-02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11.scope: Deactivated successfully. Nov 23 05:00:43 localhost podman[318032]: 2025-11-23 10:00:43.334538809 +0000 UTC m=+0.118049577 container remove 02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:00:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:00:43 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:43.592 263258 INFO neutron.agent.dhcp.agent [None req-f2c9f3b1-fd81-47d9-b1b6-70801fc4861a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:00:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:00:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:00:44 localhost podman[318061]: 2025-11-23 10:00:44.02633656 +0000 UTC m=+0.079144919 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, config_id=edpm, architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc.) Nov 23 05:00:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 05:00:44 localhost podman[318060]: 2025-11-23 10:00:44.075731672 +0000 UTC m=+0.131373488 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller) Nov 23 05:00:44 localhost podman[318061]: 2025-11-23 10:00:44.092661114 +0000 UTC m=+0.145469403 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=) Nov 23 05:00:44 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:00:44 localhost podman[318060]: 2025-11-23 10:00:44.112650969 +0000 UTC m=+0.168292785 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:00:44 localhost systemd[1]: 
2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 05:00:44 localhost podman[318090]: 2025-11-23 10:00:44.183348547 +0000 UTC m=+0.121373550 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
tcib_managed=true) Nov 23 05:00:44 localhost systemd[1]: var-lib-containers-storage-overlay-c2c0ddcbf3ea6dc0d7c80eec26c782e2f344a162226e3650f4c2b5cc4fa9ba34-merged.mount: Deactivated successfully. Nov 23 05:00:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-02288463e6efd2f5df41d069b47a24d817352dfe3afff57f725e7d2330865e11-userdata-shm.mount: Deactivated successfully. Nov 23 05:00:44 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. Nov 23 05:00:44 localhost podman[318090]: 2025-11-23 10:00:44.215219699 +0000 UTC m=+0.153244632 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:00:44 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:00:44 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:44.342 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:43Z, description=, device_id=eaab74f7-5e3e-4996-bbac-c869375065e2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=07c04277-2dd6-4b10-b1b3-e46f9ab20c05, ip_allocation=immediate, mac_address=fa:16:3e:b4:4e:f2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:00:34Z, description=, dns_domain=, id=bcc66174-371f-4faf-83f1-5de56d4886ad, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-343519683, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45780, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1360, status=ACTIVE, subnets=['213562cd-cd75-4e9d-9432-8141a3732f62'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:37Z, 
vlan_transparent=None, network_id=bcc66174-371f-4faf-83f1-5de56d4886ad, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1381, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:43Z on network bcc66174-371f-4faf-83f1-5de56d4886ad#033[00m Nov 23 05:00:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:44.543 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:44 localhost nova_compute[281952]: 2025-11-23 10:00:44.545 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:44.548 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 05:00:44 localhost podman[318139]: 2025-11-23 10:00:44.570272017 +0000 UTC m=+0.063937331 container kill c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, org.label-schema.build-date=20251118, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 23 05:00:44 localhost dnsmasq[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/addn_hosts - 1 addresses Nov 23 05:00:44 localhost dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/host Nov 23 05:00:44 localhost dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/opts Nov 23 05:00:44 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:44.868 263258 INFO neutron.agent.dhcp.agent [None req-d4a5137d-ea46-4eef-808e-261b19d9605e - - - - - -] DHCP configuration for ports {'07c04277-2dd6-4b10-b1b3-e46f9ab20c05'} is completed#033[00m Nov 23 05:00:46 localhost nova_compute[281952]: 2025-11-23 10:00:46.407 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:46 localhost nova_compute[281952]: 2025-11-23 10:00:46.421 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:00:48 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:48.924 263258 INFO neutron.agent.linux.ip_lib [None req-21080b15-717a-47ba-ab2a-5b553674fdd1 - - - - - -] Device tap01290de6-b9 cannot be used as it has no MAC address#033[00m Nov 23 05:00:48 localhost nova_compute[281952]: 2025-11-23 10:00:48.948 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:48 localhost 
kernel: device tap01290de6-b9 entered promiscuous mode Nov 23 05:00:48 localhost nova_compute[281952]: 2025-11-23 10:00:48.958 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:48 localhost NetworkManager[5975]: [1763892048.9601] manager: (tap01290de6-b9): new Generic device (/org/freedesktop/NetworkManager/Devices/42) Nov 23 05:00:48 localhost ovn_controller[154788]: 2025-11-23T10:00:48Z|00240|binding|INFO|Claiming lport 01290de6-b98c-45d9-85a6-538e887d4c81 for this chassis. Nov 23 05:00:48 localhost ovn_controller[154788]: 2025-11-23T10:00:48Z|00241|binding|INFO|01290de6-b98c-45d9-85a6-538e887d4c81: Claiming unknown Nov 23 05:00:48 localhost systemd-udevd[318169]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:00:48 localhost journal[230249]: ethtool ioctl error on tap01290de6-b9: No such device Nov 23 05:00:48 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:48.987 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fefd:15c6/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 
'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=01290de6-b98c-45d9-85a6-538e887d4c81) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:48 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:48.989 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 01290de6-b98c-45d9-85a6-538e887d4c81 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 05:00:48 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:48.992 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 42a1827f-ec9e-4b50-b5fa-3bf13a8c1da0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:00:48 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:48.992 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:00:48 localhost journal[230249]: ethtool ioctl error on tap01290de6-b9: No such device Nov 23 05:00:48 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:48.992 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2576aa57-14cf-4474-8e2f-981528ffe519]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:48 localhost ovn_controller[154788]: 2025-11-23T10:00:48Z|00242|binding|INFO|Setting lport 01290de6-b98c-45d9-85a6-538e887d4c81 ovn-installed in OVS Nov 23 05:00:48 localhost ovn_controller[154788]: 2025-11-23T10:00:48Z|00243|binding|INFO|Setting lport 01290de6-b98c-45d9-85a6-538e887d4c81 up in 
Southbound Nov 23 05:00:48 localhost nova_compute[281952]: 2025-11-23 10:00:48.994 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:49 localhost journal[230249]: ethtool ioctl error on tap01290de6-b9: No such device Nov 23 05:00:49 localhost journal[230249]: ethtool ioctl error on tap01290de6-b9: No such device Nov 23 05:00:49 localhost journal[230249]: ethtool ioctl error on tap01290de6-b9: No such device Nov 23 05:00:49 localhost journal[230249]: ethtool ioctl error on tap01290de6-b9: No such device Nov 23 05:00:49 localhost journal[230249]: ethtool ioctl error on tap01290de6-b9: No such device Nov 23 05:00:49 localhost journal[230249]: ethtool ioctl error on tap01290de6-b9: No such device Nov 23 05:00:49 localhost nova_compute[281952]: 2025-11-23 10:00:49.029 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:49 localhost nova_compute[281952]: 2025-11-23 10:00:49.060 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:49 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:49.628 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:43Z, description=, device_id=eaab74f7-5e3e-4996-bbac-c869375065e2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=07c04277-2dd6-4b10-b1b3-e46f9ab20c05, ip_allocation=immediate, mac_address=fa:16:3e:b4:4e:f2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:00:34Z, description=, dns_domain=, 
id=bcc66174-371f-4faf-83f1-5de56d4886ad, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-343519683, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45780, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1360, status=ACTIVE, subnets=['213562cd-cd75-4e9d-9432-8141a3732f62'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:37Z, vlan_transparent=None, network_id=bcc66174-371f-4faf-83f1-5de56d4886ad, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1381, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:00:43Z on network bcc66174-371f-4faf-83f1-5de56d4886ad#033[00m Nov 23 05:00:49 localhost podman[318254]: Nov 23 05:00:49 localhost dnsmasq[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/addn_hosts - 1 addresses Nov 23 05:00:49 localhost dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/host Nov 23 05:00:49 localhost dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/opts Nov 23 05:00:49 localhost podman[318268]: 2025-11-23 10:00:49.870755161 +0000 UTC m=+0.040159288 container kill c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack 
Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:49 localhost podman[318254]: 2025-11-23 10:00:49.916632574 +0000 UTC m=+0.131531343 container create 40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:49 localhost podman[318254]: 2025-11-23 10:00:49.821582986 +0000 UTC m=+0.036481805 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:00:49 localhost systemd[1]: Started libpod-conmon-40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a.scope. Nov 23 05:00:49 localhost systemd[1]: tmp-crun.h2eQgA.mount: Deactivated successfully. Nov 23 05:00:49 localhost systemd[1]: Started libcrun container. 
Nov 23 05:00:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb7fa617f1cf5bcb988fc404e487efa2ea75be483366f6c07ef0e60e4a6f49f6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:00:49 localhost podman[318254]: 2025-11-23 10:00:49.987766415 +0000 UTC m=+0.202665144 container init 40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:00:49 localhost podman[318254]: 2025-11-23 10:00:49.994003418 +0000 UTC m=+0.208902187 container start 40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:00:49 localhost dnsmasq[318298]: started, version 2.85 cachesize 150 Nov 23 05:00:49 localhost dnsmasq[318298]: DNS service limited to local subnets Nov 23 05:00:49 localhost dnsmasq[318298]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:00:49 localhost dnsmasq[318298]: warning: no upstream servers 
configured Nov 23 05:00:49 localhost dnsmasq[318298]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:50 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:50.194 263258 INFO neutron.agent.dhcp.agent [None req-2bc5c5c7-ab5b-44db-bb53-ffb7ce7a4219 - - - - - -] DHCP configuration for ports {'07c04277-2dd6-4b10-b1b3-e46f9ab20c05', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:00:50 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:50.311 2 INFO neutron.agent.securitygroups_rpc [None req-76f35429-371d-4a0a-a261-a7949dd94068 e59892284e454ae28c30542a06194f67 7d06d32932c14944b00061256a49a5ca - - default default] Security group member updated ['3d66d90b-639c-4111-b259-a5454103aaa3']#033[00m Nov 23 05:00:50 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:50.550 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 05:00:50 localhost dnsmasq[318298]: exiting on receipt of SIGTERM Nov 23 05:00:50 localhost podman[318316]: 2025-11-23 10:00:50.714626697 +0000 UTC m=+0.059926298 container kill 40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:00:50 localhost systemd[1]: 
libpod-40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a.scope: Deactivated successfully. Nov 23 05:00:50 localhost ovn_controller[154788]: 2025-11-23T10:00:50Z|00244|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:00:50 localhost nova_compute[281952]: 2025-11-23 10:00:50.760 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:50 localhost podman[318329]: 2025-11-23 10:00:50.77350088 +0000 UTC m=+0.047168664 container died 40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:00:50 localhost podman[318329]: 2025-11-23 10:00:50.813506582 +0000 UTC m=+0.087174316 container cleanup 40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 05:00:50 localhost systemd[1]: libpod-conmon-40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a.scope: Deactivated successfully. 
Nov 23 05:00:50 localhost podman[318331]: 2025-11-23 10:00:50.866378331 +0000 UTC m=+0.130634915 container remove 40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 05:00:50 localhost systemd[1]: var-lib-containers-storage-overlay-fb7fa617f1cf5bcb988fc404e487efa2ea75be483366f6c07ef0e60e4a6f49f6-merged.mount: Deactivated successfully. Nov 23 05:00:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40bb0c42dd547e87235077b5450ed4a3a347400a9fe6dd170719d2645e54495a-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:00:51 localhost nova_compute[281952]: 2025-11-23 10:00:51.411 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:51 localhost nova_compute[281952]: 2025-11-23 10:00:51.422 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:52 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:52.353 2 INFO neutron.agent.securitygroups_rpc [None req-aaa2c11d-b613-412d-8872-49d855ed78d3 a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']#033[00m Nov 23 05:00:53 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:53.033 2 INFO neutron.agent.securitygroups_rpc [None req-6f916793-e69f-491a-861e-c8f5876d7582 a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']#033[00m Nov 23 05:00:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:53.090 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2 2001:db8::f816:3eff:feec:8f43'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 
'2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 2001:db8::f816:3eff:feec:8f43'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:00:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:53.092 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated#033[00m Nov 23 05:00:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:53.095 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 42a1827f-ec9e-4b50-b5fa-3bf13a8c1da0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:00:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:53.095 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace 
down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:00:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:00:53.097 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ef6bb57a-6faa-4340-9c3b-6b084e1b4860]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:00:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:00:53 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:53.691 2 INFO neutron.agent.securitygroups_rpc [None req-8312fbc9-ef14-45ec-8246-3e5b81a28890 e59892284e454ae28c30542a06194f67 7d06d32932c14944b00061256a49a5ca - - default default] Security group member updated ['3d66d90b-639c-4111-b259-a5454103aaa3']#033[00m Nov 23 05:00:54 localhost podman[318409]: Nov 23 05:00:54 localhost podman[318409]: 2025-11-23 10:00:54.061983483 +0000 UTC m=+0.090678145 container create e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:54 localhost systemd[1]: Started libpod-conmon-e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d.scope. Nov 23 05:00:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:00:54 localhost systemd[1]: tmp-crun.NB9xr6.mount: Deactivated successfully. 
Nov 23 05:00:54 localhost systemd[1]: Started libcrun container. Nov 23 05:00:54 localhost podman[318409]: 2025-11-23 10:00:54.021002451 +0000 UTC m=+0.049697113 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:00:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b3db0c81f1e124d9d69f888333884ab63f725fc79d437184bc44f72fbf237162/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:00:54 localhost podman[318409]: 2025-11-23 10:00:54.130147013 +0000 UTC m=+0.158841675 container init e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 23 05:00:54 localhost podman[318409]: 2025-11-23 10:00:54.142883316 +0000 UTC m=+0.171577948 container start e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 23 05:00:54 localhost dnsmasq[318438]: started, version 2.85 cachesize 150 Nov 23 05:00:54 localhost dnsmasq[318438]: DNS service limited to local subnets Nov 23 05:00:54 localhost 
dnsmasq[318438]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:00:54 localhost dnsmasq[318438]: warning: no upstream servers configured Nov 23 05:00:54 localhost dnsmasq-dhcp[318438]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:00:54 localhost dnsmasq[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:54 localhost dnsmasq-dhcp[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:54 localhost dnsmasq-dhcp[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:54 localhost podman[318425]: 2025-11-23 10:00:54.228441171 +0000 UTC m=+0.117267994 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:00:54 localhost podman[318425]: 2025-11-23 10:00:54.245316581 +0000 UTC m=+0.134143424 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true) Nov 23 05:00:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 05:00:54 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 05:00:54 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:54.295 263258 INFO neutron.agent.dhcp.agent [None req-60f7553b-97b6-4301-adf4-1115fb7be616 - - - - - -] DHCP configuration for ports {'01290de6-b98c-45d9-85a6-538e887d4c81', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:00:54 localhost podman[318447]: 2025-11-23 10:00:54.338008836 +0000 UTC m=+0.068682506 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', 
'--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 05:00:54 localhost podman[318447]: 2025-11-23 10:00:54.348607223 +0000 UTC m=+0.079280903 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 05:00:54 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 05:00:54 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:54.443 2 INFO neutron.agent.securitygroups_rpc [None req-0abb4626-8029-4616-a7f3-bc7ec334c676 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:00:54 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:54.497 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:00:53Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=70e0899a-4b27-45ec-a4ef-84aceb7179a1, ip_allocation=immediate, mac_address=fa:16:3e:b6:d9:2a, name=tempest-NetworksTestDHCPv6-743822975, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=23, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['1cec0113-a0ae-49ad-b829-a2e2ac9244c4', '90eab0bc-4d60-4712-8f88-98aa394aa338'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:49Z, vlan_transparent=None, 
network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1421, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:00:54Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:00:54 localhost nova_compute[281952]: 2025-11-23 10:00:54.525 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:54 localhost dnsmasq[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses Nov 23 05:00:54 localhost dnsmasq-dhcp[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:54 localhost podman[318486]: 2025-11-23 10:00:54.736957696 +0000 UTC m=+0.052250341 container kill e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:00:54 localhost dnsmasq-dhcp[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:55 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:55.207 263258 INFO neutron.agent.dhcp.agent [None req-85747f76-71b9-4c97-8364-49127a858e90 - - - - - -] DHCP configuration for ports {'70e0899a-4b27-45ec-a4ef-84aceb7179a1'} is completed#033[00m Nov 23 05:00:55 localhost 
neutron_sriov_agent[256124]: 2025-11-23 10:00:55.787 2 INFO neutron.agent.securitygroups_rpc [None req-f375f660-8a27-403d-b89f-e04d1f758be2 2cfd21f178604be289d8bb16b3b9c18f 0f8848490fb54a5cb41f1607121a115c - - default default] Security group member updated ['c9d46e70-8b37-41f1-b62d-e1679c8d4c9c']#033[00m Nov 23 05:00:56 localhost nova_compute[281952]: 2025-11-23 10:00:56.413 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:56 localhost nova_compute[281952]: 2025-11-23 10:00:56.423 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:00:56 localhost neutron_sriov_agent[256124]: 2025-11-23 10:00:56.616 2 INFO neutron.agent.securitygroups_rpc [None req-9e459c8e-445e-40f9-9db8-24291ed822a4 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:00:56 localhost dnsmasq[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:56 localhost dnsmasq-dhcp[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:56 localhost podman[318525]: 2025-11-23 10:00:56.867860819 +0000 UTC m=+0.058221404 container kill e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:00:56 localhost 
dnsmasq-dhcp[318438]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:00:58 localhost dnsmasq[318438]: exiting on receipt of SIGTERM Nov 23 05:00:58 localhost podman[318566]: 2025-11-23 10:00:58.409994846 +0000 UTC m=+0.050306721 container kill e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:58 localhost systemd[1]: libpod-e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d.scope: Deactivated successfully. Nov 23 05:00:58 localhost podman[318581]: 2025-11-23 10:00:58.479785216 +0000 UTC m=+0.053569421 container died e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 05:00:58 localhost systemd[1]: tmp-crun.Nu0Zbk.mount: Deactivated successfully. 
Nov 23 05:00:58 localhost podman[318581]: 2025-11-23 10:00:58.518701075 +0000 UTC m=+0.092485250 container cleanup e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 05:00:58 localhost systemd[1]: libpod-conmon-e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d.scope: Deactivated successfully. Nov 23 05:00:58 localhost podman[318580]: 2025-11-23 10:00:58.544465288 +0000 UTC m=+0.116238172 container remove e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true) Nov 23 05:00:59 localhost systemd[1]: var-lib-containers-storage-overlay-b3db0c81f1e124d9d69f888333884ab63f725fc79d437184bc44f72fbf237162-merged.mount: Deactivated successfully. Nov 23 05:00:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e86c89ed09ae56a7e6097899c40de619ac9a490b2cd672daa4219cbf92db260d-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:00:59 localhost podman[318659]: Nov 23 05:00:59 localhost podman[318659]: 2025-11-23 10:00:59.493560875 +0000 UTC m=+0.087031532 container create 1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:59 localhost systemd[1]: Started libpod-conmon-1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3.scope. Nov 23 05:00:59 localhost systemd[1]: tmp-crun.608Xg4.mount: Deactivated successfully. Nov 23 05:00:59 localhost systemd[1]: Started libcrun container. Nov 23 05:00:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65662f1a439f1f795895507cf085174a5b8f6eedbf8689b96d3d51fbe5a3798d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:00:59 localhost podman[318659]: 2025-11-23 10:00:59.45675154 +0000 UTC m=+0.050222257 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:00:59 localhost podman[318659]: 2025-11-23 10:00:59.563659214 +0000 UTC m=+0.157129871 container init 1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 23 05:00:59 localhost podman[318659]: 2025-11-23 10:00:59.572776545 +0000 UTC m=+0.166247202 container start 1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 23 05:00:59 localhost dnsmasq[318678]: started, version 2.85 cachesize 150 Nov 23 05:00:59 localhost dnsmasq[318678]: DNS service limited to local subnets Nov 23 05:00:59 localhost dnsmasq[318678]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:00:59 localhost dnsmasq[318678]: warning: no upstream servers configured Nov 23 05:00:59 localhost dnsmasq-dhcp[318678]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:00:59 localhost dnsmasq[318678]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:00:59 localhost dnsmasq-dhcp[318678]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:00:59 localhost dnsmasq-dhcp[318678]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:00:59 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:00:59.828 263258 INFO neutron.agent.dhcp.agent [None req-a4eba1d5-a06b-4f60-8f7c-35388d336310 - - - - - -] DHCP configuration for ports {'01290de6-b98c-45d9-85a6-538e887d4c81', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:00:59 localhost 
dnsmasq[318678]: exiting on receipt of SIGTERM Nov 23 05:00:59 localhost podman[318696]: 2025-11-23 10:00:59.921669933 +0000 UTC m=+0.042041966 container kill 1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:00:59 localhost systemd[1]: libpod-1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3.scope: Deactivated successfully. Nov 23 05:00:59 localhost openstack_network_exporter[242668]: ERROR 10:00:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:00:59 localhost openstack_network_exporter[242668]: ERROR 10:00:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:00:59 localhost openstack_network_exporter[242668]: ERROR 10:00:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:00:59 localhost openstack_network_exporter[242668]: ERROR 10:00:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:00:59 localhost openstack_network_exporter[242668]: Nov 23 05:00:59 localhost openstack_network_exporter[242668]: ERROR 10:00:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:00:59 localhost openstack_network_exporter[242668]: Nov 23 05:00:59 localhost podman[318709]: 2025-11-23 10:00:59.995355303 +0000 UTC m=+0.060922898 container died 1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:01:00 localhost podman[318709]: 2025-11-23 10:01:00.029719752 +0000 UTC m=+0.095287367 container cleanup 1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:01:00 localhost systemd[1]: libpod-conmon-1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3.scope: Deactivated successfully. 
Nov 23 05:01:00 localhost podman[318711]: 2025-11-23 10:01:00.07605326 +0000 UTC m=+0.135822676 container remove 1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:00 localhost systemd[1]: var-lib-containers-storage-overlay-65662f1a439f1f795895507cf085174a5b8f6eedbf8689b96d3d51fbe5a3798d-merged.mount: Deactivated successfully. Nov 23 05:01:00 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1dfa57a92fcb02923b21e185a4c698627f49ca4fc8f167ebb755b24e825785d3-userdata-shm.mount: Deactivated successfully. Nov 23 05:01:00 localhost podman[318786]: Nov 23 05:01:00 localhost podman[318786]: 2025-11-23 10:01:00.917788069 +0000 UTC m=+0.074264148 container create cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:01:00 localhost systemd[1]: Started libpod-conmon-cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3.scope. 
Nov 23 05:01:00 localhost podman[318786]: 2025-11-23 10:01:00.886944579 +0000 UTC m=+0.043420698 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:01:00 localhost systemd[1]: Started libcrun container. Nov 23 05:01:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bef31403b2f37e816ee2cb2ed542775de4312b960541de464db08bad6657575/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:01:00 localhost podman[318786]: 2025-11-23 10:01:00.998925769 +0000 UTC m=+0.155401848 container init cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 05:01:01 localhost podman[318786]: 2025-11-23 10:01:01.007227484 +0000 UTC m=+0.163703563 container start cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 05:01:01 localhost dnsmasq[318805]: started, version 2.85 cachesize 150 Nov 23 05:01:01 localhost dnsmasq[318805]: DNS service limited to local subnets Nov 23 05:01:01 localhost 
dnsmasq[318805]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:01:01 localhost dnsmasq[318805]: warning: no upstream servers configured Nov 23 05:01:01 localhost dnsmasq-dhcp[318805]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:01:01 localhost dnsmasq[318805]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:01:01 localhost dnsmasq-dhcp[318805]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:01:01 localhost dnsmasq-dhcp[318805]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:01:01 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:01.237 263258 INFO neutron.agent.dhcp.agent [None req-df7d977e-a85c-48c3-9718-c4dc21c10d72 - - - - - -] DHCP configuration for ports {'01290de6-b98c-45d9-85a6-538e887d4c81', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:01:01 localhost nova_compute[281952]: 2025-11-23 10:01:01.415 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:01 localhost nova_compute[281952]: 2025-11-23 10:01:01.424 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:02 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:02.104 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 2001:db8::f816:3eff:feec:8f43'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2 2001:db8::f816:3eff:feec:8f43'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:02 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:02.106 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated#033[00m Nov 23 05:01:02 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:02.108 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 42a1827f-ec9e-4b50-b5fa-3bf13a8c1da0 IP addresses were not 
retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:01:02 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:02.108 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:02 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:02.109 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e83def7a-d9c0-4ef0-9509-3adae6d28c96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:02 localhost nova_compute[281952]: 2025-11-23 10:01:02.525 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:02 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:02.877 2 INFO neutron.agent.securitygroups_rpc [None req-6be775ef-69f0-4d4b-8550-9fb2b9ad120e 2cfd21f178604be289d8bb16b3b9c18f 0f8848490fb54a5cb41f1607121a115c - - default default] Security group member updated ['c9d46e70-8b37-41f1-b62d-e1679c8d4c9c']#033[00m Nov 23 05:01:02 localhost dnsmasq[318805]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:01:02 localhost dnsmasq-dhcp[318805]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:01:02 localhost dnsmasq-dhcp[318805]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:01:02 localhost podman[318832]: 2025-11-23 10:01:02.963216558 +0000 UTC m=+0.058194173 container kill cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, 
org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:01:03 localhost ovn_controller[154788]: 2025-11-23T10:01:03Z|00245|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:01:03 localhost nova_compute[281952]: 2025-11-23 10:01:03.106 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:03 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:03.294 263258 INFO neutron.agent.dhcp.agent [None req-bdd693fd-2955-49da-819e-4a190ec18cb9 - - - - - -] DHCP configuration for ports {'01290de6-b98c-45d9-85a6-538e887d4c81', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:01:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:01:03 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:03.414 2 INFO neutron.agent.securitygroups_rpc [None req-4c75c4e7-1ac8-40bf-8bb2-ec0e75e4208b a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']#033[00m Nov 23 05:01:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 05:01:04 localhost podman[318852]: 2025-11-23 10:01:04.031080534 +0000 UTC m=+0.085788633 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute) Nov 23 05:01:04 localhost podman[318852]: 2025-11-23 10:01:04.042357582 +0000 UTC m=+0.097065711 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Nov 23 05:01:04 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 05:01:04 localhost dnsmasq[318805]: exiting on receipt of SIGTERM Nov 23 05:01:04 localhost podman[318888]: 2025-11-23 10:01:04.317093945 +0000 UTC m=+0.067054746 container kill cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:01:04 localhost systemd[1]: libpod-cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3.scope: Deactivated successfully. Nov 23 05:01:04 localhost podman[318902]: 2025-11-23 10:01:04.383829771 +0000 UTC m=+0.055036396 container died cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 23 05:01:04 localhost podman[318902]: 2025-11-23 10:01:04.420642005 +0000 UTC m=+0.091848630 container cleanup cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:01:04 localhost systemd[1]: libpod-conmon-cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3.scope: Deactivated successfully. Nov 23 05:01:04 localhost podman[318909]: 2025-11-23 10:01:04.461722331 +0000 UTC m=+0.121584266 container remove cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:01:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:01:05 localhost systemd[1]: var-lib-containers-storage-overlay-6bef31403b2f37e816ee2cb2ed542775de4312b960541de464db08bad6657575-merged.mount: Deactivated successfully. Nov 23 05:01:05 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cabe7b5e7d002675e19e5131f5fa06af060289b50e45be217f09c4195c05c9f3-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:01:05 localhost podman[318933]: 2025-11-23 10:01:05.036991141 +0000 UTC m=+0.091770147 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 05:01:05 localhost podman[318933]: 2025-11-23 10:01:05.045167753 +0000 UTC m=+0.099946779 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 05:01:05 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:01:05 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:05.717 263258 INFO neutron.agent.linux.ip_lib [None req-5d853321-94ca-4949-b3fd-1a7ae0b86b56 - - - - - -] Device tap3016fb40-93 cannot be used as it has no MAC address#033[00m Nov 23 05:01:05 localhost nova_compute[281952]: 2025-11-23 10:01:05.740 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:05 localhost kernel: device tap3016fb40-93 entered promiscuous mode Nov 23 05:01:05 localhost NetworkManager[5975]: [1763892065.7478] manager: (tap3016fb40-93): new Generic device (/org/freedesktop/NetworkManager/Devices/43) Nov 23 05:01:05 localhost nova_compute[281952]: 2025-11-23 10:01:05.748 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:05 localhost ovn_controller[154788]: 2025-11-23T10:01:05Z|00246|binding|INFO|Claiming lport 3016fb40-93ab-4df3-956c-74e722dc2fa2 for this chassis. Nov 23 05:01:05 localhost ovn_controller[154788]: 2025-11-23T10:01:05Z|00247|binding|INFO|3016fb40-93ab-4df3-956c-74e722dc2fa2: Claiming unknown Nov 23 05:01:05 localhost systemd-udevd[319006]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:01:05 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:05.760 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-d0e9752d-2178-4eb0-b091-dd4d434021e5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d0e9752d-2178-4eb0-b091-dd4d434021e5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de25a77c-b919-41f6-9a1b-fd3e354e84bf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3016fb40-93ab-4df3-956c-74e722dc2fa2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:05 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:05.761 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 3016fb40-93ab-4df3-956c-74e722dc2fa2 in datapath d0e9752d-2178-4eb0-b091-dd4d434021e5 bound to our chassis#033[00m Nov 23 05:01:05 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:05.762 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d0e9752d-2178-4eb0-b091-dd4d434021e5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:01:05 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:05.762 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[50cc67d6-e375-41fd-bd90-5ff995fc45a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:05 localhost ovn_controller[154788]: 2025-11-23T10:01:05Z|00248|binding|INFO|Setting lport 3016fb40-93ab-4df3-956c-74e722dc2fa2 ovn-installed in OVS Nov 23 05:01:05 localhost ovn_controller[154788]: 2025-11-23T10:01:05Z|00249|binding|INFO|Setting lport 3016fb40-93ab-4df3-956c-74e722dc2fa2 up in Southbound Nov 23 05:01:05 localhost nova_compute[281952]: 2025-11-23 10:01:05.773 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:05 localhost nova_compute[281952]: 2025-11-23 10:01:05.808 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:05 localhost nova_compute[281952]: 2025-11-23 10:01:05.833 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:05 localhost podman[319027]: Nov 23 05:01:05 localhost podman[319027]: 2025-11-23 10:01:05.930093364 +0000 UTC m=+0.071207864 container create 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) 
Nov 23 05:01:05 localhost systemd[1]: Started libpod-conmon-6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a.scope. Nov 23 05:01:05 localhost podman[319027]: 2025-11-23 10:01:05.888337218 +0000 UTC m=+0.029451728 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:01:05 localhost systemd[1]: Started libcrun container. Nov 23 05:01:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/899e9acb3d362d31418b6273b6c3450aeed57ce3d96336857afd4db90a4a068a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:01:06 localhost podman[319027]: 2025-11-23 10:01:06.007739536 +0000 UTC m=+0.148854036 container init 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS) Nov 23 05:01:06 localhost podman[319027]: 2025-11-23 10:01:06.017834598 +0000 UTC m=+0.158949098 container start 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 23 05:01:06 localhost dnsmasq[319052]: started, 
version 2.85 cachesize 150 Nov 23 05:01:06 localhost dnsmasq[319052]: DNS service limited to local subnets Nov 23 05:01:06 localhost dnsmasq[319052]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:01:06 localhost dnsmasq[319052]: warning: no upstream servers configured Nov 23 05:01:06 localhost dnsmasq-dhcp[319052]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:01:06 localhost dnsmasq-dhcp[319052]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:01:06 localhost dnsmasq[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:01:06 localhost dnsmasq-dhcp[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:01:06 localhost dnsmasq-dhcp[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:01:06 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:06.337 263258 INFO neutron.agent.dhcp.agent [None req-05d11236-511b-490e-bf50-7df7d5b12e40 - - - - - -] DHCP configuration for ports {'01290de6-b98c-45d9-85a6-538e887d4c81', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:01:06 localhost nova_compute[281952]: 2025-11-23 10:01:06.418 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:06 localhost nova_compute[281952]: 2025-11-23 10:01:06.425 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:06 localhost podman[319090]: Nov 23 05:01:06 localhost podman[319090]: 2025-11-23 10:01:06.712202497 +0000 UTC m=+0.088660091 container create d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:06 localhost systemd[1]: Started libpod-conmon-d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186.scope. Nov 23 05:01:06 localhost systemd[1]: Started libcrun container. Nov 23 05:01:06 localhost podman[319090]: 2025-11-23 10:01:06.669837522 +0000 UTC m=+0.046295166 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:01:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b893594a3b7f21ead5de7308fbf557090d81368903c395e5edbb120a0248d337/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:01:06 localhost podman[319090]: 2025-11-23 10:01:06.784548836 +0000 UTC m=+0.161006430 container init d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 23 05:01:06 localhost podman[319090]: 2025-11-23 10:01:06.790882872 +0000 UTC m=+0.167340466 container start d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:01:06 localhost dnsmasq[319108]: started, version 2.85 cachesize 150 Nov 23 05:01:06 localhost dnsmasq[319108]: DNS service limited to local subnets Nov 23 05:01:06 localhost dnsmasq[319108]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:01:06 localhost dnsmasq[319108]: warning: no upstream servers configured Nov 23 05:01:06 localhost dnsmasq-dhcp[319108]: DHCP, static leases only on 10.103.0.0, lease time 1d Nov 23 05:01:06 localhost dnsmasq[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/addn_hosts - 0 addresses Nov 23 05:01:06 localhost dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/host Nov 23 05:01:06 localhost dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/opts Nov 23 05:01:06 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e113 e113: 6 total, 6 up, 6 in Nov 23 05:01:06 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:06.987 263258 INFO neutron.agent.dhcp.agent [None req-f4c996c7-3eca-4f8d-bff8-201a20995022 - - - - - -] DHCP configuration for ports {'89de9cee-45b5-4855-ac12-45fb015ff1d7'} is completed#033[00m Nov 23 05:01:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:07.219 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, 
old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.3 2001:db8::f816:3eff:feec:8f43'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 2001:db8::f816:3eff:feec:8f43'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:07.220 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata 
Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated#033[00m Nov 23 05:01:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:07.222 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 42a1827f-ec9e-4b50-b5fa-3bf13a8c1da0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:01:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:07.222 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:07.223 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ddcb4714-832d-4c3a-ad45-d41eec655a52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:07 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e114 e114: 6 total, 6 up, 6 in Nov 23 05:01:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:08.322 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:07Z, description=, device_id=eaab74f7-5e3e-4996-bbac-c869375065e2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b97eb3d5-f95d-413c-ad24-47f79a3d2882, ip_allocation=immediate, mac_address=fa:16:3e:27:e5:8b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:02Z, description=, dns_domain=, id=d0e9752d-2178-4eb0-b091-dd4d434021e5, ipv4_address_scope=None, 
ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-635928538, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=29310, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1457, status=ACTIVE, subnets=['e0fb5c14-e60a-431f-a11a-0be1e976c275'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:01:04Z, vlan_transparent=None, network_id=d0e9752d-2178-4eb0-b091-dd4d434021e5, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1473, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:01:08Z on network d0e9752d-2178-4eb0-b091-dd4d434021e5#033[00m Nov 23 05:01:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:01:08 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:08.498 2 INFO neutron.agent.securitygroups_rpc [None req-d30670ed-c29e-4282-92c2-f353a53316ea 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:01:08 localhost dnsmasq[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/addn_hosts - 1 addresses Nov 23 05:01:08 localhost dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/host Nov 23 05:01:08 localhost dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/opts Nov 23 05:01:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:08.545 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, 
allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:07Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=8e4502b6-105d-4e98-9e94-b93c8b85eb68, ip_allocation=immediate, mac_address=fa:16:3e:b1:eb:ac, name=tempest-NetworksTestDHCPv6-1058152510, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=27, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['e8e28b0a-fce4-45c6-b501-a05bd7ea67b9', 'fa0541e1-f00b-4a69-b039-1c3d2c2010ab'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:03Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1472, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:08Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:01:08 localhost podman[319127]: 2025-11-23 10:01:08.545385541 +0000 UTC m=+0.068716999 container kill d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 23 05:01:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:08.829 263258 INFO neutron.agent.dhcp.agent [None req-dbbdeeb0-5ec2-4c1b-86c5-563917620282 - - - - - -] DHCP configuration for ports {'b97eb3d5-f95d-413c-ad24-47f79a3d2882'} is completed#033[00m Nov 23 05:01:08 localhost podman[319165]: 2025-11-23 10:01:08.905473703 +0000 UTC m=+0.060906897 container kill 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 05:01:08 localhost dnsmasq[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses Nov 23 05:01:08 localhost dnsmasq-dhcp[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:01:08 localhost dnsmasq-dhcp[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:01:09 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:09.192 263258 INFO neutron.agent.dhcp.agent [None req-7a469220-c6c9-4613-bd8f-baccb0831258 - - - - - -] DHCP configuration for ports {'8e4502b6-105d-4e98-9e94-b93c8b85eb68'} is completed#033[00m Nov 23 05:01:09 localhost 
nova_compute[281952]: 2025-11-23 10:01:09.197 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:09.299 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:01:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:09.300 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:01:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:09.301 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:01:09 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:09.415 2 INFO neutron.agent.securitygroups_rpc [None req-ee598349-ae04-45e4-9403-8b439fe516e0 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:01:09 localhost dnsmasq[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:01:09 localhost podman[319204]: 2025-11-23 10:01:09.666005512 +0000 UTC m=+0.062438574 container kill 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:09 localhost dnsmasq-dhcp[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:01:09 localhost dnsmasq-dhcp[319052]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.809 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.810 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.834 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.835 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7756857-11f3-43ee-8351-eff5c460aa7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.810414', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '568c3494-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': '0100ba19e0b8bac0bd93a13dd9c2b09cc7943e401296e6e23697a0a6386ef330'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.810414', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '568c448e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': '4976d2a99748263577d4c1ec9b799520629e699cda78994ed442b8c2c37d2b12'}]}, 'timestamp': '2025-11-23 10:01:10.836097', '_unique_id': '66d94295cec14f45845baf327fd9ee4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.837 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.838 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.838 12 DEBUG ceilometer.compute.pollsters 
[-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.838 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a098503-56ca-4ce3-9075-e1c347c995a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.838590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 
'568cb2ca-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': 'ad286d0e2d90e07ec3d86b8f7c3b35c92282a0176cf6169a409adabcb2eb419d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.838590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '568cbfc2-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': 'b29669fc20955e9e0f315081a027829db3f2154dc407f3f7d045218107c44bea'}]}, 'timestamp': '2025-11-23 10:01:10.839227', '_unique_id': '876eb37fb1e9435f9e2e246a119605b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 
05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.839 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.840 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 05:01:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.840 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.840 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cc8ac29-297a-4f20-8ab2-254900ab75c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.840650', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '568d02e8-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': 'f3753301183d396104c3fe137c6fd0fd6820dda7d2378268c96d0ce0caf67b50'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.840650', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '568d0ea0-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': 'df937c906553f524a5d57fc63e696ccc043fc2528df1f9c8d65a315615561dff'}]}, 'timestamp': '2025-11-23 10:01:10.841241', '_unique_id': '9f00ca414a9f457496abb78ece338ee3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:01:10.841 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.841 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.842 12 INFO 
ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.842 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.842 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a9c2918-3749-4314-abd5-a39ca618d12a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.842653', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '568d5144-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': 'cad3dd91056d9c555f2589c920cc10399648382a0333ee44682532a9578b1c91'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.842653', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '568d5b80-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': '6393aa92b852f4efcad76ef4bc2067db5c7685c5b2a4912b1fa693eb8d24b8dc'}]}, 'timestamp': '2025-11-23 10:01:10.843211', '_unique_id': '3c0317c505b64e75891aa7ebf8eb93e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging Traceback (most 
recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging Nov 
23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.843 12 ERROR oslo_messaging.notify.messaging Nov 23 
05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.844 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.844 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 23 05:01:10 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:10.845 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:07Z, description=, device_id=eaab74f7-5e3e-4996-bbac-c869375065e2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b97eb3d5-f95d-413c-ad24-47f79a3d2882, ip_allocation=immediate, mac_address=fa:16:3e:27:e5:8b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:02Z, description=, dns_domain=, id=d0e9752d-2178-4eb0-b091-dd4d434021e5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-635928538, port_security_enabled=True, project_id=79509bc833494f3598e01347dc55dea9, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=29310, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1457, status=ACTIVE, subnets=['e0fb5c14-e60a-431f-a11a-0be1e976c275'], tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:01:04Z, vlan_transparent=None, network_id=d0e9752d-2178-4eb0-b091-dd4d434021e5, port_security_enabled=False, project_id=79509bc833494f3598e01347dc55dea9, qos_network_policy_id=None, 
qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1473, status=DOWN, tags=[], tenant_id=79509bc833494f3598e01347dc55dea9, updated_at=2025-11-23T10:01:08Z on network d0e9752d-2178-4eb0-b091-dd4d434021e5 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.848 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8269b53-69ce-463c-929a-e1f09097f9a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.844584', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '568e4112-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': '8b3f1c3e9f6d55f62bdf973f172a4331f76484aa8c557b5be3b6d3020cf5f1f2'}]}, 'timestamp': '2025-11-23 10:01:10.849133', '_unique_id': '17e77d7785c14ea2954011b740a306ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 
05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging 
self.sock.connect(sa) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging return 
self._send(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging return 
self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.849 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.850 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.864 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 16590000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a1694dab-39c3-40ef-b3bd-0ae80cb94051', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16590000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:01:10.850726', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '56909d18-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.041592407, 'message_signature': '3b704ffdb342c0d9a37c6765a03c6c0c8bbf4b047a8020f7ab2f284301333fb0'}]}, 'timestamp': '2025-11-23 10:01:10.864615', '_unique_id': '4594194ee12f40d68e2894c3f5b40132'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 
05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.865 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.866 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 23 05:01:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.866 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ef142de-ebb5-40d6-8110-6ee29d459d75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.866523', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '5690f632-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 
'message_signature': '49b6127de0c7ea6d878c2f6d7ecd1d8db801e77ab7b9e68d843fec04399dc017'}]}, 'timestamp': '2025-11-23 10:01:10.866907', '_unique_id': 'b3eb20f9e18544af83356c680a18396b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.867 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.868 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5536d7a4-3a9d-48dc-8f59-0c7518fe4758', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.868600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5691468c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': 'ff6a9a8d92f80bdc634cd5cba86804b920f9462282f38d5b736287b97e250127'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.868600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56915190-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': '6e952e36bd9495ac036e11066211521c742123c523c3c49af6a4f99a32a13d78'}]}, 'timestamp': '2025-11-23 10:01:10.869164', '_unique_id': '65f19c359f9941769ca553e0ff9ecb35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.869 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.870 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.881 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:01:10
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14a49a84-91e7-48e2-80e6-2964eb5cb7bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.870607', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '56931fb6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.048240033, 'message_signature': '7d44dfb1d71927fc6e58a813a553500180c414b7df4c87a5aea6fc8489885c4b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': 
'2025-11-23T10:01:10.870607', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '56932e20-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.048240033, 'message_signature': 'ea73a86f4305fb1cf2c850f65c8f72681651da6156771524843d2739b0657612'}]}, 'timestamp': '2025-11-23 10:01:10.881384', '_unique_id': 'c01dc75df53f413c92050e9279561b30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.882 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.883 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'f72bd289-2bca-4bfc-9ca8-e8765545ee45', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.883300', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '56938546-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': '7755703b120c3a6542d5f2699e287b90fc521f22a628b61d510205e10635901c'}]}, 'timestamp': '2025-11-23 10:01:10.883651', '_unique_id': '6101097430774cd99ad5dc48e5b57acf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.884 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.885 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.885 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f62a3db9-b710-48a6-a6ca-161a1c578b54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.885228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5693d08c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': '87a0490763d3b5c98a8331fe7e81ffb36dee439850287f91f004b0f003b82937'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.885228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5693dbcc-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12013.988042268, 'message_signature': 'e718af9a10ea165d6660524c7fae8e510b747a8f25e804aa02a41a308a6158aa'}]}, 'timestamp': '2025-11-23 10:01:10.885821', '_unique_id': '7989bfab31954f4996e7df50eeb31148'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.886 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.887 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.887 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3ff272e-3dd0-4a85-b36d-2eab3b7176dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.887451', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '56942730-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': '2c07eb7c36754ad64283f70c3641a823b6565a793f48d34a5b00454e7de182d7'}]}, 'timestamp': '2025-11-23 10:01:10.887764', '_unique_id': '22d37ce43957405484a7338f1c22732b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:01:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.888 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.889 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.889 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.889 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '47c1c677-9cb2-4511-a423-de5663925535', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.889235', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '56946cae-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.048240033, 'message_signature': '78e9e872415e267599f6704945d537c1e4175422b55544c71e4df435188d3370'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.889235', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5694778a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.048240033, 'message_signature': '02df1829ff3ab380a03b537ea4c67a9db3f20c8c7d010cf0f86c88138cccf321'}]}, 'timestamp': '2025-11-23 10:01:10.889799', '_unique_id': '461aa7e7786b424995007153ae198e3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:01:10.890 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:01:10.890 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.890 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.891 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.891 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f8c67f04-51d6-4f06-9080-f81ea9e40db2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.891347', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '5694bf24-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': 'a20ee216bfb3d1a798172e8de7ce1a965c00e9120d9ee58101ed1813c932b929'}]}, 'timestamp': '2025-11-23 10:01:10.891653', '_unique_id': 'b25229176a1846198d67971db53f79af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:01:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.892 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.893 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.893 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.893 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.893 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '79ded583-9029-449d-acae-9dd59ba61969', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.893416', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '56950fc4-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': 'e940b4aae7d3b7653565c26d6770127fb4e073efb4b201d218d18d9d7bfafb95'}]}, 'timestamp': '2025-11-23 10:01:10.893717', '_unique_id': 'cd21ff49a035494ca77c1ab5cf3cffe5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.894 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.895 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90b7663d-6397-4f47-b425-94f57b0325fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:01:10.895206', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '569555f6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.041592407, 'message_signature': '2c7d37ca84ec6d3ac96dc8caf9518e5c6219d6f566877a29ad8234eadff8020d'}]}, 'timestamp': '2025-11-23 10:01:10.895506', '_unique_id': '26101b24841a4a72a83bfee7be604bcf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.896 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d40326e-e68d-4ed1-b22e-3ffe062a5531', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.896973', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '56959b06-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': '901acacef84c13b274bae5e75e2e58f65db5dd523563b25c118531ffb1ec2fde'}]}, 'timestamp': '2025-11-23 10:01:10.897283', '_unique_id': 'd7435fe0962d425a9bcdf249c24a9e51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.897 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.898 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.898 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7d1c39c-d27d-4b2b-ac49-bd6f5d3d0c3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.898688', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '5695ddfa-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': '664d7ef3ad580c211d0fd5c36396b74aece2eb8e8b96efe48627347ca3016c7e'}]}, 'timestamp': '2025-11-23 10:01:10.899064', '_unique_id': '174a0203678446e1b186070a46dc7fb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:01:10
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.899 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.900 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '73ecd5d3-7d74-4a4b-ab33-0b38cf9109e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.900692', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '56962c42-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': '8a4cc52d3e4f9b184b2447417d70a1ca50f5370dfbd332ceb27ed1d128b2cb26'}]}, 'timestamp': '2025-11-23 10:01:10.901030', '_unique_id': 'f1b9d010a57d4f90bc02417f06cd10ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:01:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 05:01:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.902 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fd402703-5ca7-4c95-94a1-8cac8406cbfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:01:10.902586', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '569675da-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.048240033, 'message_signature': '2cbf3db59250c046d40651b33910ebb6575335875209c22d9b1e020c29150d9b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:01:10.902586', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '569681ce-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.048240033, 'message_signature': '8c91e4c8f360893ee8907c27f3d8ab9787f76313a900f99234bc1e9adf70bbf9'}]}, 'timestamp': '2025-11-23 10:01:10.903170', '_unique_id': 'efedb6418f684400bc4753944eb56f31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:01:10 localhost systemd-journald[48157]: Data hash table of /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal has a fill level at 75.0 (53724 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation. Nov 23 05:01:10 localhost systemd-journald[48157]: /run/log/journal/6e0090cd4cf296f54418e234b90f721c/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 23 05:01:10 localhost rsyslogd[760]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging Nov 23 
05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.903 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.904 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.904 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36e55255-07d8-48fb-99f2-7104d8978538', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:01:10.904693', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '5696c83c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12014.022214481, 'message_signature': 'a15212f16063793f4e80404a4b8a9000b34ecad8be163a159aaeca788a19b437'}]}, 'timestamp': '2025-11-23 10:01:10.905021', '_unique_id': '875957f53e8c4e4284b4d8bf1ff61955'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:01:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:01:10.905 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:01:11 localhost dnsmasq[319052]: exiting on receipt of SIGTERM
Nov 23 05:01:11 localhost systemd[1]: tmp-crun.LEFP9Y.mount: Deactivated successfully.
Nov 23 05:01:11 localhost systemd[1]: libpod-6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a.scope: Deactivated successfully.
Nov 23 05:01:11 localhost podman[319255]: 2025-11-23 10:01:11.057840058 +0000 UTC m=+0.069560364 container kill 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:01:11 localhost podman[319262]: 2025-11-23 10:01:11.077952748 +0000 UTC m=+0.063320832 container kill d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 05:01:11 localhost dnsmasq[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/addn_hosts - 1 addresses
Nov 23 05:01:11 localhost dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/host
Nov 23 05:01:11 localhost dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/opts
Nov 23 05:01:11 localhost podman[319281]: 2025-11-23 10:01:11.138276425 +0000 UTC m=+0.063554868 container died 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 05:01:11 localhost podman[319281]: 2025-11-23 10:01:11.170175039 +0000 UTC m=+0.095453402 container cleanup 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 05:01:11 localhost systemd[1]: libpod-conmon-6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a.scope: Deactivated successfully.
Nov 23 05:01:11 localhost podman[319283]: 2025-11-23 10:01:11.215139094 +0000 UTC m=+0.125982462 container remove 6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 05:01:11 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:11.284 2 INFO neutron.agent.securitygroups_rpc [None req-4b64e528-1440-49a5-b870-0a0f4d60c275 a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']#033[00m
Nov 23 05:01:11 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 05:01:11 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/182625698' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 05:01:11 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 05:01:11 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/182625698' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 05:01:11 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:11.353 263258 INFO neutron.agent.dhcp.agent [None req-842206c5-f2eb-4756-9d89-5375128d7158 - - - - - -] DHCP configuration for ports {'b97eb3d5-f95d-413c-ad24-47f79a3d2882'} is completed#033[00m
Nov 23 05:01:11 localhost nova_compute[281952]: 2025-11-23 10:01:11.419 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:01:11 localhost nova_compute[281952]: 2025-11-23 10:01:11.426 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:01:11 localhost podman[240668]: time="2025-11-23T10:01:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 05:01:11 localhost podman[240668]: @ - - [23/Nov/2025:10:01:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159333 "" "Go-http-client/1.1"
Nov 23 05:01:11 localhost podman[240668]: @ - - [23/Nov/2025:10:01:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20194 "" "Go-http-client/1.1"
Nov 23 05:01:12 localhost podman[319369]:
Nov 23 05:01:12 localhost podman[319369]: 2025-11-23 10:01:12.01550805 +0000 UTC m=+0.144161553 container create 4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 05:01:12 localhost systemd[1]: Started libpod-conmon-4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837.scope.
Nov 23 05:01:12 localhost systemd[1]: var-lib-containers-storage-overlay-899e9acb3d362d31418b6273b6c3450aeed57ce3d96336857afd4db90a4a068a-merged.mount: Deactivated successfully.
Nov 23 05:01:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6821214eea18a75a948b51c5fb1d8e49e6164aa1e679a3a8c3c1f4f637ef916a-userdata-shm.mount: Deactivated successfully.
Nov 23 05:01:12 localhost systemd[1]: Started libcrun container.
Nov 23 05:01:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48c328cc9cc6bdb79002724ef74b00a7bbe9050aa89c002fb9e7e8bd3aac1e06/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 05:01:12 localhost podman[319369]: 2025-11-23 10:01:11.977036474 +0000 UTC m=+0.105690047 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 05:01:12 localhost podman[319369]: 2025-11-23 10:01:12.080054258 +0000 UTC m=+0.208707761 container init 4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 23 05:01:12 localhost podman[319369]: 2025-11-23 10:01:12.090111127 +0000 UTC m=+0.218764620 container start 4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:01:12 localhost dnsmasq[319387]: started, version 2.85 cachesize 150
Nov 23 05:01:12 localhost dnsmasq[319387]: DNS service limited to local subnets
Nov 23 05:01:12 localhost dnsmasq[319387]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 05:01:12 localhost dnsmasq[319387]: warning: no upstream servers configured
Nov 23 05:01:12 localhost dnsmasq-dhcp[319387]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 05:01:12 localhost dnsmasq[319387]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 05:01:12 localhost dnsmasq-dhcp[319387]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 05:01:12 localhost dnsmasq-dhcp[319387]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 05:01:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:12.329 263258 INFO neutron.agent.dhcp.agent [None req-c0715c88-85f7-4f66-b68c-48c4d6f20ef2 - - - - - -] DHCP configuration for ports {'01290de6-b98c-45d9-85a6-538e887d4c81', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m
Nov 23 05:01:12 localhost dnsmasq[319387]: exiting on receipt of SIGTERM
Nov 23 05:01:12 localhost podman[319403]: 2025-11-23 10:01:12.435785367 +0000 UTC m=+0.061992352 container kill 4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:01:12 localhost systemd[1]: libpod-4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837.scope: Deactivated successfully.
Nov 23 05:01:12 localhost podman[319417]: 2025-11-23 10:01:12.506491914 +0000 UTC m=+0.055371437 container died 4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 23 05:01:12 localhost podman[319417]: 2025-11-23 10:01:12.543765702 +0000 UTC m=+0.092645195 container cleanup 4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 05:01:12 localhost systemd[1]: libpod-conmon-4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837.scope: Deactivated successfully.
Nov 23 05:01:12 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 e115: 6 total, 6 up, 6 in
Nov 23 05:01:12 localhost podman[319418]: 2025-11-23 10:01:12.580134483 +0000 UTC m=+0.121914086 container remove 4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 05:01:12 localhost kernel: device tap01290de6-b9 left promiscuous mode
Nov 23 05:01:12 localhost ovn_controller[154788]: 2025-11-23T10:01:12Z|00250|binding|INFO|Releasing lport 01290de6-b98c-45d9-85a6-538e887d4c81 from this chassis (sb_readonly=0)
Nov 23 05:01:12 localhost nova_compute[281952]: 2025-11-23 10:01:12.643 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:01:12 localhost ovn_controller[154788]: 2025-11-23T10:01:12Z|00251|binding|INFO|Setting lport 01290de6-b98c-45d9-85a6-538e887d4c81 down in Southbound
Nov 23 05:01:12 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:12.656 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fefd:15c6/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=01290de6-b98c-45d9-85a6-538e887d4c81) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 05:01:12 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:12.658 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 01290de6-b98c-45d9-85a6-538e887d4c81 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m
Nov 23 05:01:12 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:12.661 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 23 05:01:12 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:12.662 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e877071c-3e77-41f6-8bc9-1e30b3e2a1bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 05:01:12 localhost nova_compute[281952]: 2025-11-23 10:01:12.665 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:01:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:13.026 263258 INFO neutron.agent.dhcp.agent [None req-0fe671d9-6c7a-4b47-96ce-ebf75861e63e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 23 05:01:13 localhost systemd[1]: var-lib-containers-storage-overlay-48c328cc9cc6bdb79002724ef74b00a7bbe9050aa89c002fb9e7e8bd3aac1e06-merged.mount: Deactivated successfully.
Nov 23 05:01:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4903f0d185d213a2ba432677f79ed36b46b51a08f485dba27d4a52b92f5ce837-userdata-shm.mount: Deactivated successfully.
Nov 23 05:01:13 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 05:01:13 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:13.106 2 INFO neutron.agent.securitygroups_rpc [None req-a6cbb5e4-51d9-4058-bf96-80e547b16a25 a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']#033[00m
Nov 23 05:01:13 localhost nova_compute[281952]: 2025-11-23 10:01:13.210 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 05:01:13 localhost nova_compute[281952]: 2025-11-23 10:01:13.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 05:01:13 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 05:01:13 localhost systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 23 05:01:13 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:13.823 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.3 2001:db8::f816:3eff:feec:8f43'], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 05:01:13 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:13.826 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated#033[00m
Nov 23 05:01:13 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:13.829 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 23 05:01:13 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:13.830 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[144cc3ab-55b7-4daf-81cb-23f53dad8846]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 05:01:14 localhost nova_compute[281952]: 2025-11-23 10:01:14.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 05:01:14 localhost nova_compute[281952]: 2025-11-23 10:01:14.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 05:01:14 localhost nova_compute[281952]: 2025-11-23 10:01:14.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 05:01:14
localhost nova_compute[281952]: 2025-11-23 10:01:14.473 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 05:01:14 localhost nova_compute[281952]: 2025-11-23 10:01:14.474 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 05:01:14 localhost nova_compute[281952]: 2025-11-23 10:01:14.474 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 05:01:14 localhost nova_compute[281952]: 2025-11-23 10:01:14.475 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 05:01:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:01:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:01:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 05:01:14 localhost podman[319451]: 2025-11-23 10:01:14.952399551 +0000 UTC m=+0.066740937 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 23 05:01:14 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:14.955 
263258 INFO neutron.agent.linux.ip_lib [None req-0bb28557-ccbc-4c5e-be5a-312a54185d11 - - - - - -] Device tap67c1b2ca-3e cannot be used as it has no MAC address#033[00m Nov 23 05:01:15 localhost nova_compute[281952]: 2025-11-23 10:01:15.022 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:15 localhost kernel: device tap67c1b2ca-3e entered promiscuous mode Nov 23 05:01:15 localhost NetworkManager[5975]: [1763892075.0280] manager: (tap67c1b2ca-3e): new Generic device (/org/freedesktop/NetworkManager/Devices/44) Nov 23 05:01:15 localhost ovn_controller[154788]: 2025-11-23T10:01:15Z|00252|binding|INFO|Claiming lport 67c1b2ca-3e93-4592-92bd-bb626f12e09a for this chassis. Nov 23 05:01:15 localhost ovn_controller[154788]: 2025-11-23T10:01:15Z|00253|binding|INFO|67c1b2ca-3e93-4592-92bd-bb626f12e09a: Claiming unknown Nov 23 05:01:15 localhost nova_compute[281952]: 2025-11-23 10:01:15.027 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:15 localhost ovn_controller[154788]: 2025-11-23T10:01:15Z|00254|binding|INFO|Setting lport 67c1b2ca-3e93-4592-92bd-bb626f12e09a ovn-installed in OVS Nov 23 05:01:15 localhost nova_compute[281952]: 2025-11-23 10:01:15.032 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:15 localhost systemd-udevd[319494]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:01:15 localhost nova_compute[281952]: 2025-11-23 10:01:15.035 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:15 localhost ovn_controller[154788]: 2025-11-23T10:01:15Z|00255|binding|INFO|Setting lport 67c1b2ca-3e93-4592-92bd-bb626f12e09a up in Southbound Nov 23 05:01:15 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:15.042 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=67c1b2ca-3e93-4592-92bd-bb626f12e09a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:15 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:15.044 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 67c1b2ca-3e93-4592-92bd-bb626f12e09a in datapath 
6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 05:01:15 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:15.045 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 612afa0a-a362-44aa-a6db-9b2f2106f648 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:01:15 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:15.045 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:15 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:15.046 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[250861cb-1f84-403c-9966-063aa7b8d9ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:15 localhost journal[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device Nov 23 05:01:15 localhost journal[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device Nov 23 05:01:15 localhost podman[319453]: 2025-11-23 10:01:15.058924393 +0000 UTC m=+0.169091170 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-type=git, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 23 05:01:15 localhost nova_compute[281952]: 2025-11-23 10:01:15.058 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:15 localhost journal[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device Nov 23 05:01:15 localhost journal[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device Nov 23 05:01:15 localhost journal[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device Nov 23 05:01:15 localhost journal[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device Nov 23 05:01:15 localhost journal[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device Nov 23 05:01:15 localhost journal[230249]: ethtool ioctl error on tap67c1b2ca-3e: No such device Nov 23 05:01:15 localhost podman[319451]: 2025-11-23 10:01:15.075442802 +0000 UTC m=+0.189784198 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:15 localhost nova_compute[281952]: 2025-11-23 10:01:15.083 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:15 localhost podman[319453]: 2025-11-23 10:01:15.087796362 +0000 UTC m=+0.197963119 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, container_name=openstack_network_exporter, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, release=1755695350, io.openshift.tags=minimal rhel9, config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Nov 23 05:01:15 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:01:15 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 05:01:15 localhost nova_compute[281952]: 2025-11-23 10:01:15.106 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:15 localhost systemd[1]: tmp-crun.KWJPmW.mount: Deactivated successfully. Nov 23 05:01:15 localhost podman[319450]: 2025-11-23 10:01:15.180706404 +0000 UTC m=+0.294535044 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller) Nov 23 05:01:15 localhost podman[319450]: 2025-11-23 10:01:15.213288128 +0000 UTC m=+0.327116728 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 23 05:01:15 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:01:15 localhost podman[319588]: Nov 23 05:01:15 localhost podman[319588]: 2025-11-23 10:01:15.927064926 +0000 UTC m=+0.089579200 container create cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:01:15 localhost systemd[1]: Started libpod-conmon-cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387.scope. Nov 23 05:01:15 localhost podman[319588]: 2025-11-23 10:01:15.886036242 +0000 UTC m=+0.048550516 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:01:15 localhost systemd[1]: Started libcrun container. 
Nov 23 05:01:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6504fdec3be8c610cf72edfd595f3b7128e3fb811123dbbe86b75a09399d63a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:01:16 localhost podman[319588]: 2025-11-23 10:01:16.005433271 +0000 UTC m=+0.167947535 container init cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2) Nov 23 05:01:16 localhost podman[319588]: 2025-11-23 10:01:16.015103678 +0000 UTC m=+0.177617942 container start cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2) Nov 23 05:01:16 localhost dnsmasq[319607]: started, version 2.85 cachesize 150 Nov 23 05:01:16 localhost dnsmasq[319607]: DNS service limited to local subnets Nov 23 05:01:16 localhost dnsmasq[319607]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:01:16 localhost dnsmasq[319607]: warning: no upstream servers 
configured Nov 23 05:01:16 localhost dnsmasq-dhcp[319607]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:01:16 localhost dnsmasq[319607]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:01:16 localhost dnsmasq-dhcp[319607]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:01:16 localhost dnsmasq-dhcp[319607]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:01:16 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:16.226 263258 INFO neutron.agent.dhcp.agent [None req-b633e6aa-3662-43dc-b39e-ecdb0bbf128a - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:01:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:16.344 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2 2001:db8::f816:3eff:feec:8f43'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], 
tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:16.345 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated#033[00m Nov 23 05:01:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:16.349 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 612afa0a-a362-44aa-a6db-9b2f2106f648 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:01:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:16.349 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:16.350 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[373f7654-0d98-41dd-a74b-fcb96c865d9a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 
05:01:16 localhost nova_compute[281952]: 2025-11-23 10:01:16.395 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 05:01:16 localhost nova_compute[281952]: 2025-11-23 10:01:16.419 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 05:01:16 localhost nova_compute[281952]: 2025-11-23 10:01:16.419 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the 
network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 05:01:16 localhost nova_compute[281952]: 2025-11-23 10:01:16.421 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:01:16 localhost nova_compute[281952]: 2025-11-23 10:01:16.421 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:01:16 localhost nova_compute[281952]: 2025-11-23 10:01:16.421 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 05:01:16 localhost nova_compute[281952]: 2025-11-23 10:01:16.455 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:16 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:16.647 2 INFO neutron.agent.securitygroups_rpc [None req-49a4dc10-ae4e-41b1-8135-2f2053e37dc6 a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']#033[00m Nov 23 05:01:16 localhost dnsmasq[319607]: exiting on receipt of SIGTERM Nov 23 05:01:16 localhost podman[319625]: 2025-11-23 10:01:16.685796319 +0000 UTC m=+0.058682538 container kill cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:01:16 localhost systemd[1]: libpod-cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387.scope: Deactivated successfully. Nov 23 05:01:16 localhost podman[319640]: 2025-11-23 10:01:16.75592007 +0000 UTC m=+0.060733893 container died cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:01:16 localhost podman[319640]: 2025-11-23 10:01:16.791177715 +0000 UTC m=+0.095991468 container cleanup cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:01:16 localhost systemd[1]: libpod-conmon-cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387.scope: Deactivated 
successfully. Nov 23 05:01:16 localhost podman[319647]: 2025-11-23 10:01:16.841734312 +0000 UTC m=+0.130099918 container remove cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 23 05:01:16 localhost systemd[1]: var-lib-containers-storage-overlay-d6504fdec3be8c610cf72edfd595f3b7128e3fb811123dbbe86b75a09399d63a-merged.mount: Deactivated successfully. Nov 23 05:01:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cf98874f931761505bf6e5fb8352a161642ec616b5dfebc9e9e5ac64ee8c4387-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:01:17 localhost nova_compute[281952]: 2025-11-23 10:01:17.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:01:17 localhost dnsmasq[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/addn_hosts - 0 addresses Nov 23 05:01:17 localhost podman[319712]: 2025-11-23 10:01:17.567311474 +0000 UTC m=+0.061269668 container kill d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 05:01:17 localhost dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/host Nov 23 05:01:17 localhost dnsmasq-dhcp[319108]: read /var/lib/neutron/dhcp/d0e9752d-2178-4eb0-b091-dd4d434021e5/opts Nov 23 05:01:17 localhost podman[319752]: Nov 23 05:01:17 localhost podman[319752]: 2025-11-23 10:01:17.774011672 +0000 UTC m=+0.093967975 container create 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS) Nov 23 05:01:17 localhost systemd[1]: Started libpod-conmon-92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18.scope. Nov 23 05:01:17 localhost podman[319752]: 2025-11-23 10:01:17.729717867 +0000 UTC m=+0.049674230 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:01:17 localhost systemd[1]: Started libcrun container. Nov 23 05:01:17 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b617da3d100e467919636d843ab26dcfbf14a001e0fc52c63ea40186eb9f6dfb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:01:17 localhost podman[319752]: 2025-11-23 10:01:17.843235025 +0000 UTC m=+0.163191328 container init 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:17 localhost podman[319752]: 2025-11-23 10:01:17.852229002 +0000 UTC m=+0.172185315 container start 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack 
Kubernetes Operator team) Nov 23 05:01:17 localhost dnsmasq[319773]: started, version 2.85 cachesize 150 Nov 23 05:01:17 localhost dnsmasq[319773]: DNS service limited to local subnets Nov 23 05:01:17 localhost dnsmasq[319773]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:01:17 localhost dnsmasq[319773]: warning: no upstream servers configured Nov 23 05:01:17 localhost dnsmasq-dhcp[319773]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:01:17 localhost dnsmasq[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:01:17 localhost dnsmasq-dhcp[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:01:17 localhost dnsmasq-dhcp[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:01:17 localhost systemd[1]: tmp-crun.EQ8Snn.mount: Deactivated successfully. 
Nov 23 05:01:18 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:18.011 263258 INFO neutron.agent.dhcp.agent [None req-e06b79e9-b2de-4d34-b138-d1fabb92b56f - - - - - -] DHCP configuration for ports {'67c1b2ca-3e93-4592-92bd-bb626f12e09a', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:01:18 localhost nova_compute[281952]: 2025-11-23 10:01:18.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:01:18 localhost nova_compute[281952]: 2025-11-23 10:01:18.233 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:01:18 localhost nova_compute[281952]: 2025-11-23 10:01:18.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:01:18 localhost nova_compute[281952]: 2025-11-23 10:01:18.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:01:18 localhost nova_compute[281952]: 2025-11-23 10:01:18.234 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for 
np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 05:01:18 localhost nova_compute[281952]: 2025-11-23 10:01:18.235 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:01:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:01:18 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:18.573 2 INFO neutron.agent.securitygroups_rpc [None req-599670a7-0640-44a2-ad54-406dd4624d40 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:01:19 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:19.159 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:17Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=dbfc41c4-f19b-4c9e-a49b-4b0a906e0b61, ip_allocation=immediate, mac_address=fa:16:3e:ee:36:51, name=tempest-NetworksTestDHCPv6-557607304, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, 
project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=31, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['31b40284-1383-49a1-83f1-1def70e46b7c', '689a83b7-98fd-403b-84c3-bcdcd384cd1f'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:14Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1526, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:18Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:01:19 localhost kernel: device tap3016fb40-93 left promiscuous mode Nov 23 05:01:19 localhost ovn_controller[154788]: 2025-11-23T10:01:19Z|00256|binding|INFO|Releasing lport 3016fb40-93ab-4df3-956c-74e722dc2fa2 from this chassis (sb_readonly=0) Nov 23 05:01:19 localhost ovn_controller[154788]: 2025-11-23T10:01:19Z|00257|binding|INFO|Setting lport 3016fb40-93ab-4df3-956c-74e722dc2fa2 down in Southbound Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.208 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:19.219 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], 
requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-d0e9752d-2178-4eb0-b091-dd4d434021e5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d0e9752d-2178-4eb0-b091-dd4d434021e5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de25a77c-b919-41f6-9a1b-fd3e354e84bf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3016fb40-93ab-4df3-956c-74e722dc2fa2) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:19.221 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 3016fb40-93ab-4df3-956c-74e722dc2fa2 in datapath d0e9752d-2178-4eb0-b091-dd4d434021e5 unbound from our chassis#033[00m Nov 23 05:01:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:19.224 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d0e9752d-2178-4eb0-b091-dd4d434021e5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:19.225 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c27c55d7-e81c-4d35-9805-596848d4b74c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.230 281956 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:19 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:19.269 2 INFO neutron.agent.securitygroups_rpc [None req-7242a629-d88b-4313-9e67-39aa189122ef fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']#033[00m Nov 23 05:01:19 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:19.298 263258 INFO neutron.agent.linux.ip_lib [None req-96ca0c12-1709-4815-824e-eabac43a2935 - - - - - -] Device tap9a92ea95-74 cannot be used as it has no MAC address#033[00m Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.321 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:19 localhost sshd[319816]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:01:19 localhost kernel: device tap9a92ea95-74 entered promiscuous mode Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.327 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:19 localhost NetworkManager[5975]: [1763892079.3284] manager: (tap9a92ea95-74): new Generic device (/org/freedesktop/NetworkManager/Devices/45) Nov 23 05:01:19 localhost ovn_controller[154788]: 2025-11-23T10:01:19Z|00258|binding|INFO|Claiming lport 9a92ea95-742f-47d6-b9a5-b24454278ac2 for this chassis. Nov 23 05:01:19 localhost ovn_controller[154788]: 2025-11-23T10:01:19Z|00259|binding|INFO|9a92ea95-742f-47d6-b9a5-b24454278ac2: Claiming unknown Nov 23 05:01:19 localhost systemd-udevd[319819]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:01:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:19.346 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a180dbb035ce42ac9ec3178829ba27ed', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fbea3ea-671c-4b0c-8df6-a11c66c76ac2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9a92ea95-742f-47d6-b9a5-b24454278ac2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:19.347 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 9a92ea95-742f-47d6-b9a5-b24454278ac2 in datapath a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee bound to our chassis#033[00m Nov 23 05:01:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:19.349 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:01:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:19.349 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4372becd-27b0-4c99-8fd5-fe6dd4aaf531]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:19 localhost journal[230249]: ethtool ioctl error on tap9a92ea95-74: No such device Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.356 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:19 localhost ovn_controller[154788]: 2025-11-23T10:01:19Z|00260|binding|INFO|Setting lport 9a92ea95-742f-47d6-b9a5-b24454278ac2 ovn-installed in OVS Nov 23 05:01:19 localhost ovn_controller[154788]: 2025-11-23T10:01:19Z|00261|binding|INFO|Setting lport 9a92ea95-742f-47d6-b9a5-b24454278ac2 up in Southbound Nov 23 05:01:19 localhost journal[230249]: ethtool ioctl error on tap9a92ea95-74: No such device Nov 23 05:01:19 localhost journal[230249]: ethtool ioctl error on tap9a92ea95-74: No such device Nov 23 05:01:19 localhost journal[230249]: ethtool ioctl error on tap9a92ea95-74: No such device Nov 23 05:01:19 localhost journal[230249]: ethtool ioctl error on tap9a92ea95-74: No such device Nov 23 05:01:19 localhost journal[230249]: ethtool ioctl error on tap9a92ea95-74: No such device Nov 23 05:01:19 localhost journal[230249]: ethtool ioctl error on tap9a92ea95-74: No such device Nov 23 05:01:19 localhost journal[230249]: ethtool ioctl error on tap9a92ea95-74: No such device Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.441 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:19 localhost systemd[1]: tmp-crun.KrFQML.mount: Deactivated successfully. 
Nov 23 05:01:19 localhost podman[319827]: 2025-11-23 10:01:19.460959269 +0000 UTC m=+0.097838095 container kill 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:01:19 localhost dnsmasq[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses Nov 23 05:01:19 localhost dnsmasq-dhcp[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:01:19 localhost dnsmasq-dhcp[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:01:19 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:01:19 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/437465050' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.513 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.278s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.584 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.585 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:01:19 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:19.728 263258 INFO neutron.agent.dhcp.agent [None req-484ed451-1cab-473f-801c-5e2b0b9ad6c2 - - - - - -] DHCP configuration for ports {'dbfc41c4-f19b-4c9e-a49b-4b0a906e0b61'} is completed#033[00m Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.797 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.799 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11199MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.800 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.800 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.894 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.895 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.895 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 05:01:19 localhost nova_compute[281952]: 2025-11-23 10:01:19.937 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:01:20 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:20.239 2 INFO neutron.agent.securitygroups_rpc [None req-55e03d23-65b9-4d8a-853a-a5da3e2be7a3 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:01:20 localhost podman[319940]: Nov 23 05:01:20 localhost podman[319940]: 2025-11-23 10:01:20.254767193 +0000 UTC m=+0.082527824 container create 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 05:01:20 localhost systemd[1]: Started libpod-conmon-18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04.scope. Nov 23 05:01:20 localhost podman[319940]: 2025-11-23 10:01:20.210882351 +0000 UTC m=+0.038643022 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:01:20 localhost systemd[1]: Started libcrun container. Nov 23 05:01:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82de1d62aafaefd90894da078f8a1a469742bc8b5c3eb4a9608ec437320d7a2a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:01:20 localhost podman[319940]: 2025-11-23 10:01:20.33192916 +0000 UTC m=+0.159689801 container init 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118) Nov 23 05:01:20 localhost podman[319940]: 2025-11-23 10:01:20.352099511 +0000 UTC m=+0.179860142 container start 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 23 05:01:20 localhost dnsmasq[319974]: started, version 2.85 cachesize 150 Nov 23 05:01:20 localhost dnsmasq[319974]: DNS service limited to local subnets Nov 23 05:01:20 localhost dnsmasq[319974]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:01:20 localhost dnsmasq[319974]: warning: no upstream servers configured Nov 23 05:01:20 localhost dnsmasq-dhcp[319974]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:01:20 localhost dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 0 addresses Nov 23 05:01:20 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host Nov 23 05:01:20 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts Nov 23 05:01:20 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:01:20 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/3857005358' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:01:20 localhost nova_compute[281952]: 2025-11-23 10:01:20.420 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:20 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:20.421 263258 INFO neutron.agent.dhcp.agent [None req-96ca0c12-1709-4815-824e-eabac43a2935 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:18Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=dcca1c20-9ecb-4374-ad1b-f605ac775db9, ip_allocation=immediate, mac_address=fa:16:3e:fc:db:47, name=tempest-AllowedAddressPairIpV6TestJSON-769159927, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:16Z, description=, dns_domain=, id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1503684566, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17000, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1522, status=ACTIVE, subnets=['459378dd-aa61-4895-adc5-f4adec26a6d8'], tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:17Z, vlan_transparent=None, network_id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, 
security_groups=['ec43b846-c0b6-48e2-bcdb-df3dfa286247'], standard_attr_id=1528, status=DOWN, tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:18Z on network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee#033[00m Nov 23 05:01:20 localhost nova_compute[281952]: 2025-11-23 10:01:20.434 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:01:20 localhost nova_compute[281952]: 2025-11-23 10:01:20.439 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 05:01:20 localhost nova_compute[281952]: 2025-11-23 10:01:20.456 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 05:01:20 localhost nova_compute[281952]: 2025-11-23 10:01:20.458 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 05:01:20 localhost nova_compute[281952]: 2025-11-23 10:01:20.458 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:01:20 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:20.473 263258 INFO neutron.agent.dhcp.agent [None req-2377f124-9d87-41f7-9796-6138ac709a7b - - - - - -] DHCP configuration for ports {'c2a69d57-5768-4e92-a190-af21210eb643'} is completed#033[00m Nov 23 05:01:20 localhost dnsmasq[319108]: exiting on receipt of SIGTERM Nov 23 05:01:20 localhost podman[319992]: 2025-11-23 10:01:20.477970908 +0000 UTC m=+0.037611279 container kill d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:01:20 localhost systemd[1]: libpod-d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186.scope: Deactivated successfully. 
Nov 23 05:01:20 localhost podman[320020]: 2025-11-23 10:01:20.549337037 +0000 UTC m=+0.055390237 container died d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:01:20 localhost podman[320020]: 2025-11-23 10:01:20.580150987 +0000 UTC m=+0.086204157 container cleanup d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 05:01:20 localhost systemd[1]: libpod-conmon-d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186.scope: Deactivated successfully. 
Nov 23 05:01:20 localhost podman[320021]: 2025-11-23 10:01:20.621708717 +0000 UTC m=+0.126407055 container remove d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d0e9752d-2178-4eb0-b091-dd4d434021e5, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 05:01:20 localhost dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 1 addresses Nov 23 05:01:20 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host Nov 23 05:01:20 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts Nov 23 05:01:20 localhost podman[320054]: 2025-11-23 10:01:20.647749219 +0000 UTC m=+0.099152485 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:20 localhost dnsmasq[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:01:20 localhost dnsmasq-dhcp[319773]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:01:20 localhost dnsmasq-dhcp[319773]: read 
/var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:01:20 localhost podman[320015]: 2025-11-23 10:01:20.661619876 +0000 UTC m=+0.172254598 container kill 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 05:01:20 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:20.849 263258 INFO neutron.agent.dhcp.agent [None req-4770a38f-60f3-41a2-9dff-e09784566601 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:20 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:20.915 263258 INFO neutron.agent.dhcp.agent [None req-ac4d95ed-da65-419c-993b-bab8231fff13 - - - - - -] DHCP configuration for ports {'dcca1c20-9ecb-4374-ad1b-f605ac775db9'} is completed#033[00m Nov 23 05:01:20 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:20.961 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:21 localhost systemd[1]: tmp-crun.yICxm0.mount: Deactivated successfully. Nov 23 05:01:21 localhost systemd[1]: var-lib-containers-storage-overlay-b893594a3b7f21ead5de7308fbf557090d81368903c395e5edbb120a0248d337-merged.mount: Deactivated successfully. Nov 23 05:01:21 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d5ac007852c109ffc5f7922587f0ed4f5f201aab7b823bca1278e0d72c3f7186-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:01:21 localhost systemd[1]: run-netns-qdhcp\x2dd0e9752d\x2d2178\x2d4eb0\x2db091\x2ddd4d434021e5.mount: Deactivated successfully. Nov 23 05:01:21 localhost nova_compute[281952]: 2025-11-23 10:01:21.458 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:21 localhost nova_compute[281952]: 2025-11-23 10:01:21.461 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:01:21 localhost nova_compute[281952]: 2025-11-23 10:01:21.462 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:21 localhost nova_compute[281952]: 2025-11-23 10:01:21.462 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:01:21 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:21.886 2 INFO neutron.agent.securitygroups_rpc [None req-da0cfd7f-9c49-4dab-bd78-effdeef69255 fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']#033[00m Nov 23 05:01:21 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:21.993 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:20Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], 
fixed_ips=[], id=be42f629-f772-4cb8-bcb6-19f1e43b36b3, ip_allocation=immediate, mac_address=fa:16:3e:7c:74:ec, name=tempest-AllowedAddressPairIpV6TestJSON-1721931869, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:16Z, description=, dns_domain=, id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1503684566, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17000, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1522, status=ACTIVE, subnets=['459378dd-aa61-4895-adc5-f4adec26a6d8'], tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:17Z, vlan_transparent=None, network_id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec43b846-c0b6-48e2-bcdb-df3dfa286247'], standard_attr_id=1536, status=DOWN, tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:20Z on network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee#033[00m Nov 23 05:01:22 localhost dnsmasq[319773]: exiting on receipt of SIGTERM Nov 23 05:01:22 localhost podman[320111]: 2025-11-23 10:01:22.055821935 +0000 UTC m=+0.064164878 container kill 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes 
Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:01:22 localhost systemd[1]: libpod-92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18.scope: Deactivated successfully. Nov 23 05:01:22 localhost podman[320135]: 2025-11-23 10:01:22.131981011 +0000 UTC m=+0.064919671 container died 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:22 localhost podman[320135]: 2025-11-23 10:01:22.167093672 +0000 UTC m=+0.100032312 container cleanup 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 23 05:01:22 localhost systemd[1]: libpod-conmon-92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18.scope: Deactivated successfully. 
Nov 23 05:01:22 localhost podman[320142]: 2025-11-23 10:01:22.226188343 +0000 UTC m=+0.148457275 container remove 92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:22 localhost systemd[1]: tmp-crun.jqERwL.mount: Deactivated successfully. Nov 23 05:01:22 localhost systemd[1]: var-lib-containers-storage-overlay-b617da3d100e467919636d843ab26dcfbf14a001e0fc52c63ea40186eb9f6dfb-merged.mount: Deactivated successfully. Nov 23 05:01:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-92c9bc4984662b410757b3bf20d0fa406d841579e0b89c958afecaaebba05c18-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:01:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:22.268 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:22 localhost dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 2 addresses Nov 23 05:01:22 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host Nov 23 05:01:22 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts Nov 23 05:01:22 localhost podman[320171]: 2025-11-23 10:01:22.317832546 +0000 UTC m=+0.061916048 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:22.589 263258 INFO neutron.agent.dhcp.agent [None req-28a6886c-072c-4559-95cd-95b2843586c3 - - - - - -] DHCP configuration for ports {'be42f629-f772-4cb8-bcb6-19f1e43b36b3'} is completed#033[00m Nov 23 05:01:23 localhost podman[320238]: Nov 23 05:01:23 localhost podman[320238]: 2025-11-23 10:01:23.112209248 +0000 UTC m=+0.108938707 container create 2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:01:23 localhost systemd[1]: Started libpod-conmon-2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d.scope. Nov 23 05:01:23 localhost podman[320238]: 2025-11-23 10:01:23.057092279 +0000 UTC m=+0.053821818 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:01:23 localhost systemd[1]: Started libcrun container. Nov 23 05:01:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f0a8345e1c8f1f0588811cc6ec76af6b50a76c7923797df817a5dc7f62282f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:01:23 localhost podman[320238]: 2025-11-23 10:01:23.197155634 +0000 UTC m=+0.193885103 container init 2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 23 05:01:23 localhost podman[320238]: 2025-11-23 10:01:23.20613823 +0000 UTC m=+0.202867689 container start 2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 05:01:23 localhost dnsmasq[320256]: started, version 2.85 cachesize 150 Nov 23 05:01:23 localhost dnsmasq[320256]: DNS service limited to local subnets Nov 23 05:01:23 localhost dnsmasq[320256]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:01:23 localhost dnsmasq[320256]: warning: no upstream servers configured Nov 23 05:01:23 localhost dnsmasq[320256]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:01:23 localhost sshd[320257]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:01:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:01:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:23.462 263258 INFO neutron.agent.dhcp.agent [None req-bf3bb32d-907a-4bf1-b4ff-6b4ebb008875 - - - - - -] DHCP configuration for ports {'67c1b2ca-3e93-4592-92bd-bb626f12e09a', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:01:23 localhost ovn_controller[154788]: 2025-11-23T10:01:23Z|00262|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:01:23 localhost nova_compute[281952]: 2025-11-23 10:01:23.561 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0. 
Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.626002) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37 Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892083626032, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 2550, "num_deletes": 259, "total_data_size": 3401241, "memory_usage": 3459920, "flush_reason": "Manual Compaction"} Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892083637805, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 2197040, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21220, "largest_seqno": 23765, "table_properties": {"data_size": 2187848, "index_size": 5697, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 20399, "raw_average_key_size": 21, "raw_value_size": 2168831, "raw_average_value_size": 2254, "num_data_blocks": 249, "num_entries": 962, "num_filter_entries": 962, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891914, "oldest_key_time": 1763891914, "file_creation_time": 1763892083, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}} Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 11852 microseconds, and 5590 cpu microseconds. Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.637850) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 2197040 bytes OK Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.637873) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.640426) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.640448) EVENT_LOG_v1 {"time_micros": 1763892083640442, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.640468) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 3389840, prev total WAL file 
size 3389840, number of live WAL files 2. Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.641249) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. '7061786F73003132333030' seq:0, type:0; will stop at (end) Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(2145KB)], [36(16MB)] Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892083641285, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 19079701, "oldest_snapshot_seqno": -1} Nov 23 05:01:23 localhost systemd[1]: tmp-crun.QIEk7H.mount: Deactivated successfully. 
Nov 23 05:01:23 localhost podman[320275]: 2025-11-23 10:01:23.658864228 +0000 UTC m=+0.059280307 container kill 2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:23 localhost dnsmasq[320256]: exiting on receipt of SIGTERM Nov 23 05:01:23 localhost systemd[1]: libpod-2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d.scope: Deactivated successfully. Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 12593 keys, 16078574 bytes, temperature: kUnknown Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892083714733, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 16078574, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16006852, "index_size": 39173, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31493, "raw_key_size": 338609, "raw_average_key_size": 26, "raw_value_size": 15792162, "raw_average_value_size": 1254, "num_data_blocks": 1477, "num_entries": 12593, "num_filter_entries": 12593, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", 
"merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892083, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}} Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.715250) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 16078574 bytes Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.717151) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 258.7 rd, 218.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 16.1 +0.0 blob) out(15.3 +0.0 blob), read-write-amplify(16.0) write-amplify(7.3) OK, records in: 13126, records dropped: 533 output_compression: NoCompression Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.717180) EVENT_LOG_v1 {"time_micros": 1763892083717168, "job": 20, "event": "compaction_finished", "compaction_time_micros": 73762, "compaction_time_cpu_micros": 43622, "output_level": 6, "num_output_files": 1, "total_output_size": 16078574, "num_input_records": 13126, "num_output_records": 12593, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, 
"num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892083718094, "job": 20, "event": "table_file_deletion", "file_number": 38} Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892083720584, "job": 20, "event": "table_file_deletion", "file_number": 36} Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.641189) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.720781) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.720788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.720791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.720794) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:01:23 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:23.720798) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:01:23 localhost 
podman[320289]: 2025-11-23 10:01:23.728072769 +0000 UTC m=+0.053447317 container died 2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 05:01:23 localhost podman[320289]: 2025-11-23 10:01:23.759619332 +0000 UTC m=+0.084993840 container cleanup 2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 23 05:01:23 localhost systemd[1]: libpod-conmon-2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d.scope: Deactivated successfully. 
Nov 23 05:01:23 localhost podman[320291]: 2025-11-23 10:01:23.794239378 +0000 UTC m=+0.115174749 container remove 2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:01:24 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:24.130 2 INFO neutron.agent.securitygroups_rpc [None req-cef856d0-bc0f-421d-b6e3-a61c28d38f99 fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']#033[00m Nov 23 05:01:24 localhost systemd[1]: var-lib-containers-storage-overlay-5f0a8345e1c8f1f0588811cc6ec76af6b50a76c7923797df817a5dc7f62282f2-merged.mount: Deactivated successfully. Nov 23 05:01:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2b9733ed22fc9f2f1675eb5698d59614221359c163e853b061fa26768d9ddd8d-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:01:24 localhost nova_compute[281952]: 2025-11-23 10:01:24.336 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:24 localhost dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 1 addresses Nov 23 05:01:24 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host Nov 23 05:01:24 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts Nov 23 05:01:24 localhost podman[320334]: 2025-11-23 10:01:24.377012961 +0000 UTC m=+0.061127074 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 23 05:01:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:01:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:01:24 localhost podman[320347]: 2025-11-23 10:01:24.497233584 +0000 UTC m=+0.092496580 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:01:24 localhost podman[320347]: 2025-11-23 10:01:24.539351322 +0000 UTC m=+0.134614308 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:01:24 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 05:01:24 localhost podman[320348]: 2025-11-23 10:01:24.560810323 +0000 UTC m=+0.153964684 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 05:01:24 localhost podman[320348]: 2025-11-23 10:01:24.568629134 +0000 UTC m=+0.161783505 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 05:01:24 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 05:01:25 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:25.113 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2 2001:db8::f816:3eff:feec:8f43'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:25 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:25.114 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated#033[00m Nov 23 05:01:25 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:25.118 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 612afa0a-a362-44aa-a6db-9b2f2106f648 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:01:25 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:25.118 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:25 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:25.119 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[842da903-5e54-4a7a-9e4d-ca6032751f25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:25 localhost nova_compute[281952]: 2025-11-23 10:01:25.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:01:25 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:25.880 2 INFO neutron.agent.securitygroups_rpc [None req-2cdcfc76-bcd0-4cc8-b710-4dffd4e83ae1 fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']#033[00m Nov 23 05:01:26 localhost neutron_dhcp_agent[263254]: 2025-11-23 
10:01:26.002 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:24Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=44238cca-3aef-4386-9b52-e2dbbf93973c, ip_allocation=immediate, mac_address=fa:16:3e:47:a8:3d, name=tempest-AllowedAddressPairIpV6TestJSON-851427365, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:16Z, description=, dns_domain=, id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1503684566, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17000, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1522, status=ACTIVE, subnets=['459378dd-aa61-4895-adc5-f4adec26a6d8'], tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:17Z, vlan_transparent=None, network_id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec43b846-c0b6-48e2-bcdb-df3dfa286247'], standard_attr_id=1551, status=DOWN, tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:25Z on network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee#033[00m Nov 23 05:01:26 localhost dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 2 addresses Nov 23 05:01:26 localhost dnsmasq-dhcp[319974]: read 
/var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host Nov 23 05:01:26 localhost podman[320439]: 2025-11-23 10:01:26.198578485 +0000 UTC m=+0.055479369 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:01:26 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts Nov 23 05:01:26 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:26.427 263258 INFO neutron.agent.dhcp.agent [None req-18f24ecb-a96d-474b-b9ea-376300dd66ae - - - - - -] DHCP configuration for ports {'44238cca-3aef-4386-9b52-e2dbbf93973c'} is completed#033[00m Nov 23 05:01:26 localhost nova_compute[281952]: 2025-11-23 10:01:26.490 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:26 localhost podman[320485]: Nov 23 05:01:26 localhost podman[320485]: 2025-11-23 10:01:26.680186251 +0000 UTC m=+0.077590460 container create cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:01:26 localhost systemd[1]: Started libpod-conmon-cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced.scope. Nov 23 05:01:26 localhost systemd[1]: tmp-crun.3xnVU3.mount: Deactivated successfully. Nov 23 05:01:26 localhost systemd[1]: Started libcrun container. Nov 23 05:01:26 localhost podman[320485]: 2025-11-23 10:01:26.635299948 +0000 UTC m=+0.032704207 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:01:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/501fc31257ee79b0f2a66e32bf5adfac954ccbcbdac3591e5a80dde2af71d298/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:01:26 localhost podman[320485]: 2025-11-23 10:01:26.748060923 +0000 UTC m=+0.145465132 container init cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:01:26 localhost podman[320485]: 2025-11-23 10:01:26.75706877 +0000 UTC m=+0.154472979 container start cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 23 05:01:26 localhost dnsmasq[320503]: started, version 2.85 cachesize 150 Nov 23 05:01:26 localhost dnsmasq[320503]: DNS service limited to local subnets Nov 23 05:01:26 localhost dnsmasq[320503]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:01:26 localhost dnsmasq[320503]: warning: no upstream servers configured Nov 23 05:01:26 localhost dnsmasq-dhcp[320503]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:01:26 localhost dnsmasq[320503]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:01:26 localhost dnsmasq-dhcp[320503]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:01:26 localhost dnsmasq-dhcp[320503]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:01:26 localhost ovn_controller[154788]: 2025-11-23T10:01:26Z|00263|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:01:26 localhost nova_compute[281952]: 2025-11-23 10:01:26.930 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:27 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:27.059 263258 INFO neutron.agent.dhcp.agent [None req-3f8f8920-baf4-42cc-ada8-a3603bb1b297 - - - - - -] DHCP configuration for ports {'67c1b2ca-3e93-4592-92bd-bb626f12e09a', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:01:28 localhost podman[320520]: 2025-11-23 10:01:28.256010566 +0000 UTC m=+0.060011750 container kill c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:01:28 localhost dnsmasq[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/addn_hosts - 0 addresses Nov 23 05:01:28 localhost dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/host Nov 23 05:01:28 localhost dnsmasq-dhcp[317960]: read /var/lib/neutron/dhcp/bcc66174-371f-4faf-83f1-5de56d4886ad/opts Nov 23 05:01:28 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:28.343 2 INFO neutron.agent.securitygroups_rpc [None req-ac3982b0-5538-471d-bc5b-e6d4442ddedd fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']#033[00m Nov 23 05:01:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:01:28 localhost nova_compute[281952]: 2025-11-23 10:01:28.482 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:28 localhost kernel: device tapb74e35ad-94 left promiscuous mode Nov 23 05:01:28 localhost ovn_controller[154788]: 2025-11-23T10:01:28Z|00264|binding|INFO|Releasing lport b74e35ad-94a2-4d4d-af80-3b2024099e6d from this chassis (sb_readonly=0) Nov 23 05:01:28 localhost ovn_controller[154788]: 2025-11-23T10:01:28Z|00265|binding|INFO|Setting lport b74e35ad-94a2-4d4d-af80-3b2024099e6d down in Southbound Nov 23 05:01:28 localhost nova_compute[281952]: 2025-11-23 10:01:28.498 281956 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:28 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:28.501 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-bcc66174-371f-4faf-83f1-5de56d4886ad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bcc66174-371f-4faf-83f1-5de56d4886ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a45cfb38-a270-4929-a93b-8d89273d60d1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b74e35ad-94a2-4d4d-af80-3b2024099e6d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:28 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:28.502 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b74e35ad-94a2-4d4d-af80-3b2024099e6d in datapath bcc66174-371f-4faf-83f1-5de56d4886ad unbound from our chassis#033[00m Nov 23 05:01:28 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:28.504 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No 
valid VIF ports were found for network bcc66174-371f-4faf-83f1-5de56d4886ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:28 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:28.504 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[de1459e9-a989-477d-bcc2-704bacad5eb8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:28 localhost dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 1 addresses Nov 23 05:01:28 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host Nov 23 05:01:28 localhost podman[320558]: 2025-11-23 10:01:28.559536506 +0000 UTC m=+0.038474337 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:01:28 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts Nov 23 05:01:29 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:29.655 2 INFO neutron.agent.securitygroups_rpc [None req-efb808b5-6542-451b-803e-d2906345bde2 fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']#033[00m Nov 23 05:01:29 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:29.691 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port 
admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:28Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1715b7fd-d248-410f-8cd9-5fcf625fd736, ip_allocation=immediate, mac_address=fa:16:3e:f8:04:63, name=tempest-AllowedAddressPairIpV6TestJSON-790328342, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:16Z, description=, dns_domain=, id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1503684566, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17000, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1522, status=ACTIVE, subnets=['459378dd-aa61-4895-adc5-f4adec26a6d8'], tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:17Z, vlan_transparent=None, network_id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec43b846-c0b6-48e2-bcdb-df3dfa286247'], standard_attr_id=1554, status=DOWN, tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:29Z on network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee#033[00m Nov 23 05:01:29 localhost dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 2 addresses Nov 23 05:01:29 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host Nov 23 05:01:29 localhost podman[320610]: 2025-11-23 10:01:29.890359992 
+0000 UTC m=+0.060041480 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 05:01:29 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts Nov 23 05:01:29 localhost podman[320623]: 2025-11-23 10:01:29.948414571 +0000 UTC m=+0.065855170 container kill c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS) Nov 23 05:01:29 localhost dnsmasq[317960]: exiting on receipt of SIGTERM Nov 23 05:01:29 localhost systemd[1]: libpod-c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736.scope: Deactivated successfully. 
Nov 23 05:01:29 localhost openstack_network_exporter[242668]: ERROR 10:01:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:01:29 localhost openstack_network_exporter[242668]: ERROR 10:01:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:01:29 localhost openstack_network_exporter[242668]: ERROR 10:01:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:01:29 localhost openstack_network_exporter[242668]: ERROR 10:01:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:01:29 localhost openstack_network_exporter[242668]: Nov 23 05:01:29 localhost openstack_network_exporter[242668]: ERROR 10:01:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:01:29 localhost openstack_network_exporter[242668]: Nov 23 05:01:30 localhost podman[320643]: 2025-11-23 10:01:30.036016039 +0000 UTC m=+0.071029828 container died c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:01:30 localhost podman[320643]: 2025-11-23 10:01:30.066435087 +0000 UTC m=+0.101448786 container cleanup c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 05:01:30 localhost systemd[1]: libpod-conmon-c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736.scope: Deactivated successfully. Nov 23 05:01:30 localhost podman[320645]: 2025-11-23 10:01:30.113204358 +0000 UTC m=+0.139539720 container remove c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bcc66174-371f-4faf-83f1-5de56d4886ad, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:30.179 263258 INFO neutron.agent.dhcp.agent [None req-2fbb10bd-f0f6-4cc9-8bea-d1d6b8f50c07 - - - - - -] DHCP configuration for ports {'1715b7fd-d248-410f-8cd9-5fcf625fd736'} is completed#033[00m Nov 23 05:01:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:30.623 263258 INFO neutron.agent.dhcp.agent [None req-605c187b-5ced-4280-a2c8-755324b78819 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:30 localhost systemd[1]: var-lib-containers-storage-overlay-c21502b2347b802ca27076444fd406a39385c92b799f42a0977ebddef8e75100-merged.mount: Deactivated successfully. 
Nov 23 05:01:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c03ec0fae6314d856ac61576f4c23bcc238741cf5a33bf9cf2ce53b43c5ee736-userdata-shm.mount: Deactivated successfully. Nov 23 05:01:30 localhost systemd[1]: run-netns-qdhcp\x2dbcc66174\x2d371f\x2d4faf\x2d83f1\x2d5de56d4886ad.mount: Deactivated successfully. Nov 23 05:01:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:30.971 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:31 localhost nova_compute[281952]: 2025-11-23 10:01:31.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:31 localhost nova_compute[281952]: 2025-11-23 10:01:31.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:32.908 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:01:33 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:33.734 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2 2001:db8::f816:3eff:feec:8f43'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 
'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:33 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:33.736 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated#033[00m Nov 23 05:01:33 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:33.739 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 612afa0a-a362-44aa-a6db-9b2f2106f648 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m 
Nov 23 05:01:33 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:33.739 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:33 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:33.740 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[89a9b992-5bc5-41f7-b90e-56d291534c60]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:34 localhost ovn_controller[154788]: 2025-11-23T10:01:34Z|00266|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:01:34 localhost nova_compute[281952]: 2025-11-23 10:01:34.278 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:34 localhost podman[320690]: 2025-11-23 10:01:34.782033814 +0000 UTC m=+0.043110439 container kill cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 23 05:01:34 localhost dnsmasq[320503]: exiting on receipt of SIGTERM Nov 23 05:01:34 localhost systemd[1]: tmp-crun.qXnmFL.mount: Deactivated successfully. Nov 23 05:01:34 localhost systemd[1]: libpod-cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced.scope: Deactivated successfully. 
Nov 23 05:01:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:01:34 localhost podman[320703]: 2025-11-23 10:01:34.827289058 +0000 UTC m=+0.035966699 container died cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 23 05:01:34 localhost systemd[1]: tmp-crun.PVJ37A.mount: Deactivated successfully. Nov 23 05:01:34 localhost podman[320703]: 2025-11-23 10:01:34.8624354 +0000 UTC m=+0.071113021 container cleanup cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 05:01:34 localhost systemd[1]: libpod-conmon-cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced.scope: Deactivated successfully. 
Nov 23 05:01:34 localhost podman[320707]: 2025-11-23 10:01:34.930021722 +0000 UTC m=+0.127275641 container remove cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:01:34 localhost podman[320711]: 2025-11-23 10:01:34.905174907 +0000 UTC m=+0.093397128 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 23 05:01:34 localhost podman[320711]: 2025-11-23 10:01:34.986470551 +0000 UTC m=+0.174692702 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 05:01:35 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 05:01:35 localhost dnsmasq[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/addn_hosts - 0 addresses Nov 23 05:01:35 localhost dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/host Nov 23 05:01:35 localhost podman[320774]: 2025-11-23 10:01:35.07698113 +0000 UTC m=+0.043996417 container kill b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true) Nov 23 05:01:35 localhost dnsmasq-dhcp[317068]: read /var/lib/neutron/dhcp/30192eb7-6210-4b4d-956f-dbc64d7c0b7c/opts Nov 23 05:01:35 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:35.211 2 INFO neutron.agent.securitygroups_rpc [None req-9e0f6898-e450-4b10-a9b6-2526799f670d fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']#033[00m Nov 23 05:01:35 localhost ovn_controller[154788]: 2025-11-23T10:01:35Z|00267|binding|INFO|Releasing lport 
10e1f965-5681-42dc-916c-83e697ea474c from this chassis (sb_readonly=0) Nov 23 05:01:35 localhost ovn_controller[154788]: 2025-11-23T10:01:35Z|00268|binding|INFO|Setting lport 10e1f965-5681-42dc-916c-83e697ea474c down in Southbound Nov 23 05:01:35 localhost nova_compute[281952]: 2025-11-23 10:01:35.283 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:35 localhost kernel: device tap10e1f965-56 left promiscuous mode Nov 23 05:01:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:35.293 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-30192eb7-6210-4b4d-956f-dbc64d7c0b7c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30192eb7-6210-4b4d-956f-dbc64d7c0b7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79509bc833494f3598e01347dc55dea9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10754024-8e92-4669-8b3d-be0210470d0a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=10e1f965-5681-42dc-916c-83e697ea474c) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:35.296 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 10e1f965-5681-42dc-916c-83e697ea474c in datapath 30192eb7-6210-4b4d-956f-dbc64d7c0b7c unbound from our chassis#033[00m Nov 23 05:01:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:35.299 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 30192eb7-6210-4b4d-956f-dbc64d7c0b7c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:35 localhost nova_compute[281952]: 2025-11-23 10:01:35.299 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:35.300 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[3abb7f62-d340-40f7-a26f-46b621d0b5a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:35 localhost dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 1 addresses Nov 23 05:01:35 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host Nov 23 05:01:35 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts Nov 23 05:01:35 localhost podman[320835]: 2025-11-23 10:01:35.493421088 +0000 UTC m=+0.054091997 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:01:35 localhost dnsmasq[317068]: exiting on receipt of SIGTERM Nov 23 05:01:35 localhost podman[320873]: 2025-11-23 10:01:35.668793771 +0000 UTC m=+0.076254400 container kill b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 05:01:35 localhost systemd[1]: libpod-b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f.scope: Deactivated successfully. Nov 23 05:01:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 05:01:35 localhost podman[320891]: 2025-11-23 10:01:35.714535789 +0000 UTC m=+0.038467965 container died b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:35 localhost systemd[1]: var-lib-containers-storage-overlay-501fc31257ee79b0f2a66e32bf5adfac954ccbcbdac3591e5a80dde2af71d298-merged.mount: Deactivated successfully. Nov 23 05:01:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cea2d088c8c42fc02ee715bf72f7267671edf34971a3da01df3e4b850972bced-userdata-shm.mount: Deactivated successfully. Nov 23 05:01:35 localhost systemd[1]: var-lib-containers-storage-overlay-0701b612469235a6b7433f4e52ad0b8dcaf36306964dddada708c650d1295ed6-merged.mount: Deactivated successfully. Nov 23 05:01:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:01:35 localhost podman[320899]: 2025-11-23 10:01:35.78402071 +0000 UTC m=+0.090939602 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 05:01:35 localhost podman[320891]: 2025-11-23 10:01:35.798169026 +0000 UTC m=+0.122101172 container cleanup b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 05:01:35 localhost systemd[1]: libpod-conmon-b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f.scope: Deactivated successfully. 
Nov 23 05:01:35 localhost podman[320898]: 2025-11-23 10:01:35.82849339 +0000 UTC m=+0.132941457 container remove b814926c88b9663ec6410a4e3cea7c6215f0eee0f3002e6c1a96e422d717404f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30192eb7-6210-4b4d-956f-dbc64d7c0b7c, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:01:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:35.855 263258 INFO neutron.agent.dhcp.agent [None req-348dfbab-34a2-4d1b-bc09-7758c516221a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:35.855 263258 INFO neutron.agent.dhcp.agent [None req-348dfbab-34a2-4d1b-bc09-7758c516221a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:35 localhost systemd[1]: run-netns-qdhcp\x2d30192eb7\x2d6210\x2d4b4d\x2d956f\x2ddbc64d7c0b7c.mount: Deactivated successfully. 
Nov 23 05:01:35 localhost podman[320899]: 2025-11-23 10:01:35.913755796 +0000 UTC m=+0.220674668 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 23 05:01:35 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 05:01:35 localhost podman[320968]:
Nov 23 05:01:35 localhost podman[320968]: 2025-11-23 10:01:35.974551169 +0000 UTC m=+0.079536290 container create dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:01:36 localhost systemd[1]: Started libpod-conmon-dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29.scope.
Nov 23 05:01:36 localhost ovn_controller[154788]: 2025-11-23T10:01:36Z|00269|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 05:01:36 localhost systemd[1]: Started libcrun container.
Nov 23 05:01:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ee8801ca6f5ed344b19c84012e21d39d0de6ed956074fb53f69f665de5d851f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 05:01:36 localhost podman[320968]: 2025-11-23 10:01:35.930995348 +0000 UTC m=+0.035980549 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 05:01:36 localhost podman[320968]: 2025-11-23 10:01:36.041055638 +0000 UTC m=+0.146040799 container init dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 23 05:01:36 localhost nova_compute[281952]: 2025-11-23 10:01:36.050 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:01:36 localhost podman[320968]: 2025-11-23 10:01:36.052046947 +0000 UTC m=+0.157032108 container start dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 05:01:36 localhost dnsmasq[320986]: started, version 2.85 cachesize 150
Nov 23 05:01:36 localhost dnsmasq[320986]: DNS service limited to local subnets
Nov 23 05:01:36 localhost dnsmasq[320986]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 05:01:36 localhost dnsmasq[320986]: warning: no upstream servers configured
Nov 23 05:01:36 localhost dnsmasq-dhcp[320986]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 05:01:36 localhost dnsmasq-dhcp[320986]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 05:01:36 localhost dnsmasq[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 05:01:36 localhost dnsmasq-dhcp[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 05:01:36 localhost dnsmasq-dhcp[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 05:01:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:36.225 263258 INFO neutron.agent.dhcp.agent [None req-f5f3ae7d-34d9-45de-a6b8-c1d05bd5e6ed - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5', '67c1b2ca-3e93-4592-92bd-bb626f12e09a'} is completed#033[00m
Nov 23 05:01:36 localhost nova_compute[281952]: 2025-11-23 10:01:36.535 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:01:37 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:37.252 2 INFO neutron.agent.securitygroups_rpc [None req-0d1bae38-c3b0-4ed4-ba2a-9b0a25fae4ab 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m
Nov 23 05:01:37 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:37.379 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:35Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=3ba41a4f-9ca4-4e43-9003-2afe15de07b2, ip_allocation=immediate, mac_address=fa:16:3e:24:37:ae, name=tempest-NetworksTestDHCPv6-1057999249, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=35, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['029dda56-7cc4-415f-aff5-e1cbc56c7934', 'c6230737-2289-4ccb-b4a3-ea3c53474e91'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:26Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1567, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:36Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m
Nov 23 05:01:37 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:37.500 2 INFO neutron.agent.securitygroups_rpc [None req-b60c8ac0-a755-4981-a84e-d95a726ca718 fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']#033[00m
Nov 23 05:01:37 localhost systemd[1]: tmp-crun.4CSBfO.mount: Deactivated successfully.
Nov 23 05:01:37 localhost dnsmasq[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses
Nov 23 05:01:37 localhost dnsmasq-dhcp[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 05:01:37 localhost dnsmasq-dhcp[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 05:01:37 localhost podman[321058]: 2025-11-23 10:01:37.63565101 +0000 UTC m=+0.048468654 container kill dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 05:01:37 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:37.899 263258 INFO neutron.agent.dhcp.agent [None req-d63b3733-4f6c-4711-a9e5-a7ae0e7a3cf0 - - - - - -] DHCP configuration for ports {'3ba41a4f-9ca4-4e43-9003-2afe15de07b2'} is completed#033[00m
Nov 23 05:01:38 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:38.016 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:36Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=27d54986-817e-4c51-af4c-54db06d3c104, ip_allocation=immediate, mac_address=fa:16:3e:bf:82:90, name=tempest-AllowedAddressPairIpV6TestJSON-526636575, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:16Z, description=, dns_domain=, id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1503684566, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17000, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1522, status=ACTIVE, subnets=['459378dd-aa61-4895-adc5-f4adec26a6d8'], tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:17Z, vlan_transparent=None, network_id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec43b846-c0b6-48e2-bcdb-df3dfa286247'], standard_attr_id=1569, status=DOWN, tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:36Z on network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee#033[00m
Nov 23 05:01:38 localhost dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 2 addresses
Nov 23 05:01:38 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 05:01:38 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 05:01:38 localhost podman[321131]: 2025-11-23 10:01:38.226856652 +0000 UTC m=+0.074602370 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 05:01:38 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 05:01:38 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:38.471 263258 INFO neutron.agent.dhcp.agent [None req-fd797fa5-df38-4f00-b386-6dbfc563214b - - - - - -] DHCP configuration for ports {'27d54986-817e-4c51-af4c-54db06d3c104'} is completed#033[00m
Nov 23 05:01:38 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 05:01:38 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 05:01:38 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 05:01:38 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:38.895 2 INFO neutron.agent.securitygroups_rpc [None req-e4114a4e-94f5-47df-bc1b-6dc35d238db6 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m
Nov 23 05:01:39 localhost systemd[1]: tmp-crun.myRyxo.mount: Deactivated successfully.
Nov 23 05:01:39 localhost dnsmasq[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 05:01:39 localhost dnsmasq-dhcp[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 05:01:39 localhost dnsmasq-dhcp[320986]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 05:01:39 localhost podman[321169]: 2025-11-23 10:01:39.168553001 +0000 UTC m=+0.069864413 container kill dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 05:01:39 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:39.785 2 INFO neutron.agent.securitygroups_rpc [None req-3c7abe8e-704d-455a-8480-13cb405babe1 8d3ccb2bccdf4a12bb3b492992930601 6de614a4ddfd4f868264e9fc1dee856a - - default default] Security group member updated ['980b5aab-daf0-44c4-8e04-21e80ebf2d43']#033[00m
Nov 23 05:01:39 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:39.828 2 INFO neutron.agent.securitygroups_rpc [None req-3839d769-9cff-43fb-b0be-05026e050e30 fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']#033[00m
Nov 23 05:01:39 localhost nova_compute[281952]: 2025-11-23 10:01:39.848 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:01:39 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:39.887 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:38Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=956e2e20-c773-4868-a462-57a23da288a3, ip_allocation=immediate, mac_address=fa:16:3e:b0:67:39, name=tempest-AllowedAddressPairIpV6TestJSON-651501806, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:16Z, description=, dns_domain=, id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1503684566, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17000, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1522, status=ACTIVE, subnets=['459378dd-aa61-4895-adc5-f4adec26a6d8'], tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:17Z, vlan_transparent=None, network_id=a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, port_security_enabled=True, project_id=a180dbb035ce42ac9ec3178829ba27ed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec43b846-c0b6-48e2-bcdb-df3dfa286247'], standard_attr_id=1573, status=DOWN, tags=[], tenant_id=a180dbb035ce42ac9ec3178829ba27ed, updated_at=2025-11-23T10:01:39Z on network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee#033[00m
Nov 23 05:01:40 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:40.002 2 INFO neutron.agent.securitygroups_rpc [None req-3c7abe8e-704d-455a-8480-13cb405babe1 8d3ccb2bccdf4a12bb3b492992930601 6de614a4ddfd4f868264e9fc1dee856a - - default default] Security group member updated ['980b5aab-daf0-44c4-8e04-21e80ebf2d43']#033[00m
Nov 23 05:01:40 localhost dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 3 addresses
Nov 23 05:01:40 localhost podman[321210]: 2025-11-23 10:01:40.077238594 +0000 UTC m=+0.058831833 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:01:40 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 05:01:40 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 05:01:40 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:40.500 263258 INFO neutron.agent.dhcp.agent [None req-a1130b39-ef34-4073-b9dc-16486cf26a18 - - - - - -] DHCP configuration for ports {'956e2e20-c773-4868-a462-57a23da288a3'} is completed#033[00m
Nov 23 05:01:40 localhost systemd[1]: tmp-crun.Gp5u93.mount: Deactivated successfully.
Nov 23 05:01:40 localhost dnsmasq[320986]: exiting on receipt of SIGTERM
Nov 23 05:01:40 localhost podman[321249]: 2025-11-23 10:01:40.686726109 +0000 UTC m=+0.060196285 container kill dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 05:01:40 localhost systemd[1]: libpod-dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29.scope: Deactivated successfully.
Nov 23 05:01:40 localhost podman[321265]: 2025-11-23 10:01:40.741515117 +0000 UTC m=+0.039169227 container died dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 05:01:40 localhost systemd[1]: tmp-crun.dykTOO.mount: Deactivated successfully.
Nov 23 05:01:40 localhost podman[321265]: 2025-11-23 10:01:40.78740163 +0000 UTC m=+0.085055700 container remove dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 05:01:40 localhost systemd[1]: libpod-conmon-dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29.scope: Deactivated successfully.
Nov 23 05:01:41 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:41.016 2 INFO neutron.agent.securitygroups_rpc [None req-7b7ae10d-5331-41d7-96b9-7aafc7180edd 8d3ccb2bccdf4a12bb3b492992930601 6de614a4ddfd4f868264e9fc1dee856a - - default default] Security group member updated ['980b5aab-daf0-44c4-8e04-21e80ebf2d43']#033[00m
Nov 23 05:01:41 localhost systemd[1]: var-lib-containers-storage-overlay-7ee8801ca6f5ed344b19c84012e21d39d0de6ed956074fb53f69f665de5d851f-merged.mount: Deactivated successfully.
Nov 23 05:01:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dcee515ad4bb2ec1ba68e92da1a085b2a7e09fb99a16aff261966595a2e0bb29-userdata-shm.mount: Deactivated successfully.
Nov 23 05:01:41 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:41.358 2 INFO neutron.agent.securitygroups_rpc [None req-9181f901-83e9-4198-8a73-d33dcd9ef0fc 8d3ccb2bccdf4a12bb3b492992930601 6de614a4ddfd4f868264e9fc1dee856a - - default default] Security group member updated ['980b5aab-daf0-44c4-8e04-21e80ebf2d43']#033[00m
Nov 23 05:01:41 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:41.402 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 23 05:01:41 localhost nova_compute[281952]: 2025-11-23 10:01:41.538 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:01:41 localhost podman[321340]:
Nov 23 05:01:41 localhost podman[321340]: 2025-11-23 10:01:41.595463324 +0000 UTC m=+0.077072366 container create 2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 05:01:41 localhost systemd[1]: Started libpod-conmon-2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a.scope.
Nov 23 05:01:41 localhost systemd[1]: tmp-crun.ouU3Gc.mount: Deactivated successfully.
Nov 23 05:01:41 localhost podman[321340]: 2025-11-23 10:01:41.555713058 +0000 UTC m=+0.037322170 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 05:01:41 localhost systemd[1]: Started libcrun container.
Nov 23 05:01:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4917278dbbdfc63e7fdece56c2d38985a74ee1b2db2810c2d7a7b81ee259f1d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 05:01:41 localhost podman[321340]: 2025-11-23 10:01:41.677092798 +0000 UTC m=+0.158701850 container init 2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 05:01:41 localhost podman[321340]: 2025-11-23 10:01:41.685485027 +0000 UTC m=+0.167094069 container start 2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 05:01:41 localhost dnsmasq[321359]: started, version 2.85 cachesize 150
Nov 23 05:01:41 localhost dnsmasq[321359]: DNS service limited to local subnets
Nov 23 05:01:41 localhost dnsmasq[321359]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 05:01:41 localhost dnsmasq[321359]: warning: no upstream servers configured
Nov 23 05:01:41 localhost dnsmasq-dhcp[321359]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 05:01:41 localhost dnsmasq[321359]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 05:01:41 localhost dnsmasq-dhcp[321359]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 05:01:41 localhost dnsmasq-dhcp[321359]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 05:01:41 localhost podman[240668]: time="2025-11-23T10:01:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 05:01:41 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:41.909 2 INFO neutron.agent.securitygroups_rpc [None req-7527e00c-fa2d-4f7e-82d1-1aa51478130d fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']#033[00m
Nov 23 05:01:41 localhost podman[240668]: @ - - [23/Nov/2025:10:01:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157496 "" "Go-http-client/1.1"
Nov 23 05:01:41 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:41.927 263258 INFO neutron.agent.dhcp.agent [None req-e173d92d-386d-40cf-8b2f-abcc2575f9b0 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5', '67c1b2ca-3e93-4592-92bd-bb626f12e09a'} is completed#033[00m
Nov 23 05:01:41 localhost podman[240668]: @ - - [23/Nov/2025:10:01:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19712 "" "Go-http-client/1.1"
Nov 23 05:01:42 localhost dnsmasq[321359]: exiting on receipt of SIGTERM
Nov 23 05:01:42 localhost systemd[1]: libpod-2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a.scope: Deactivated successfully.
Nov 23 05:01:42 localhost podman[321377]: 2025-11-23 10:01:42.047489468 +0000 UTC m=+0.065380014 container kill 2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 05:01:42 localhost podman[321414]: 2025-11-23 10:01:42.126740099 +0000 UTC m=+0.051733564 container died 2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 05:01:42 localhost dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 2 addresses
Nov 23 05:01:42 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 05:01:42 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 05:01:42 localhost podman[321416]: 2025-11-23 10:01:42.147768578 +0000 UTC m=+0.057525264 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 05:01:42 localhost podman[321414]: 2025-11-23 10:01:42.22574756 +0000 UTC m=+0.150740975 container remove 2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 05:01:42 localhost systemd[1]: libpod-conmon-2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a.scope: Deactivated successfully.
Nov 23 05:01:42 localhost nova_compute[281952]: 2025-11-23 10:01:42.250 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:01:42 localhost ovn_controller[154788]: 2025-11-23T10:01:42Z|00270|binding|INFO|Releasing lport 67c1b2ca-3e93-4592-92bd-bb626f12e09a from this chassis (sb_readonly=0)
Nov 23 05:01:42 localhost kernel: device tap67c1b2ca-3e left promiscuous mode
Nov 23 05:01:42 localhost ovn_controller[154788]: 2025-11-23T10:01:42Z|00271|binding|INFO|Setting lport 67c1b2ca-3e93-4592-92bd-bb626f12e09a down in Southbound
Nov 23 05:01:42 localhost nova_compute[281952]: 2025-11-23 10:01:42.268 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:01:42 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:42.269 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe16:28e8/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=67c1b2ca-3e93-4592-92bd-bb626f12e09a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 05:01:42 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:42.271 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 67c1b2ca-3e93-4592-92bd-bb626f12e09a in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m
Nov 23 05:01:42 localhost nova_compute[281952]: 2025-11-23 10:01:42.271 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:01:42 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:42.274 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 23 05:01:42 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:42.275 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[acef254e-4dae-463e-b4f5-4b9f85295b32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 05:01:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:42.484 263258 INFO neutron.agent.dhcp.agent [None req-e403bbe2-eaca-4a73-b431-214dd897fd09 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 23 05:01:42 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:42.739 2 INFO neutron.agent.securitygroups_rpc [None req-02caec08-4fe5-4eae-8f59-0f41df22086a fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']#033[00m
Nov 23 05:01:42 localhost dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 1 addresses
Nov 23 05:01:42 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host
Nov 23 05:01:42 localhost podman[321474]: 2025-11-23 10:01:42.939333371 +0000 UTC m=+0.061207566 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:01:42 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts
Nov 23 05:01:43 localhost systemd[1]: var-lib-containers-storage-overlay-4917278dbbdfc63e7fdece56c2d38985a74ee1b2db2810c2d7a7b81ee259f1d1-merged.mount: Deactivated successfully.
Nov 23 05:01:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2ab4c5bcf2d3e61cb836816ab55a97bb6b86f9b5c15943b70892eddb8515c51a-userdata-shm.mount: Deactivated successfully.
Nov 23 05:01:43 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully.
Nov 23 05:01:43 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:43.308 2 INFO neutron.agent.securitygroups_rpc [None req-7ad0c56d-3950-498c-8f5a-35533112ee18 a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']#033[00m Nov 23 05:01:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:01:43 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:43.662 2 INFO neutron.agent.securitygroups_rpc [None req-6a3f8f10-e09b-4d25-8141-9c205b1a054c fcc428367cfa48b088d781e43c36f195 a180dbb035ce42ac9ec3178829ba27ed - - default default] Security group member updated ['ec43b846-c0b6-48e2-bcdb-df3dfa286247']#033[00m Nov 23 05:01:43 localhost podman[321512]: 2025-11-23 10:01:43.889091379 +0000 UTC m=+0.065644183 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 23 05:01:43 localhost dnsmasq[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/addn_hosts - 0 addresses Nov 23 05:01:43 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/host Nov 23 05:01:43 localhost dnsmasq-dhcp[319974]: read /var/lib/neutron/dhcp/a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee/opts Nov 23 05:01:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:44.231 160439 DEBUG ovsdbapp.backend.ovs_idl.event 
[-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 2001:db8::f816:3eff:feec:8f43'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 10.100.0.2 2001:db8::f816:3eff:feec:8f43'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:44 localhost 
ovn_metadata_agent[160434]: 2025-11-23 10:01:44.233 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated#033[00m Nov 23 05:01:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:44.234 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:44.235 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[568dbc5f-5e02-4ed4-bd9a-5570fa630ec5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:44 localhost dnsmasq[319974]: exiting on receipt of SIGTERM Nov 23 05:01:44 localhost podman[321552]: 2025-11-23 10:01:44.344081106 +0000 UTC m=+0.044941956 container kill 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 05:01:44 localhost systemd[1]: libpod-18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04.scope: Deactivated successfully. 
Nov 23 05:01:44 localhost podman[321567]: 2025-11-23 10:01:44.407636863 +0000 UTC m=+0.050370553 container died 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04-userdata-shm.mount: Deactivated successfully. Nov 23 05:01:44 localhost podman[321567]: 2025-11-23 10:01:44.439450014 +0000 UTC m=+0.082183654 container cleanup 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:44 localhost systemd[1]: libpod-conmon-18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04.scope: Deactivated successfully. 
Nov 23 05:01:44 localhost podman[321568]: 2025-11-23 10:01:44.491798996 +0000 UTC m=+0.127480068 container remove 18f463b795328a2cc58e36fc9121800b919c465107bccfd8648d014b05a38c04 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 23 05:01:44 localhost ovn_controller[154788]: 2025-11-23T10:01:44Z|00272|binding|INFO|Releasing lport 9a92ea95-742f-47d6-b9a5-b24454278ac2 from this chassis (sb_readonly=0) Nov 23 05:01:44 localhost nova_compute[281952]: 2025-11-23 10:01:44.503 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:44 localhost ovn_controller[154788]: 2025-11-23T10:01:44Z|00273|binding|INFO|Setting lport 9a92ea95-742f-47d6-b9a5-b24454278ac2 down in Southbound Nov 23 05:01:44 localhost kernel: device tap9a92ea95-74 left promiscuous mode Nov 23 05:01:44 localhost nova_compute[281952]: 2025-11-23 10:01:44.520 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:44.523 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a180dbb035ce42ac9ec3178829ba27ed', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fbea3ea-671c-4b0c-8df6-a11c66c76ac2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9a92ea95-742f-47d6-b9a5-b24454278ac2) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:44.525 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 9a92ea95-742f-47d6-b9a5-b24454278ac2 in datapath a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee unbound from our chassis#033[00m Nov 23 05:01:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:44.527 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a401e9c6-07fc-4d23-8b8b-ff6246c4e8ee or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:01:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:44.527 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2cbf5229-e1a9-43b4-beaf-a6fea3b16e4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:44 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:44.592 263258 INFO 
neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:45 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:45.186 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:45 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:45.188 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 05:01:45 localhost nova_compute[281952]: 2025-11-23 10:01:45.232 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. 
Nov 23 05:01:45 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:45.276 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:45 localhost podman[321595]: 2025-11-23 10:01:45.318637228 +0000 UTC m=+0.073638660 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red 
Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7) Nov 23 05:01:45 localhost podman[321595]: 2025-11-23 10:01:45.327370327 +0000 UTC m=+0.082371819 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7) Nov 23 05:01:45 localhost podman[321596]: 2025-11-23 10:01:45.338223822 +0000 UTC m=+0.084907188 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:01:45 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:01:45 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:45.341 2 INFO neutron.agent.securitygroups_rpc [None req-13ffd066-3677-4907-b0b4-8f11a42c5f7c a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']#033[00m Nov 23 05:01:45 localhost systemd[1]: var-lib-containers-storage-overlay-82de1d62aafaefd90894da078f8a1a469742bc8b5c3eb4a9608ec437320d7a2a-merged.mount: Deactivated successfully. 
Nov 23 05:01:45 localhost systemd[1]: run-netns-qdhcp\x2da401e9c6\x2d07fc\x2d4d23\x2d8b8b\x2dff6246c4e8ee.mount: Deactivated successfully. Nov 23 05:01:45 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:45.396 263258 INFO neutron.agent.linux.ip_lib [None req-fec28012-90eb-45f4-8f7f-93722b94c6dd - - - - - -] Device tapbea321e5-56 cannot be used as it has no MAC address#033[00m Nov 23 05:01:45 localhost systemd[1]: tmp-crun.WVgMAK.mount: Deactivated successfully. Nov 23 05:01:45 localhost podman[321594]: 2025-11-23 10:01:45.401102728 +0000 UTC m=+0.156039288 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118) Nov 23 05:01:45 localhost nova_compute[281952]: 2025-11-23 10:01:45.419 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:45 localhost podman[321596]: 2025-11-23 10:01:45.423608891 +0000 UTC m=+0.170292267 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller) Nov 23 05:01:45 localhost kernel: device tapbea321e5-56 entered promiscuous mode Nov 23 05:01:45 localhost nova_compute[281952]: 2025-11-23 10:01:45.426 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:45 localhost NetworkManager[5975]: [1763892105.4271] manager: (tapbea321e5-56): new Generic device (/org/freedesktop/NetworkManager/Devices/46) Nov 23 05:01:45 localhost ovn_controller[154788]: 2025-11-23T10:01:45Z|00274|binding|INFO|Claiming lport bea321e5-5628-434f-911b-989bbbc4badb for this chassis. Nov 23 05:01:45 localhost ovn_controller[154788]: 2025-11-23T10:01:45Z|00275|binding|INFO|bea321e5-5628-434f-911b-989bbbc4badb: Claiming unknown Nov 23 05:01:45 localhost systemd-udevd[321661]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:01:45 localhost ovn_controller[154788]: 2025-11-23T10:01:45Z|00276|binding|INFO|Setting lport bea321e5-5628-434f-911b-989bbbc4badb ovn-installed in OVS Nov 23 05:01:45 localhost podman[321594]: 2025-11-23 10:01:45.438785019 +0000 UTC m=+0.193721589 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:01:45 localhost nova_compute[281952]: 2025-11-23 10:01:45.439 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:45 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:01:45 localhost ovn_controller[154788]: 2025-11-23T10:01:45Z|00277|binding|INFO|Setting lport bea321e5-5628-434f-911b-989bbbc4badb up in Southbound Nov 23 05:01:45 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:45.446 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe03:fd57/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bea321e5-5628-434f-911b-989bbbc4badb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:45 localhost journal[230249]: ethtool ioctl error on tapbea321e5-56: No such device Nov 23 05:01:45 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:45.452 160439 INFO neutron.agent.ovn.metadata.agent [-] Port bea321e5-5628-434f-911b-989bbbc4badb in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 05:01:45 localhost 
ovn_metadata_agent[160434]: 2025-11-23 10:01:45.457 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port b747372d-b0df-4741-9567-1269323d1ac6 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:01:45 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:45.458 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:45 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:01:45 localhost nova_compute[281952]: 2025-11-23 10:01:45.459 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:45 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:45.458 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9f8a1b7b-656b-45b5-8d2c-4443aa4b06b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:45 localhost journal[230249]: ethtool ioctl error on tapbea321e5-56: No such device Nov 23 05:01:45 localhost journal[230249]: ethtool ioctl error on tapbea321e5-56: No such device Nov 23 05:01:45 localhost journal[230249]: ethtool ioctl error on tapbea321e5-56: No such device Nov 23 05:01:45 localhost journal[230249]: ethtool ioctl error on tapbea321e5-56: No such device Nov 23 05:01:45 localhost journal[230249]: ethtool ioctl error on tapbea321e5-56: No such device Nov 23 05:01:45 localhost journal[230249]: ethtool ioctl error on tapbea321e5-56: No such device Nov 23 05:01:45 localhost journal[230249]: ethtool ioctl error on tapbea321e5-56: No such device Nov 23 05:01:45 localhost nova_compute[281952]: 
2025-11-23 10:01:45.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:45 localhost nova_compute[281952]: 2025-11-23 10:01:45.514 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:45 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:45.921 2 INFO neutron.agent.securitygroups_rpc [None req-b9b41fb6-e2e7-4179-8f08-7b97a3dfa0c7 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:01:46 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:46.189 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 05:01:46 localhost ovn_controller[154788]: 2025-11-23T10:01:46Z|00278|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:01:46 localhost nova_compute[281952]: 2025-11-23 10:01:46.264 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:46 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:46.294 2 INFO neutron.agent.securitygroups_rpc [None req-f7529e1b-3bf3-41ad-a49e-e39cf58ffefa ca36e3c530cd4996add76add048683eb 461e34582027490ebd34279a384a57b1 - - default default] Security group rule updated ['ce47e028-f950-480c-a113-98c15c008254']#033[00m Nov 23 05:01:46 localhost podman[321733]: Nov 23 05:01:46 localhost podman[321733]: 2025-11-23 10:01:46.329389054 +0000 UTC 
m=+0.082006487 container create 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 23 05:01:46 localhost systemd[1]: Started libpod-conmon-3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d.scope. Nov 23 05:01:46 localhost podman[321733]: 2025-11-23 10:01:46.292641222 +0000 UTC m=+0.045258625 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:01:46 localhost systemd[1]: Started libcrun container. Nov 23 05:01:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/169696af7c6c322034c0f6d82a43613a1e0ee2554520eba53c27cf09c69abb80/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:01:46 localhost podman[321733]: 2025-11-23 10:01:46.4094359 +0000 UTC m=+0.162053323 container init 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 05:01:46 localhost podman[321733]: 2025-11-23 10:01:46.41886171 +0000 UTC m=+0.171479103 container start 
3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 23 05:01:46 localhost dnsmasq[321750]: started, version 2.85 cachesize 150 Nov 23 05:01:46 localhost dnsmasq[321750]: DNS service limited to local subnets Nov 23 05:01:46 localhost dnsmasq[321750]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:01:46 localhost dnsmasq[321750]: warning: no upstream servers configured Nov 23 05:01:46 localhost dnsmasq[321750]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:01:46 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:46.470 263258 INFO neutron.agent.dhcp.agent [None req-fec28012-90eb-45f4-8f7f-93722b94c6dd - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:44Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ef912ada-86b5-4d28-bae3-e33192c6b57f, ip_allocation=immediate, mac_address=fa:16:3e:6e:a9:1d, name=tempest-NetworksTestDHCPv6-1699018063, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, 
ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=38, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['6377934c-bad8-4c3b-bfeb-7ba8e9254091'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:43Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1592, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:45Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:01:46 localhost nova_compute[281952]: 2025-11-23 10:01:46.541 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:46 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:46.633 263258 INFO neutron.agent.dhcp.agent [None req-ea2f1b92-8901-42f8-bc68-c42b0ef2f9e9 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:01:46 localhost dnsmasq[321750]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 05:01:46 localhost podman[321769]: 2025-11-23 10:01:46.657628887 +0000 UTC m=+0.068102780 container kill 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:01:46 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:46.994 263258 INFO neutron.agent.dhcp.agent [None req-d0563eee-ea94-474f-a1b4-e5316e44bb3d - - - - - -] DHCP configuration for ports {'ef912ada-86b5-4d28-bae3-e33192c6b57f'} is completed#033[00m Nov 23 05:01:47 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:47.256 2 INFO neutron.agent.securitygroups_rpc [None req-cfd83c6e-245c-4505-a8d2-3c8b7de44cbd 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:01:47 localhost systemd[1]: tmp-crun.CIe3oh.mount: Deactivated successfully. Nov 23 05:01:47 localhost dnsmasq[321750]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:01:47 localhost podman[321805]: 2025-11-23 10:01:47.485980534 +0000 UTC m=+0.061640290 container kill 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0. 
Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.611366) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40 Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892107611403, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 538, "num_deletes": 257, "total_data_size": 416165, "memory_usage": 427544, "flush_reason": "Manual Compaction"} Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892107616550, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 271753, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23770, "largest_seqno": 24303, "table_properties": {"data_size": 269139, "index_size": 661, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6592, "raw_average_key_size": 18, "raw_value_size": 263608, "raw_average_value_size": 740, "num_data_blocks": 30, "num_entries": 356, "num_filter_entries": 356, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; 
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892085, "oldest_key_time": 1763892085, "file_creation_time": 1763892107, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}} Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 5240 microseconds, and 2045 cpu microseconds. Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.616601) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 271753 bytes OK Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.616626) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.618879) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.618902) EVENT_LOG_v1 {"time_micros": 1763892107618895, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.618956) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 412982, prev total WAL file size 413306, number of 
live WAL files 2. Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.619693) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303234' seq:72057594037927935, type:22 .. '6C6F676D0034323737' seq:0, type:0; will stop at (end) Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(265KB)], [39(15MB)] Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892107619747, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 16350327, "oldest_snapshot_seqno": -1} Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 12419 keys, 16030469 bytes, temperature: kUnknown Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892107714272, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 16030469, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15959705, "index_size": 38599, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31109, "raw_key_size": 335963, "raw_average_key_size": 27, "raw_value_size": 15747950, "raw_average_value_size": 
1268, "num_data_blocks": 1448, "num_entries": 12419, "num_filter_entries": 12419, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892107, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}} Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.714613) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 16030469 bytes Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.716383) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 172.8 rd, 169.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 15.3 +0.0 blob) out(15.3 +0.0 blob), read-write-amplify(119.2) write-amplify(59.0) OK, records in: 12949, records dropped: 530 output_compression: NoCompression Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.716411) EVENT_LOG_v1 {"time_micros": 1763892107716399, "job": 22, "event": "compaction_finished", "compaction_time_micros": 94631, "compaction_time_cpu_micros": 44019, "output_level": 6, "num_output_files": 1, "total_output_size": 16030469, "num_input_records": 12949, "num_output_records": 12419, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892107716586, "job": 22, "event": "table_file_deletion", "file_number": 41} Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892107718967, 
"job": 22, "event": "table_file_deletion", "file_number": 39} Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.619572) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.719074) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.719081) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.719084) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.719087) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:01:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:01:47.719090) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:01:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:01:48 localhost podman[321843]: 2025-11-23 10:01:48.950283703 +0000 UTC m=+0.060916028 container kill 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0) Nov 23 05:01:48 localhost dnsmasq[321750]: exiting on receipt of SIGTERM Nov 23 05:01:48 localhost systemd[1]: libpod-3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d.scope: Deactivated successfully. Nov 23 05:01:49 localhost podman[321857]: 2025-11-23 10:01:49.020269609 +0000 UTC m=+0.050687893 container died 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2) Nov 23 05:01:49 localhost podman[321857]: 2025-11-23 10:01:49.051601254 +0000 UTC m=+0.082019458 container cleanup 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:49 localhost systemd[1]: libpod-conmon-3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d.scope: Deactivated successfully. 
Nov 23 05:01:49 localhost podman[321858]: 2025-11-23 10:01:49.093400122 +0000 UTC m=+0.123203397 container remove 3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:01:49 localhost ovn_controller[154788]: 2025-11-23T10:01:49Z|00279|binding|INFO|Releasing lport bea321e5-5628-434f-911b-989bbbc4badb from this chassis (sb_readonly=0) Nov 23 05:01:49 localhost kernel: device tapbea321e5-56 left promiscuous mode Nov 23 05:01:49 localhost nova_compute[281952]: 2025-11-23 10:01:49.104 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:49 localhost ovn_controller[154788]: 2025-11-23T10:01:49Z|00280|binding|INFO|Setting lport bea321e5-5628-434f-911b-989bbbc4badb down in Southbound Nov 23 05:01:49 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:49.117 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe03:fd57/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 
'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bea321e5-5628-434f-911b-989bbbc4badb) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:49 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:49.119 160439 INFO neutron.agent.ovn.metadata.agent [-] Port bea321e5-5628-434f-911b-989bbbc4badb in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 05:01:49 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:49.122 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:49 localhost nova_compute[281952]: 2025-11-23 10:01:49.122 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:49 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:49.122 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[66ac9ca1-8648-4532-82f4-e9afe9a6f359]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:49 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:49.386 263258 INFO neutron.agent.dhcp.agent [None 
req-65a49057-2d3e-4771-bb73-17afb6891ac6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:49 localhost systemd[1]: var-lib-containers-storage-overlay-169696af7c6c322034c0f6d82a43613a1e0ee2554520eba53c27cf09c69abb80-merged.mount: Deactivated successfully. Nov 23 05:01:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3d3eae02b0b77a991a402c40d60f82a71e4795860d4602b252982488b701b54d-userdata-shm.mount: Deactivated successfully. Nov 23 05:01:49 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. Nov 23 05:01:51 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:51.013 2 INFO neutron.agent.securitygroups_rpc [None req-fcaf6d85-3067-425b-90fc-65fb17c22c5c 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:01:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:51.433 263258 INFO neutron.agent.linux.ip_lib [None req-019f317e-bd0f-4094-9bd5-c52c775285c2 - - - - - -] Device tap9e1df90a-d3 cannot be used as it has no MAC address#033[00m Nov 23 05:01:51 localhost nova_compute[281952]: 2025-11-23 10:01:51.503 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:51 localhost kernel: device tap9e1df90a-d3 entered promiscuous mode Nov 23 05:01:51 localhost systemd-udevd[321894]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:01:51 localhost NetworkManager[5975]: [1763892111.5143] manager: (tap9e1df90a-d3): new Generic device (/org/freedesktop/NetworkManager/Devices/47) Nov 23 05:01:51 localhost nova_compute[281952]: 2025-11-23 10:01:51.514 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:51 localhost ovn_controller[154788]: 2025-11-23T10:01:51Z|00281|binding|INFO|Claiming lport 9e1df90a-d3d3-4908-848e-a0d5f6a57103 for this chassis. Nov 23 05:01:51 localhost ovn_controller[154788]: 2025-11-23T10:01:51Z|00282|binding|INFO|9e1df90a-d3d3-4908-848e-a0d5f6a57103: Claiming unknown Nov 23 05:01:51 localhost ovn_controller[154788]: 2025-11-23T10:01:51Z|00283|binding|INFO|Setting lport 9e1df90a-d3d3-4908-848e-a0d5f6a57103 ovn-installed in OVS Nov 23 05:01:51 localhost ovn_controller[154788]: 2025-11-23T10:01:51Z|00284|binding|INFO|Setting lport 9e1df90a-d3d3-4908-848e-a0d5f6a57103 up in Southbound Nov 23 05:01:51 localhost nova_compute[281952]: 2025-11-23 10:01:51.526 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:51 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:51.530 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe08:dc73/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 
'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9e1df90a-d3d3-4908-848e-a0d5f6a57103) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:51 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:51.533 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 9e1df90a-d3d3-4908-848e-a0d5f6a57103 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 05:01:51 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:51.539 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port a4cbf391-da96-4b48-84ca-e56fadda95bf IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:01:51 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:51.540 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:51 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:51.541 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[a4a01d95-5693-456e-877d-87e15919af79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:51 localhost journal[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device Nov 23 
05:01:51 localhost nova_compute[281952]: 2025-11-23 10:01:51.544 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:51 localhost nova_compute[281952]: 2025-11-23 10:01:51.550 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:51 localhost journal[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device Nov 23 05:01:51 localhost journal[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device Nov 23 05:01:51 localhost journal[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device Nov 23 05:01:51 localhost journal[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device Nov 23 05:01:51 localhost journal[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device Nov 23 05:01:51 localhost journal[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device Nov 23 05:01:51 localhost journal[230249]: ethtool ioctl error on tap9e1df90a-d3: No such device Nov 23 05:01:51 localhost nova_compute[281952]: 2025-11-23 10:01:51.596 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:51 localhost nova_compute[281952]: 2025-11-23 10:01:51.632 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:51 localhost ovn_controller[154788]: 2025-11-23T10:01:51Z|00285|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:01:51 localhost nova_compute[281952]: 2025-11-23 10:01:51.793 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:52 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:52.185 2 INFO 
neutron.agent.securitygroups_rpc [None req-b422b6dc-3b22-4323-bd62-6ed72320b39a 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:01:52 localhost podman[321964]: Nov 23 05:01:52 localhost podman[321964]: 2025-11-23 10:01:52.4527918 +0000 UTC m=+0.097515376 container create 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:01:52 localhost systemd[1]: Started libpod-conmon-679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a.scope. Nov 23 05:01:52 localhost systemd[1]: tmp-crun.P22Rh1.mount: Deactivated successfully. Nov 23 05:01:52 localhost podman[321964]: 2025-11-23 10:01:52.407806634 +0000 UTC m=+0.052530230 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:01:52 localhost systemd[1]: Started libcrun container. 
Nov 23 05:01:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d063cd5b145b767409251f60588cb8b0fd1438cff8445bf44f62f5e0c43fc386/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:01:52 localhost podman[321964]: 2025-11-23 10:01:52.519832125 +0000 UTC m=+0.164555701 container init 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:01:52 localhost podman[321964]: 2025-11-23 10:01:52.529531174 +0000 UTC m=+0.174254750 container start 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:01:52 localhost dnsmasq[321982]: started, version 2.85 cachesize 150 Nov 23 05:01:52 localhost dnsmasq[321982]: DNS service limited to local subnets Nov 23 05:01:52 localhost dnsmasq[321982]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:01:52 localhost dnsmasq[321982]: warning: no upstream servers 
configured Nov 23 05:01:52 localhost dnsmasq-dhcp[321982]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:01:52 localhost dnsmasq[321982]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:01:52 localhost dnsmasq-dhcp[321982]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:01:52 localhost dnsmasq-dhcp[321982]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:01:52 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:52.586 263258 INFO neutron.agent.dhcp.agent [None req-019f317e-bd0f-4094-9bd5-c52c775285c2 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:50Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1d695bde-d397-4fcd-9710-c54ea9a58d0a, ip_allocation=immediate, mac_address=fa:16:3e:60:5c:31, name=tempest-NetworksTestDHCPv6-1597934444, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=40, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['042ef781-533a-42ed-9c75-26c4d12a8424'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:49Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, 
project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1624, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:50Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:01:52 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:52.728 263258 INFO neutron.agent.dhcp.agent [None req-902c09cc-c236-4810-b741-d773e14881ae - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:01:52 localhost dnsmasq[321982]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 05:01:52 localhost dnsmasq-dhcp[321982]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:01:52 localhost dnsmasq-dhcp[321982]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:01:52 localhost podman[321999]: 2025-11-23 10:01:52.767476644 +0000 UTC m=+0.057700479 container kill 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 23 05:01:52 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:52.955 263258 INFO neutron.agent.dhcp.agent [None req-ecf7f6d3-88b3-429b-8e12-1e120000a66d - - - - - -] DHCP configuration for ports {'1d695bde-d397-4fcd-9710-c54ea9a58d0a'} is completed#033[00m Nov 23 05:01:53 localhost dnsmasq[321982]: exiting on receipt of 
SIGTERM Nov 23 05:01:53 localhost podman[322037]: 2025-11-23 10:01:53.159769869 +0000 UTC m=+0.060355521 container kill 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:01:53 localhost systemd[1]: libpod-679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a.scope: Deactivated successfully. Nov 23 05:01:53 localhost podman[322049]: 2025-11-23 10:01:53.232734857 +0000 UTC m=+0.060538447 container died 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 23 05:01:53 localhost podman[322049]: 2025-11-23 10:01:53.266536808 +0000 UTC m=+0.094340358 container cleanup 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:01:53 localhost systemd[1]: libpod-conmon-679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a.scope: Deactivated successfully. Nov 23 05:01:53 localhost podman[322056]: 2025-11-23 10:01:53.309302605 +0000 UTC m=+0.128023785 container remove 679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:01:53 localhost ovn_controller[154788]: 2025-11-23T10:01:53Z|00286|binding|INFO|Releasing lport 9e1df90a-d3d3-4908-848e-a0d5f6a57103 from this chassis (sb_readonly=0) Nov 23 05:01:53 localhost kernel: device tap9e1df90a-d3 left promiscuous mode Nov 23 05:01:53 localhost ovn_controller[154788]: 2025-11-23T10:01:53Z|00287|binding|INFO|Setting lport 9e1df90a-d3d3-4908-848e-a0d5f6a57103 down in Southbound Nov 23 05:01:53 localhost nova_compute[281952]: 2025-11-23 10:01:53.353 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:53.361 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 
'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe08:dc73/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9e1df90a-d3d3-4908-848e-a0d5f6a57103) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:53.363 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 9e1df90a-d3d3-4908-848e-a0d5f6a57103 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 05:01:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:53.366 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:53.367 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ffe581da-f6a7-4902-a066-fbff417315f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:53 localhost nova_compute[281952]: 2025-11-23 10:01:53.379 281956 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:01:53 localhost systemd[1]: var-lib-containers-storage-overlay-d063cd5b145b767409251f60588cb8b0fd1438cff8445bf44f62f5e0c43fc386-merged.mount: Deactivated successfully. Nov 23 05:01:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-679a2ecb6d281cbb0e2d6074c694e4115208c69b2d58a61da2d6402a1359b54a-userdata-shm.mount: Deactivated successfully. Nov 23 05:01:53 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:53.542 2 INFO neutron.agent.securitygroups_rpc [None req-76b8a8df-4b24-4290-be38-6011d037c5af a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']#033[00m Nov 23 05:01:53 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:53.854 263258 INFO neutron.agent.dhcp.agent [None req-15fe0042-9576-47da-9ccb-4f21713ef5bd - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:53 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:53.855 263258 INFO neutron.agent.dhcp.agent [None req-15fe0042-9576-47da-9ccb-4f21713ef5bd - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:53 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. 
Nov 23 05:01:54 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:54.016 2 INFO neutron.agent.securitygroups_rpc [None req-988e37e0-0049-4c8c-be99-30016558f502 a02463dab01b4a318fbc9bb3ebbc0c3f a95d56ceca02400bb048e86377bec83f - - default default] Security group member updated ['b05f7049-06d0-4552-94d8-97f623373332']#033[00m Nov 23 05:01:54 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:54.204 263258 INFO neutron.agent.linux.ip_lib [None req-b456a874-9c5d-4705-83ba-28876b48e72b - - - - - -] Device tap911f7f8a-f8 cannot be used as it has no MAC address#033[00m Nov 23 05:01:54 localhost nova_compute[281952]: 2025-11-23 10:01:54.227 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:54 localhost kernel: device tap911f7f8a-f8 entered promiscuous mode Nov 23 05:01:54 localhost NetworkManager[5975]: [1763892114.2341] manager: (tap911f7f8a-f8): new Generic device (/org/freedesktop/NetworkManager/Devices/48) Nov 23 05:01:54 localhost systemd-udevd[321896]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:01:54 localhost ovn_controller[154788]: 2025-11-23T10:01:54Z|00288|binding|INFO|Claiming lport 911f7f8a-f8a0-4ea9-8b79-546ce102d99b for this chassis. 
Nov 23 05:01:54 localhost nova_compute[281952]: 2025-11-23 10:01:54.236 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:54 localhost ovn_controller[154788]: 2025-11-23T10:01:54Z|00289|binding|INFO|911f7f8a-f8a0-4ea9-8b79-546ce102d99b: Claiming unknown Nov 23 05:01:54 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:54.248 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-758e8da2-ad0b-4400-add6-179377986387', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-758e8da2-ad0b-4400-add6-179377986387', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb9791d358174b77957d83c427c41282', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44e94e65-0414-426a-b507-976914cc35fa, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=911f7f8a-f8a0-4ea9-8b79-546ce102d99b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:54 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:54.249 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 911f7f8a-f8a0-4ea9-8b79-546ce102d99b in datapath 
758e8da2-ad0b-4400-add6-179377986387 bound to our chassis#033[00m Nov 23 05:01:54 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:54.250 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 758e8da2-ad0b-4400-add6-179377986387 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:01:54 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:54.250 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[a7916074-87ef-4afc-ba53-d68658697c0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:54 localhost ovn_controller[154788]: 2025-11-23T10:01:54Z|00290|binding|INFO|Setting lport 911f7f8a-f8a0-4ea9-8b79-546ce102d99b ovn-installed in OVS Nov 23 05:01:54 localhost ovn_controller[154788]: 2025-11-23T10:01:54Z|00291|binding|INFO|Setting lport 911f7f8a-f8a0-4ea9-8b79-546ce102d99b up in Southbound Nov 23 05:01:54 localhost nova_compute[281952]: 2025-11-23 10:01:54.276 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:54 localhost nova_compute[281952]: 2025-11-23 10:01:54.304 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:54 localhost nova_compute[281952]: 2025-11-23 10:01:54.331 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:01:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:01:55 localhost systemd[1]: tmp-crun.maFKg3.mount: Deactivated successfully. Nov 23 05:01:55 localhost podman[322119]: 2025-11-23 10:01:55.03512571 +0000 UTC m=+0.088790167 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 05:01:55 localhost podman[322119]: 2025-11-23 10:01:55.073319767 +0000 UTC m=+0.126984244 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 05:01:55 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 05:01:55 localhost podman[322118]: 2025-11-23 10:01:55.083707746 +0000 UTC m=+0.136365121 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd) Nov 23 05:01:55 localhost podman[322118]: 2025-11-23 10:01:55.219254642 +0000 UTC m=+0.271912027 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 23 05:01:55 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 05:01:55 localhost podman[322182]: Nov 23 05:01:55 localhost podman[322182]: 2025-11-23 10:01:55.24613467 +0000 UTC m=+0.085479064 container create 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:01:55 localhost systemd[1]: Started libpod-conmon-12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137.scope. Nov 23 05:01:55 localhost systemd[1]: Started libcrun container. Nov 23 05:01:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53a9620dc369a622449f987892c4bdf312e7e2ee540f9f927180e16e47bb5267/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:01:55 localhost podman[322182]: 2025-11-23 10:01:55.300521906 +0000 UTC m=+0.139866270 container init 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:55 localhost podman[322182]: 2025-11-23 10:01:55.209558943 +0000 UTC m=+0.048903337 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 
05:01:55 localhost podman[322182]: 2025-11-23 10:01:55.309057229 +0000 UTC m=+0.148401593 container start 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:01:55 localhost dnsmasq[322200]: started, version 2.85 cachesize 150 Nov 23 05:01:55 localhost dnsmasq[322200]: DNS service limited to local subnets Nov 23 05:01:55 localhost dnsmasq[322200]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:01:55 localhost dnsmasq[322200]: warning: no upstream servers configured Nov 23 05:01:55 localhost dnsmasq-dhcp[322200]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:01:55 localhost dnsmasq[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/addn_hosts - 0 addresses Nov 23 05:01:55 localhost dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/host Nov 23 05:01:55 localhost dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/opts Nov 23 05:01:55 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:55.418 263258 INFO neutron.agent.dhcp.agent [None req-a5a0a7e3-a14c-423e-af9d-86d86a12402d - - - - - -] DHCP configuration for ports {'7f83e24c-3fc0-4b80-9977-bf2ab2093c87'} is completed#033[00m Nov 23 05:01:56 localhost systemd[1]: tmp-crun.868tbO.mount: Deactivated successfully. 
Nov 23 05:01:56 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:56.444 263258 INFO neutron.agent.linux.ip_lib [None req-78c8dcf5-78ff-474b-93c3-aea8a513c1f7 - - - - - -] Device tap1a9e922d-b6 cannot be used as it has no MAC address#033[00m Nov 23 05:01:56 localhost nova_compute[281952]: 2025-11-23 10:01:56.514 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:56 localhost kernel: device tap1a9e922d-b6 entered promiscuous mode Nov 23 05:01:56 localhost NetworkManager[5975]: [1763892116.5228] manager: (tap1a9e922d-b6): new Generic device (/org/freedesktop/NetworkManager/Devices/49) Nov 23 05:01:56 localhost nova_compute[281952]: 2025-11-23 10:01:56.525 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:56 localhost ovn_controller[154788]: 2025-11-23T10:01:56Z|00292|binding|INFO|Claiming lport 1a9e922d-b6d2-47d7-9477-97162b14a8c2 for this chassis. 
Nov 23 05:01:56 localhost ovn_controller[154788]: 2025-11-23T10:01:56Z|00293|binding|INFO|1a9e922d-b6d2-47d7-9477-97162b14a8c2: Claiming unknown Nov 23 05:01:56 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:56.532 2 INFO neutron.agent.securitygroups_rpc [None req-dca3126a-73f1-4b2f-a026-193f4196c9b1 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:01:56 localhost ovn_controller[154788]: 2025-11-23T10:01:56Z|00294|binding|INFO|Setting lport 1a9e922d-b6d2-47d7-9477-97162b14a8c2 ovn-installed in OVS Nov 23 05:01:56 localhost ovn_controller[154788]: 2025-11-23T10:01:56Z|00295|binding|INFO|Setting lport 1a9e922d-b6d2-47d7-9477-97162b14a8c2 up in Southbound Nov 23 05:01:56 localhost nova_compute[281952]: 2025-11-23 10:01:56.535 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:56 localhost nova_compute[281952]: 2025-11-23 10:01:56.538 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:56 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:56.536 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec5:9d71/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 
'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1a9e922d-b6d2-47d7-9477-97162b14a8c2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:56 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:56.540 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1a9e922d-b6d2-47d7-9477-97162b14a8c2 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 05:01:56 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:56.545 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 7bde6d51-54df-41c2-ad71-83a30ea62aca IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:01:56 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:56.545 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:56 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:56.546 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8b9d2913-af52-4d51-8465-38927f6a7f3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:56 localhost nova_compute[281952]: 2025-11-23 10:01:56.547 281956 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:56 localhost nova_compute[281952]: 2025-11-23 10:01:56.561 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:56 localhost nova_compute[281952]: 2025-11-23 10:01:56.609 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:56 localhost nova_compute[281952]: 2025-11-23 10:01:56.646 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:56 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:56.966 2 INFO neutron.agent.securitygroups_rpc [None req-7d98cf13-9412-4d6e-887c-62597fa6d091 2a1728c536894f859fec3b140f01d4cc fb9791d358174b77957d83c427c41282 - - default default] Security group member updated ['c43d5a51-ed13-44c5-81c1-fc0ae615d5d7']#033[00m Nov 23 05:01:57 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:57.041 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:56Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=adfaba6e-566a-4b11-ba70-5bd47e5b5e70, ip_allocation=immediate, mac_address=fa:16:3e:2b:d5:66, name=tempest-ExtraDHCPOptionsTestJSON-1593894496, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:51Z, description=, dns_domain=, id=758e8da2-ad0b-4400-add6-179377986387, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, 
name=tempest-ExtraDHCPOptionsTestJSON-test-network-572811970, port_security_enabled=True, project_id=fb9791d358174b77957d83c427c41282, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=44870, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1630, status=ACTIVE, subnets=['f0dbf111-ccce-4dbd-b3f1-dd923a995597'], tags=[], tenant_id=fb9791d358174b77957d83c427c41282, updated_at=2025-11-23T10:01:52Z, vlan_transparent=None, network_id=758e8da2-ad0b-4400-add6-179377986387, port_security_enabled=True, project_id=fb9791d358174b77957d83c427c41282, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['c43d5a51-ed13-44c5-81c1-fc0ae615d5d7'], standard_attr_id=1665, status=DOWN, tags=[], tenant_id=fb9791d358174b77957d83c427c41282, updated_at=2025-11-23T10:01:56Z on network 758e8da2-ad0b-4400-add6-179377986387#033[00m Nov 23 05:01:57 localhost dnsmasq[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/addn_hosts - 1 addresses Nov 23 05:01:57 localhost dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/host Nov 23 05:01:57 localhost podman[322259]: 2025-11-23 10:01:57.305497739 +0000 UTC m=+0.059979798 container kill 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:01:57 localhost dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/opts Nov 23 05:01:57 
localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:57.310 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:57 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:57.420 2 INFO neutron.agent.securitygroups_rpc [None req-bcaf3ac0-9e11-402c-a86f-10a7b58b202d 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:01:57 localhost podman[322302]: Nov 23 05:01:57 localhost podman[322302]: 2025-11-23 10:01:57.538731655 +0000 UTC m=+0.090260023 container create 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 23 05:01:57 localhost systemd[1]: Started libpod-conmon-7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a.scope. Nov 23 05:01:57 localhost systemd[1]: tmp-crun.nv6jM4.mount: Deactivated successfully. Nov 23 05:01:57 localhost podman[322302]: 2025-11-23 10:01:57.493962625 +0000 UTC m=+0.045491053 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:01:57 localhost systemd[1]: Started libcrun container. 
Nov 23 05:01:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d87b3fb9068fe77c6a0a8dd82cc1218221b11144b046412f21ec7c3919a34f0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:01:57 localhost podman[322302]: 2025-11-23 10:01:57.623569998 +0000 UTC m=+0.175098356 container init 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 23 05:01:57 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:57.629 263258 INFO neutron.agent.dhcp.agent [None req-ad127182-d28a-48d1-a1ca-789913d793a5 - - - - - -] DHCP configuration for ports {'adfaba6e-566a-4b11-ba70-5bd47e5b5e70'} is completed#033[00m Nov 23 05:01:57 localhost podman[322302]: 2025-11-23 10:01:57.632613846 +0000 UTC m=+0.184142204 container start 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 05:01:57 localhost dnsmasq[322322]: started, version 2.85 cachesize 150 Nov 23 05:01:57 localhost dnsmasq[322322]: DNS service limited to local subnets Nov 23 05:01:57 
localhost dnsmasq[322322]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:01:57 localhost dnsmasq[322322]: warning: no upstream servers configured Nov 23 05:01:57 localhost dnsmasq[322322]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:01:57 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:57.694 263258 INFO neutron.agent.dhcp.agent [None req-78c8dcf5-78ff-474b-93c3-aea8a513c1f7 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:56Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6e16e3f9-6a29-4b10-8e83-0542549b123f, ip_allocation=immediate, mac_address=fa:16:3e:d4:2c:f5, name=tempest-NetworksTestDHCPv6-105209193, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=42, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['7c1d7f06-fd27-4e33-82bf-f56ea0e372a0'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:53Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, 
resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1659, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:01:56Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:01:57 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:57.765 263258 INFO neutron.agent.dhcp.agent [None req-b2d156fd-497c-4709-9e61-56285a3424ba - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:01:57 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 05:01:57 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2279 writes, 24K keys, 2279 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.07 MB/s#012Cumulative WAL: 2279 writes, 2279 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2279 writes, 24K keys, 2279 commit groups, 1.0 writes per commit group, ingest: 41.37 MB, 0.07 MB/s#012Interval WAL: 2279 writes, 2279 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 194.8 0.16 0.08 11 0.015 0 0 0.0 0.0#012 L6 1/0 15.29 MB 0.0 0.2 0.0 0.1 0.2 0.0 0.0 5.2 227.2 206.7 0.81 0.45 10 0.081 124K 5047 0.0 0.0#012 Sum 1/0 15.29 MB 0.0 0.2 0.0 0.1 0.2 0.0 0.0 6.2 
188.9 204.7 0.97 0.52 21 0.046 124K 5047 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.2 0.0 0.1 0.2 0.0 0.0 6.2 189.5 205.3 0.97 0.52 20 0.048 124K 5047 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low 0/0 0.00 KB 0.0 0.2 0.0 0.1 0.2 0.0 0.0 0.0 227.2 206.7 0.81 0.45 10 0.081 124K 5047 0.0 0.0#012High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 198.1 0.16 0.08 10 0.016 0 0 0.0 0.0#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.031, interval 0.031#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.19 GB write, 0.33 MB/s write, 0.18 GB read, 0.31 MB/s read, 1.0 seconds#012Interval compaction: 0.19 GB write, 0.33 MB/s write, 0.18 GB read, 0.31 MB/s read, 1.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5651615b9350#2 capacity: 308.00 MB usage: 17.93 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000113 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(889,17.12 MB,5.5569%) FilterBlock(21,369.11 
KB,0.117032%) IndexBlock(21,470.17 KB,0.149075%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Nov 23 05:01:57 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:57.855 2 INFO neutron.agent.securitygroups_rpc [None req-640caa93-9dd6-4d73-895a-6c48cb53e831 2a1728c536894f859fec3b140f01d4cc fb9791d358174b77957d83c427c41282 - - default default] Security group member updated ['c43d5a51-ed13-44c5-81c1-fc0ae615d5d7']#033[00m Nov 23 05:01:57 localhost dnsmasq[322322]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 05:01:57 localhost podman[322341]: 2025-11-23 10:01:57.883540126 +0000 UTC m=+0.059253296 container kill 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:01:57 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:57.887 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:57Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=006191dd-6d3f-4d8a-bc53-137fd3dda03f, ip_allocation=immediate, mac_address=fa:16:3e:41:23:16, name=tempest-ExtraDHCPOptionsTestJSON-716825327, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:01:51Z, description=, 
dns_domain=, id=758e8da2-ad0b-4400-add6-179377986387, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsTestJSON-test-network-572811970, port_security_enabled=True, project_id=fb9791d358174b77957d83c427c41282, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=44870, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1630, status=ACTIVE, subnets=['f0dbf111-ccce-4dbd-b3f1-dd923a995597'], tags=[], tenant_id=fb9791d358174b77957d83c427c41282, updated_at=2025-11-23T10:01:52Z, vlan_transparent=None, network_id=758e8da2-ad0b-4400-add6-179377986387, port_security_enabled=True, project_id=fb9791d358174b77957d83c427c41282, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['c43d5a51-ed13-44c5-81c1-fc0ae615d5d7'], standard_attr_id=1666, status=DOWN, tags=[], tenant_id=fb9791d358174b77957d83c427c41282, updated_at=2025-11-23T10:01:57Z on network 758e8da2-ad0b-4400-add6-179377986387#033[00m Nov 23 05:01:58 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:58.091 263258 INFO neutron.agent.dhcp.agent [None req-cdd49f53-a920-4b5a-8e68-3506f1547c4c - - - - - -] DHCP configuration for ports {'6e16e3f9-6a29-4b10-8e83-0542549b123f'} is completed#033[00m Nov 23 05:01:58 localhost dnsmasq[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/addn_hosts - 2 addresses Nov 23 05:01:58 localhost dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/host Nov 23 05:01:58 localhost podman[322381]: 2025-11-23 10:01:58.194383822 +0000 UTC m=+0.105737528 container kill 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 23 05:01:58 localhost dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/opts Nov 23 05:01:58 localhost dnsmasq[322322]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:01:58 localhost podman[322403]: 2025-11-23 10:01:58.22741949 +0000 UTC m=+0.064154677 container kill 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:01:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:01:58 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:58.461 263258 INFO neutron.agent.dhcp.agent [None req-8465bbf6-1c54-4086-a7f4-b51f82a7d2b5 - - - - - -] DHCP configuration for ports {'006191dd-6d3f-4d8a-bc53-137fd3dda03f'} is completed#033[00m Nov 23 05:01:58 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:58.481 2 INFO neutron.agent.securitygroups_rpc [None req-00590570-c613-4100-977f-94a0fcdda7a2 2a1728c536894f859fec3b140f01d4cc fb9791d358174b77957d83c427c41282 - - default default] Security group member updated ['c43d5a51-ed13-44c5-81c1-fc0ae615d5d7']#033[00m Nov 23 05:01:58 
localhost dnsmasq[322322]: exiting on receipt of SIGTERM Nov 23 05:01:58 localhost podman[322459]: 2025-11-23 10:01:58.674530003 +0000 UTC m=+0.066241131 container kill 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 05:01:58 localhost systemd[1]: tmp-crun.T48bJu.mount: Deactivated successfully. Nov 23 05:01:58 localhost systemd[1]: libpod-7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a.scope: Deactivated successfully. Nov 23 05:01:58 localhost dnsmasq[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/addn_hosts - 1 addresses Nov 23 05:01:58 localhost dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/host Nov 23 05:01:58 localhost dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/opts Nov 23 05:01:58 localhost podman[322477]: 2025-11-23 10:01:58.747110739 +0000 UTC m=+0.088033763 container kill 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:01:58 
localhost podman[322493]: 2025-11-23 10:01:58.791334441 +0000 UTC m=+0.093949055 container died 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 05:01:58 localhost podman[322493]: 2025-11-23 10:01:58.83773482 +0000 UTC m=+0.140349404 container remove 7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:01:58 localhost systemd[1]: libpod-conmon-7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a.scope: Deactivated successfully. 
Nov 23 05:01:58 localhost ovn_controller[154788]: 2025-11-23T10:01:58Z|00296|binding|INFO|Releasing lport 1a9e922d-b6d2-47d7-9477-97162b14a8c2 from this chassis (sb_readonly=0) Nov 23 05:01:58 localhost kernel: device tap1a9e922d-b6 left promiscuous mode Nov 23 05:01:58 localhost ovn_controller[154788]: 2025-11-23T10:01:58Z|00297|binding|INFO|Setting lport 1a9e922d-b6d2-47d7-9477-97162b14a8c2 down in Southbound Nov 23 05:01:58 localhost nova_compute[281952]: 2025-11-23 10:01:58.855 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:58 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:58.865 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec5:9d71/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], 
logical_port=1a9e922d-b6d2-47d7-9477-97162b14a8c2) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:01:58 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:58.867 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1a9e922d-b6d2-47d7-9477-97162b14a8c2 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 05:01:58 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:58.869 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:01:58 localhost ovn_metadata_agent[160434]: 2025-11-23 10:01:58.870 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[7c232b5f-dddb-41bd-b27c-c48ec392ff14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:01:58 localhost nova_compute[281952]: 2025-11-23 10:01:58.873 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:01:58 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:58.909 263258 INFO neutron.agent.dhcp.agent [None req-a4c642f5-7cf9-48d1-abe1-ff098b5c01d8 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:01:56Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=adfaba6e-566a-4b11-ba70-5bd47e5b5e70, ip_allocation=immediate, mac_address=fa:16:3e:2b:d5:66, name=tempest-new-port-name-306650585, network_id=758e8da2-ad0b-4400-add6-179377986387, port_security_enabled=True, 
project_id=fb9791d358174b77957d83c427c41282, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['c43d5a51-ed13-44c5-81c1-fc0ae615d5d7'], standard_attr_id=1665, status=DOWN, tags=[], tenant_id=fb9791d358174b77957d83c427c41282, updated_at=2025-11-23T10:01:58Z on network 758e8da2-ad0b-4400-add6-179377986387#033[00m Nov 23 05:01:59 localhost dnsmasq[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/addn_hosts - 1 addresses Nov 23 05:01:59 localhost dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/host Nov 23 05:01:59 localhost dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/opts Nov 23 05:01:59 localhost podman[322545]: 2025-11-23 10:01:59.141401165 +0000 UTC m=+0.063951711 container kill 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 23 05:01:59 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:59.195 263258 INFO neutron.agent.dhcp.agent [None req-62da2b23-e1a6-4f95-9c02-432067fbf0e6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:01:59 localhost systemd[1]: var-lib-containers-storage-overlay-0d87b3fb9068fe77c6a0a8dd82cc1218221b11144b046412f21ec7c3919a34f0-merged.mount: Deactivated successfully. 
Nov 23 05:01:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7b4961e30080bc8c81b0652ac9c5816fb8b91dab9e22cd2839f94e2eaf19a74a-userdata-shm.mount: Deactivated successfully. Nov 23 05:01:59 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. Nov 23 05:01:59 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:01:59.455 263258 INFO neutron.agent.dhcp.agent [None req-fac1d73c-c18e-48aa-b6ed-40ff0f813cd0 - - - - - -] DHCP configuration for ports {'adfaba6e-566a-4b11-ba70-5bd47e5b5e70'} is completed#033[00m Nov 23 05:01:59 localhost neutron_sriov_agent[256124]: 2025-11-23 10:01:59.661 2 INFO neutron.agent.securitygroups_rpc [None req-a72c44cb-60bc-4c8a-a77d-8d03ce0529ac 2a1728c536894f859fec3b140f01d4cc fb9791d358174b77957d83c427c41282 - - default default] Security group member updated ['c43d5a51-ed13-44c5-81c1-fc0ae615d5d7']#033[00m Nov 23 05:01:59 localhost systemd[1]: tmp-crun.p65Fgr.mount: Deactivated successfully. 
Nov 23 05:01:59 localhost dnsmasq[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/addn_hosts - 0 addresses Nov 23 05:01:59 localhost dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/host Nov 23 05:01:59 localhost dnsmasq-dhcp[322200]: read /var/lib/neutron/dhcp/758e8da2-ad0b-4400-add6-179377986387/opts Nov 23 05:01:59 localhost podman[322583]: 2025-11-23 10:01:59.89373561 +0000 UTC m=+0.053906931 container kill 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 23 05:01:59 localhost openstack_network_exporter[242668]: ERROR 10:01:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:01:59 localhost openstack_network_exporter[242668]: ERROR 10:01:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:01:59 localhost openstack_network_exporter[242668]: ERROR 10:01:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:02:00 localhost openstack_network_exporter[242668]: ERROR 10:02:00 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:02:00 localhost openstack_network_exporter[242668]: Nov 23 05:02:00 localhost openstack_network_exporter[242668]: ERROR 10:02:00 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:02:00 localhost openstack_network_exporter[242668]: 
Nov 23 05:02:00 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:00.285 2 INFO neutron.agent.securitygroups_rpc [None req-b6b7fea1-91f0-4eed-a7d1-b3a653a6eb31 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:02:00 localhost dnsmasq[322200]: exiting on receipt of SIGTERM Nov 23 05:02:00 localhost podman[322622]: 2025-11-23 10:02:00.399425309 +0000 UTC m=+0.061011641 container kill 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118) Nov 23 05:02:00 localhost systemd[1]: libpod-12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137.scope: Deactivated successfully. 
Nov 23 05:02:00 localhost ovn_controller[154788]: 2025-11-23T10:02:00Z|00298|binding|INFO|Removing iface tap911f7f8a-f8 ovn-installed in OVS Nov 23 05:02:00 localhost ovn_controller[154788]: 2025-11-23T10:02:00Z|00299|binding|INFO|Removing lport 911f7f8a-f8a0-4ea9-8b79-546ce102d99b ovn-installed in OVS Nov 23 05:02:00 localhost podman[322634]: 2025-11-23 10:02:00.469156907 +0000 UTC m=+0.059138143 container died 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 05:02:00 localhost nova_compute[281952]: 2025-11-23 10:02:00.469 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:00 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:00.470 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 22fb3b05-2f99-4740-ad3a-43fe5cf32d1d with type ""#033[00m Nov 23 05:02:00 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:00.471 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 
'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-758e8da2-ad0b-4400-add6-179377986387', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-758e8da2-ad0b-4400-add6-179377986387', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb9791d358174b77957d83c427c41282', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44e94e65-0414-426a-b507-976914cc35fa, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=911f7f8a-f8a0-4ea9-8b79-546ce102d99b) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:00 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:00.473 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 911f7f8a-f8a0-4ea9-8b79-546ce102d99b in datapath 758e8da2-ad0b-4400-add6-179377986387 unbound from our chassis#033[00m Nov 23 05:02:00 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:00.475 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 758e8da2-ad0b-4400-add6-179377986387, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:02:00 localhost nova_compute[281952]: 2025-11-23 10:02:00.476 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:00 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:00.478 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[84a65e8e-6cdf-4451-b6b5-4bb988572f82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:00 localhost systemd[1]: 
tmp-crun.QBLesf.mount: Deactivated successfully. Nov 23 05:02:00 localhost podman[322634]: 2025-11-23 10:02:00.502008289 +0000 UTC m=+0.091989495 container cleanup 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 05:02:00 localhost systemd[1]: libpod-conmon-12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137.scope: Deactivated successfully. Nov 23 05:02:00 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:00.531 263258 INFO neutron.agent.linux.ip_lib [None req-fb62524a-61ab-4b4c-9f08-b7ad33c8aa00 - - - - - -] Device tap1eae7604-c5 cannot be used as it has no MAC address#033[00m Nov 23 05:02:00 localhost nova_compute[281952]: 2025-11-23 10:02:00.556 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:00 localhost kernel: device tap1eae7604-c5 entered promiscuous mode Nov 23 05:02:00 localhost ovn_controller[154788]: 2025-11-23T10:02:00Z|00300|binding|INFO|Claiming lport 1eae7604-c5d1-42c0-ba22-91f336e21eb6 for this chassis. 
Nov 23 05:02:00 localhost ovn_controller[154788]: 2025-11-23T10:02:00Z|00301|binding|INFO|1eae7604-c5d1-42c0-ba22-91f336e21eb6: Claiming unknown Nov 23 05:02:00 localhost nova_compute[281952]: 2025-11-23 10:02:00.564 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:00 localhost NetworkManager[5975]: [1763892120.5653] manager: (tap1eae7604-c5): new Generic device (/org/freedesktop/NetworkManager/Devices/50) Nov 23 05:02:00 localhost systemd-udevd[322668]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:02:00 localhost ovn_controller[154788]: 2025-11-23T10:02:00Z|00302|binding|INFO|Setting lport 1eae7604-c5d1-42c0-ba22-91f336e21eb6 ovn-installed in OVS Nov 23 05:02:00 localhost nova_compute[281952]: 2025-11-23 10:02:00.573 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:00 localhost ovn_controller[154788]: 2025-11-23T10:02:00Z|00303|binding|INFO|Setting lport 1eae7604-c5d1-42c0-ba22-91f336e21eb6 up in Southbound Nov 23 05:02:00 localhost nova_compute[281952]: 2025-11-23 10:02:00.577 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:00 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:00.578 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec2:891e/64', 'neutron:device_id': 
'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1eae7604-c5d1-42c0-ba22-91f336e21eb6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:00 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:00.580 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1eae7604-c5d1-42c0-ba22-91f336e21eb6 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 05:02:00 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:00.583 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port e7b38be6-fb06-4882-a743-601ba140a474 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:02:00 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:00.583 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:02:00 localhost podman[322641]: 2025-11-23 10:02:00.584150189 +0000 UTC m=+0.158512024 container remove 12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-758e8da2-ad0b-4400-add6-179377986387, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 23 05:02:00 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:00.584 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[bb19d044-46ea-4426-a191-a4da3d41ca4e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:00 localhost kernel: device tap911f7f8a-f8 left promiscuous mode Nov 23 05:02:00 localhost nova_compute[281952]: 2025-11-23 10:02:00.599 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:00 localhost nova_compute[281952]: 2025-11-23 10:02:00.614 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:00 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:00.629 263258 INFO neutron.agent.dhcp.agent [None req-6dbabc39-1466-416b-bc6e-eb67358223e2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:00 localhost nova_compute[281952]: 2025-11-23 10:02:00.634 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:00 localhost nova_compute[281952]: 2025-11-23 10:02:00.661 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:00 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:00.798 263258 INFO 
neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 05:02:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval#012Cumulative writes: 7241 writes, 30K keys, 7241 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s#012Cumulative WAL: 7241 writes, 1653 syncs, 4.38 writes per sync, written: 0.03 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2297 writes, 8467 keys, 2297 commit groups, 1.0 writes per commit group, ingest: 8.52 MB, 0.01 MB/s#012Interval WAL: 2297 writes, 988 syncs, 2.32 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 05:02:01 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:01.029 2 INFO neutron.agent.securitygroups_rpc [None req-632d2016-3c1b-4f91-9e80-c737b7a909f7 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:02:01 localhost ovn_controller[154788]: 2025-11-23T10:02:01Z|00304|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:02:01 localhost nova_compute[281952]: 2025-11-23 10:02:01.102 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:01 localhost systemd[1]: var-lib-containers-storage-overlay-53a9620dc369a622449f987892c4bdf312e7e2ee540f9f927180e16e47bb5267-merged.mount: Deactivated successfully. Nov 23 05:02:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-12a18268b0ac36704a913da1805fb423207565b8d720e3e18b5bcfff2d66f137-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:02:01 localhost systemd[1]: run-netns-qdhcp\x2d758e8da2\x2dad0b\x2d4400\x2dadd6\x2d179377986387.mount: Deactivated successfully. Nov 23 05:02:01 localhost podman[322722]: Nov 23 05:02:01 localhost podman[322722]: 2025-11-23 10:02:01.478413887 +0000 UTC m=+0.064174748 container create 63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:02:01 localhost systemd[1]: Started libpod-conmon-63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3.scope. Nov 23 05:02:01 localhost systemd[1]: Started libcrun container. 
Nov 23 05:02:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9417ceb2b435d04eebc44b707f3c79893d5ae8ab0419cd784f36ee62d1553931/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:01 localhost podman[322722]: 2025-11-23 10:02:01.447495275 +0000 UTC m=+0.033256156 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:01 localhost podman[322722]: 2025-11-23 10:02:01.551144678 +0000 UTC m=+0.136905569 container init 63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:02:01 localhost nova_compute[281952]: 2025-11-23 10:02:01.551 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:01 localhost podman[322722]: 2025-11-23 10:02:01.557761451 +0000 UTC m=+0.143522312 container start 63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:02:01 localhost 
dnsmasq[322740]: started, version 2.85 cachesize 150 Nov 23 05:02:01 localhost dnsmasq[322740]: DNS service limited to local subnets Nov 23 05:02:01 localhost dnsmasq[322740]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:01 localhost dnsmasq[322740]: warning: no upstream servers configured Nov 23 05:02:01 localhost dnsmasq-dhcp[322740]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:01 localhost nova_compute[281952]: 2025-11-23 10:02:01.565 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:01 localhost dnsmasq[322740]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:02:01 localhost dnsmasq-dhcp[322740]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:02:01 localhost dnsmasq-dhcp[322740]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:02:01 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:01.721 263258 INFO neutron.agent.dhcp.agent [None req-32946da0-bb78-43b4-82d5-69cf34036c1b - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:02:01 localhost dnsmasq[322740]: exiting on receipt of SIGTERM Nov 23 05:02:01 localhost podman[322758]: 2025-11-23 10:02:01.886794908 +0000 UTC m=+0.058341838 container kill 63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 05:02:01 localhost systemd[1]: libpod-63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3.scope: Deactivated successfully. Nov 23 05:02:01 localhost podman[322770]: 2025-11-23 10:02:01.952813291 +0000 UTC m=+0.053988714 container died 63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 05:02:01 localhost podman[322770]: 2025-11-23 10:02:01.98814877 +0000 UTC m=+0.089324123 container cleanup 63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:02:01 localhost systemd[1]: libpod-conmon-63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3.scope: Deactivated successfully. 
Nov 23 05:02:02 localhost podman[322777]: 2025-11-23 10:02:02.037068457 +0000 UTC m=+0.124048083 container remove 63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:02:02 localhost ovn_controller[154788]: 2025-11-23T10:02:02Z|00305|binding|INFO|Releasing lport 1eae7604-c5d1-42c0-ba22-91f336e21eb6 from this chassis (sb_readonly=0) Nov 23 05:02:02 localhost nova_compute[281952]: 2025-11-23 10:02:02.049 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:02 localhost ovn_controller[154788]: 2025-11-23T10:02:02Z|00306|binding|INFO|Setting lport 1eae7604-c5d1-42c0-ba22-91f336e21eb6 down in Southbound Nov 23 05:02:02 localhost kernel: device tap1eae7604-c5 left promiscuous mode Nov 23 05:02:02 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:02.059 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec2:891e/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 
'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1eae7604-c5d1-42c0-ba22-91f336e21eb6) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:02 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:02.061 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1eae7604-c5d1-42c0-ba22-91f336e21eb6 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 05:02:02 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:02.063 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:02:02 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:02.064 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[a5903e21-0c30-4c53-9df8-161e319d9660]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:02 localhost nova_compute[281952]: 2025-11-23 10:02:02.066 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:02 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:02.380 2 INFO neutron.agent.securitygroups_rpc [None 
req-daacf7a5-9b59-487b-876d-20ffffe4895d 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:02:02 localhost systemd[1]: var-lib-containers-storage-overlay-9417ceb2b435d04eebc44b707f3c79893d5ae8ab0419cd784f36ee62d1553931-merged.mount: Deactivated successfully. Nov 23 05:02:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-63073e175e49012e0c4a1d17e7f27398bd6ee2bba06af6489d468a8bbf3371c3-userdata-shm.mount: Deactivated successfully. Nov 23 05:02:02 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. Nov 23 05:02:02 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:02.813 2 INFO neutron.agent.securitygroups_rpc [None req-7b96091d-4d68-47d5-ad40-bc5e46f787b5 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:02:02 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:02.970 263258 INFO neutron.agent.linux.ip_lib [None req-cc73e387-1f57-4444-b655-bd413813bfc9 - - - - - -] Device tap62791e54-5e cannot be used as it has no MAC address#033[00m Nov 23 05:02:02 localhost nova_compute[281952]: 2025-11-23 10:02:02.996 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:03 localhost kernel: device tap62791e54-5e entered promiscuous mode Nov 23 05:02:03 localhost systemd-udevd[322670]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:02:03 localhost nova_compute[281952]: 2025-11-23 10:02:03.004 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:03 localhost NetworkManager[5975]: [1763892123.0048] manager: (tap62791e54-5e): new Generic device (/org/freedesktop/NetworkManager/Devices/51) Nov 23 05:02:03 localhost ovn_controller[154788]: 2025-11-23T10:02:03Z|00307|binding|INFO|Claiming lport 62791e54-5e64-445a-ab2c-3e33a17041f0 for this chassis. Nov 23 05:02:03 localhost ovn_controller[154788]: 2025-11-23T10:02:03Z|00308|binding|INFO|62791e54-5e64-445a-ab2c-3e33a17041f0: Claiming unknown Nov 23 05:02:03 localhost ovn_controller[154788]: 2025-11-23T10:02:03Z|00309|binding|INFO|Setting lport 62791e54-5e64-445a-ab2c-3e33a17041f0 ovn-installed in OVS Nov 23 05:02:03 localhost nova_compute[281952]: 2025-11-23 10:02:03.022 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:03 localhost ovn_controller[154788]: 2025-11-23T10:02:03Z|00310|binding|INFO|Setting lport 62791e54-5e64-445a-ab2c-3e33a17041f0 up in Southbound Nov 23 05:02:03 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:03.024 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 
'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=62791e54-5e64-445a-ab2c-3e33a17041f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:03 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:03.026 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 62791e54-5e64-445a-ab2c-3e33a17041f0 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 05:02:03 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:03.028 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port b6aa7198-8a06-47c3-9895-f076ba88da04 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:02:03 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:03.028 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:02:03 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:03.028 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[6009671c-6d11-4103-bbb7-c79269669e21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:03 localhost nova_compute[281952]: 2025-11-23 10:02:03.034 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:03 localhost nova_compute[281952]: 2025-11-23 10:02:03.042 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:03 localhost nova_compute[281952]: 2025-11-23 10:02:03.079 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:03 localhost nova_compute[281952]: 2025-11-23 10:02:03.115 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:02:03 localhost podman[322863]: Nov 23 05:02:03 localhost podman[322863]: 2025-11-23 10:02:03.963509992 +0000 UTC m=+0.092856202 container create ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 23 05:02:04 localhost systemd[1]: Started libpod-conmon-ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb.scope. Nov 23 05:02:04 localhost podman[322863]: 2025-11-23 10:02:03.919503696 +0000 UTC m=+0.048849936 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:04 localhost systemd[1]: tmp-crun.C0okSU.mount: Deactivated successfully. 
Nov 23 05:02:04 localhost systemd[1]: Started libcrun container. Nov 23 05:02:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4da620b16801967997c7241de13bd854d2a48763c81d260fa8a5c934b2c99f79/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:04 localhost podman[322863]: 2025-11-23 10:02:04.045548589 +0000 UTC m=+0.174894799 container init ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:04 localhost podman[322863]: 2025-11-23 10:02:04.054489805 +0000 UTC m=+0.183836025 container start ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 23 05:02:04 localhost dnsmasq[322881]: started, version 2.85 cachesize 150 Nov 23 05:02:04 localhost dnsmasq[322881]: DNS service limited to local subnets Nov 23 05:02:04 localhost dnsmasq[322881]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 
05:02:04 localhost dnsmasq[322881]: warning: no upstream servers configured Nov 23 05:02:04 localhost dnsmasq-dhcp[322881]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:04 localhost dnsmasq[322881]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 1 addresses Nov 23 05:02:04 localhost dnsmasq-dhcp[322881]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:02:04 localhost dnsmasq-dhcp[322881]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:02:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:04.216 263258 INFO neutron.agent.dhcp.agent [None req-4a694251-3129-465d-b182-42a3f4e8c1c3 - - - - - -] DHCP configuration for ports {'aea50331-ae74-4951-a33a-a93d73ae2d3e', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:02:04 localhost dnsmasq[322881]: exiting on receipt of SIGTERM Nov 23 05:02:04 localhost podman[322897]: 2025-11-23 10:02:04.36011192 +0000 UTC m=+0.058820773 container kill ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:02:04 localhost systemd[1]: libpod-ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb.scope: Deactivated successfully. 
Nov 23 05:02:04 localhost podman[322911]: 2025-11-23 10:02:04.432590743 +0000 UTC m=+0.060455464 container died ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:02:04 localhost podman[322911]: 2025-11-23 10:02:04.463642289 +0000 UTC m=+0.091506950 container cleanup ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 23 05:02:04 localhost systemd[1]: libpod-conmon-ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb.scope: Deactivated successfully. 
Nov 23 05:02:04 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:04.479 2 INFO neutron.agent.securitygroups_rpc [None req-69ba403f-ba50-44e1-b8d0-5abce2396fa5 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:04 localhost podman[322913]: 2025-11-23 10:02:04.508694387 +0000 UTC m=+0.127566061 container remove ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 23 05:02:04 localhost ovn_controller[154788]: 2025-11-23T10:02:04Z|00311|binding|INFO|Releasing lport 62791e54-5e64-445a-ab2c-3e33a17041f0 from this chassis (sb_readonly=0) Nov 23 05:02:04 localhost nova_compute[281952]: 2025-11-23 10:02:04.522 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:04 localhost kernel: device tap62791e54-5e left promiscuous mode Nov 23 05:02:04 localhost ovn_controller[154788]: 2025-11-23T10:02:04Z|00312|binding|INFO|Setting lport 62791e54-5e64-445a-ab2c-3e33a17041f0 down in Southbound Nov 23 05:02:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:04.535 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], 
options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=62791e54-5e64-445a-ab2c-3e33a17041f0) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:04.537 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 62791e54-5e64-445a-ab2c-3e33a17041f0 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 05:02:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:04.539 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:02:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:04.539 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[6cab7de9-3e49-46f1-b977-769b76d3a2fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:04 localhost 
nova_compute[281952]: 2025-11-23 10:02:04.544 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 05:02:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval#012Cumulative writes: 10K writes, 41K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.00 MB/s#012Cumulative WAL: 10K writes, 2789 syncs, 3.74 writes per sync, written: 0.04 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4513 writes, 15K keys, 4513 commit groups, 1.0 writes per commit group, ingest: 16.92 MB, 0.03 MB/s#012Interval WAL: 4513 writes, 1925 syncs, 2.34 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 05:02:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:04.790 263258 INFO neutron.agent.dhcp.agent [None req-8fd70415-dec8-4897-b787-16d5eaa3f6d9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:04.791 263258 INFO neutron.agent.dhcp.agent [None req-8fd70415-dec8-4897-b787-16d5eaa3f6d9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:04.791 263258 INFO neutron.agent.dhcp.agent [None req-8fd70415-dec8-4897-b787-16d5eaa3f6d9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:04.792 263258 INFO neutron.agent.dhcp.agent [None req-8fd70415-dec8-4897-b787-16d5eaa3f6d9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:04 localhost systemd[1]: 
var-lib-containers-storage-overlay-4da620b16801967997c7241de13bd854d2a48763c81d260fa8a5c934b2c99f79-merged.mount: Deactivated successfully. Nov 23 05:02:04 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ef57a606756810a388c5964ca2867d10a96e21e6c6fe2913c735a196dbe53fdb-userdata-shm.mount: Deactivated successfully. Nov 23 05:02:04 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. Nov 23 05:02:05 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:02:05 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3821055355' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:02:05 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:02:05 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3821055355' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:02:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:02:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 05:02:06 localhost podman[322941]: 2025-11-23 10:02:06.031466096 +0000 UTC m=+0.086614350 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible) Nov 23 05:02:06 localhost podman[322941]: 2025-11-23 10:02:06.040707421 +0000 UTC m=+0.095855645 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 05:02:06 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 05:02:06 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:06.078 263258 INFO neutron.agent.linux.ip_lib [None req-3816ae8e-7ce5-49fc-ab68-7ae269013f24 - - - - - -] Device tapc97f425d-0e cannot be used as it has no MAC address#033[00m Nov 23 05:02:06 localhost podman[322942]: 2025-11-23 10:02:06.082622782 +0000 UTC m=+0.133619487 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 05:02:06 localhost nova_compute[281952]: 2025-11-23 10:02:06.105 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:06 localhost kernel: device tapc97f425d-0e entered promiscuous mode Nov 23 05:02:06 localhost nova_compute[281952]: 2025-11-23 10:02:06.112 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:06 localhost NetworkManager[5975]: [1763892126.1124] manager: (tapc97f425d-0e): new Generic device (/org/freedesktop/NetworkManager/Devices/52) Nov 23 05:02:06 
localhost ovn_controller[154788]: 2025-11-23T10:02:06Z|00313|binding|INFO|Claiming lport c97f425d-0e2f-4212-85de-246b74cc4178 for this chassis. Nov 23 05:02:06 localhost ovn_controller[154788]: 2025-11-23T10:02:06Z|00314|binding|INFO|c97f425d-0e2f-4212-85de-246b74cc4178: Claiming unknown Nov 23 05:02:06 localhost podman[322942]: 2025-11-23 10:02:06.114523895 +0000 UTC m=+0.165520650 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 05:02:06 localhost ovn_controller[154788]: 2025-11-23T10:02:06Z|00315|binding|INFO|Setting lport c97f425d-0e2f-4212-85de-246b74cc4178 ovn-installed in OVS Nov 23 05:02:06 localhost systemd-udevd[322991]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:02:06 localhost nova_compute[281952]: 2025-11-23 10:02:06.118 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:06 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. 
Nov 23 05:02:06 localhost journal[230249]: ethtool ioctl error on tapc97f425d-0e: No such device Nov 23 05:02:06 localhost journal[230249]: ethtool ioctl error on tapc97f425d-0e: No such device Nov 23 05:02:06 localhost journal[230249]: ethtool ioctl error on tapc97f425d-0e: No such device Nov 23 05:02:06 localhost nova_compute[281952]: 2025-11-23 10:02:06.145 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:06 localhost ovn_controller[154788]: 2025-11-23T10:02:06Z|00316|binding|INFO|Setting lport c97f425d-0e2f-4212-85de-246b74cc4178 up in Southbound Nov 23 05:02:06 localhost journal[230249]: ethtool ioctl error on tapc97f425d-0e: No such device Nov 23 05:02:06 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:06.151 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe7a:311b/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, 
gateway_chassis=[], requested_chassis=[], logical_port=c97f425d-0e2f-4212-85de-246b74cc4178) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:06 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:06.153 160439 INFO neutron.agent.ovn.metadata.agent [-] Port c97f425d-0e2f-4212-85de-246b74cc4178 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 05:02:06 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:06.155 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port c19594a7-8cd2-4205-b5d1-3fd9bc8de3f5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:02:06 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:06.155 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:02:06 localhost journal[230249]: ethtool ioctl error on tapc97f425d-0e: No such device Nov 23 05:02:06 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:06.156 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f97d43b0-89e8-4b53-9d8a-68c198c4aaa7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:06 localhost journal[230249]: ethtool ioctl error on tapc97f425d-0e: No such device Nov 23 05:02:06 localhost journal[230249]: ethtool ioctl error on tapc97f425d-0e: No such device Nov 23 05:02:06 localhost journal[230249]: ethtool ioctl error on tapc97f425d-0e: No such device Nov 23 05:02:06 localhost nova_compute[281952]: 2025-11-23 10:02:06.182 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:06 localhost nova_compute[281952]: 2025-11-23 10:02:06.211 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:06 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:06.464 2 INFO neutron.agent.securitygroups_rpc [None req-027c2ba7-b3e9-44a3-b67a-0b2bdc3ad43a 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:06 localhost nova_compute[281952]: 2025-11-23 10:02:06.553 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:06 localhost nova_compute[281952]: 2025-11-23 10:02:06.567 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:06 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:06.597 2 INFO neutron.agent.securitygroups_rpc [None req-027c2ba7-b3e9-44a3-b67a-0b2bdc3ad43a 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:06 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:06.996 263258 INFO neutron.agent.linux.ip_lib [None req-00365568-31c1-4a8e-b7e0-06fa1d25d2f7 - - - - - -] Device tap4888f66e-2a cannot be used as it has no MAC address#033[00m Nov 23 05:02:07 localhost nova_compute[281952]: 2025-11-23 10:02:07.023 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:07 localhost kernel: device tap4888f66e-2a entered promiscuous mode Nov 23 05:02:07 localhost NetworkManager[5975]: [1763892127.0293] manager: (tap4888f66e-2a): new 
Generic device (/org/freedesktop/NetworkManager/Devices/53) Nov 23 05:02:07 localhost nova_compute[281952]: 2025-11-23 10:02:07.029 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:07 localhost ovn_controller[154788]: 2025-11-23T10:02:07Z|00317|binding|INFO|Claiming lport 4888f66e-2a7b-4114-aa4a-94f38d09c793 for this chassis. Nov 23 05:02:07 localhost ovn_controller[154788]: 2025-11-23T10:02:07Z|00318|binding|INFO|4888f66e-2a7b-4114-aa4a-94f38d09c793: Claiming unknown Nov 23 05:02:07 localhost systemd-udevd[322994]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:02:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:07.047 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-e392c0bc-bd43-40a4-a7d7-6e0130e48060', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e392c0bc-bd43-40a4-a7d7-6e0130e48060', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '23ffb5a89d5d4d8a8900ea750309030f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6cf4fe09-32ab-44a4-bc92-e7a5f2b64203, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], 
logical_port=4888f66e-2a7b-4114-aa4a-94f38d09c793) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:07.049 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 4888f66e-2a7b-4114-aa4a-94f38d09c793 in datapath e392c0bc-bd43-40a4-a7d7-6e0130e48060 bound to our chassis#033[00m Nov 23 05:02:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:07.051 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e392c0bc-bd43-40a4-a7d7-6e0130e48060 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:02:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:07.052 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[519e9eef-161c-46d9-94bd-50ab07aba296]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:07 localhost journal[230249]: ethtool ioctl error on tap4888f66e-2a: No such device Nov 23 05:02:07 localhost podman[323067]: Nov 23 05:02:07 localhost ovn_controller[154788]: 2025-11-23T10:02:07Z|00319|binding|INFO|Setting lport 4888f66e-2a7b-4114-aa4a-94f38d09c793 ovn-installed in OVS Nov 23 05:02:07 localhost ovn_controller[154788]: 2025-11-23T10:02:07Z|00320|binding|INFO|Setting lport 4888f66e-2a7b-4114-aa4a-94f38d09c793 up in Southbound Nov 23 05:02:07 localhost nova_compute[281952]: 2025-11-23 10:02:07.063 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:07 localhost journal[230249]: ethtool ioctl error on tap4888f66e-2a: No such device Nov 23 05:02:07 localhost journal[230249]: ethtool ioctl error on tap4888f66e-2a: No such device Nov 23 05:02:07 localhost podman[323067]: 2025-11-23 10:02:07.0700686 +0000 
UTC m=+0.093702947 container create 0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:02:07 localhost ovn_controller[154788]: 2025-11-23T10:02:07Z|00321|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:02:07 localhost journal[230249]: ethtool ioctl error on tap4888f66e-2a: No such device Nov 23 05:02:07 localhost journal[230249]: ethtool ioctl error on tap4888f66e-2a: No such device Nov 23 05:02:07 localhost journal[230249]: ethtool ioctl error on tap4888f66e-2a: No such device Nov 23 05:02:07 localhost journal[230249]: ethtool ioctl error on tap4888f66e-2a: No such device Nov 23 05:02:07 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:07.100 2 INFO neutron.agent.securitygroups_rpc [None req-ed1fae3b-c78a-4940-9647-1237b664bff1 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:07 localhost journal[230249]: ethtool ioctl error on tap4888f66e-2a: No such device Nov 23 05:02:07 localhost nova_compute[281952]: 2025-11-23 10:02:07.109 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:07 localhost systemd[1]: Started libpod-conmon-0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e.scope. 
Nov 23 05:02:07 localhost podman[323067]: 2025-11-23 10:02:07.025028233 +0000 UTC m=+0.048662610 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:07 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:07.124 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:07 localhost nova_compute[281952]: 2025-11-23 10:02:07.139 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:07 localhost systemd[1]: Started libcrun container. Nov 23 05:02:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fe118de034c9dcbe06dbd482ab6b32f3c606b4624d6aae6333387bec542e04/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:07 localhost podman[323067]: 2025-11-23 10:02:07.172258499 +0000 UTC m=+0.195892836 container init 0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:02:07 localhost podman[323067]: 2025-11-23 10:02:07.180180502 +0000 UTC m=+0.203814839 container start 0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:02:07 localhost dnsmasq[323115]: started, version 2.85 cachesize 150 Nov 23 05:02:07 localhost dnsmasq[323115]: DNS service limited to local subnets Nov 23 05:02:07 localhost dnsmasq[323115]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:07 localhost dnsmasq[323115]: warning: no upstream servers configured Nov 23 05:02:07 localhost dnsmasq[323115]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:02:07 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:07.254 2 INFO neutron.agent.securitygroups_rpc [None req-57d318cc-de08-4b5e-a1a2-97e0506818c3 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:02:07 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:07.324 263258 INFO neutron.agent.dhcp.agent [None req-ee5cd8f9-998a-4438-9d3c-0b83bc860687 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:02:07 localhost dnsmasq[323115]: exiting on receipt of SIGTERM Nov 23 05:02:07 localhost podman[323146]: 2025-11-23 10:02:07.528685748 +0000 UTC m=+0.065519460 container kill 0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118) Nov 23 05:02:07 localhost systemd[1]: libpod-0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e.scope: Deactivated successfully. Nov 23 05:02:07 localhost podman[323165]: 2025-11-23 10:02:07.601085259 +0000 UTC m=+0.054751098 container died 0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:02:07 localhost podman[323165]: 2025-11-23 10:02:07.627344107 +0000 UTC m=+0.081009916 container cleanup 0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 23 05:02:07 localhost systemd[1]: libpod-conmon-0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e.scope: Deactivated successfully. 
Nov 23 05:02:07 localhost podman[323167]: 2025-11-23 10:02:07.678242585 +0000 UTC m=+0.125998942 container remove 0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:07 localhost podman[323217]: Nov 23 05:02:07 localhost podman[323217]: 2025-11-23 10:02:07.963553364 +0000 UTC m=+0.088283791 container create 6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e392c0bc-bd43-40a4-a7d7-6e0130e48060, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Nov 23 05:02:07 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:07.982 2 INFO neutron.agent.securitygroups_rpc [None req-fccb08e0-0b04-4e9e-9f46-388ae08a5315 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:07 localhost systemd[1]: Started libpod-conmon-6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d.scope. 
Nov 23 05:02:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:08.004 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:08 localhost systemd[1]: Started libcrun container. Nov 23 05:02:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85794c23db926913628cec6db6b73789ec462daef8098b4cddf986aad56e49c9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:08 localhost podman[323217]: 2025-11-23 10:02:08.021282483 +0000 UTC m=+0.146012910 container init 6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e392c0bc-bd43-40a4-a7d7-6e0130e48060, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 23 05:02:08 localhost podman[323217]: 2025-11-23 10:02:07.924067358 +0000 UTC m=+0.048797835 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:08 localhost podman[323217]: 2025-11-23 10:02:08.031505198 +0000 UTC m=+0.156235635 container start 6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e392c0bc-bd43-40a4-a7d7-6e0130e48060, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base 
Image) Nov 23 05:02:08 localhost dnsmasq[323235]: started, version 2.85 cachesize 150 Nov 23 05:02:08 localhost dnsmasq[323235]: DNS service limited to local subnets Nov 23 05:02:08 localhost dnsmasq[323235]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:08 localhost dnsmasq[323235]: warning: no upstream servers configured Nov 23 05:02:08 localhost dnsmasq-dhcp[323235]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d Nov 23 05:02:08 localhost dnsmasq[323235]: read /var/lib/neutron/dhcp/e392c0bc-bd43-40a4-a7d7-6e0130e48060/addn_hosts - 0 addresses Nov 23 05:02:08 localhost dnsmasq-dhcp[323235]: read /var/lib/neutron/dhcp/e392c0bc-bd43-40a4-a7d7-6e0130e48060/host Nov 23 05:02:08 localhost dnsmasq-dhcp[323235]: read /var/lib/neutron/dhcp/e392c0bc-bd43-40a4-a7d7-6e0130e48060/opts Nov 23 05:02:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:08.039 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:08 localhost systemd[1]: var-lib-containers-storage-overlay-a3fe118de034c9dcbe06dbd482ab6b32f3c606b4624d6aae6333387bec542e04-merged.mount: Deactivated successfully. Nov 23 05:02:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e6bee6eeb2e7d82186d12524e9966b007e53259f220d347189a433a17ec7d1e-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:02:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:08.172 263258 INFO neutron.agent.dhcp.agent [None req-d624a739-ef82-4e54-98ae-a94d64679190 - - - - - -] DHCP configuration for ports {'475fd203-4bae-4a13-96a3-3a4ff6625465'} is completed#033[00m Nov 23 05:02:08 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:08.227 2 INFO neutron.agent.securitygroups_rpc [None req-ccd00d64-5365-45b4-a98a-49c5095f8557 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:02:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:02:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e116 e116: 6 total, 6 up, 6 in Nov 23 05:02:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:08.753 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:08 localhost podman[323287]: Nov 23 05:02:09 localhost podman[323287]: 2025-11-23 10:02:09.006819882 +0000 UTC m=+0.086369820 container create 1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 23 05:02:09 localhost systemd[1]: Started libpod-conmon-1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa.scope. Nov 23 05:02:09 localhost systemd[1]: Started libcrun container. 
Nov 23 05:02:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6787532a03028bbb9352430e4ee923335f00773afda3bc4b41f6a4b5bdbf779c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:09 localhost podman[323287]: 2025-11-23 10:02:09.061565259 +0000 UTC m=+0.141115117 container init 1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 05:02:09 localhost podman[323287]: 2025-11-23 10:02:08.964033335 +0000 UTC m=+0.043583213 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:09 localhost podman[323287]: 2025-11-23 10:02:09.06840547 +0000 UTC m=+0.147955368 container start 1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 23 05:02:09 localhost dnsmasq[323305]: started, version 2.85 cachesize 150 Nov 23 05:02:09 localhost dnsmasq[323305]: DNS service limited to local subnets Nov 23 05:02:09 localhost dnsmasq[323305]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:09 localhost dnsmasq[323305]: warning: no upstream servers configured Nov 23 05:02:09 localhost dnsmasq-dhcp[323305]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 23 05:02:09 localhost dnsmasq[323305]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses Nov 23 05:02:09 localhost dnsmasq-dhcp[323305]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:02:09 localhost dnsmasq-dhcp[323305]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:02:09 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:09.128 263258 INFO neutron.agent.dhcp.agent [None req-80643ad2-522f-449b-989f-ac461e6ca619 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:06Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=829fd08c-2cab-4a3b-8d7c-a1d746a46be5, ip_allocation=immediate, mac_address=fa:16:3e:30:b5:e7, name=tempest-NetworksTestDHCPv6-1529993794, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=49, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['117abf22-f342-429e-a859-0a947d5758f7', 
'd72bfe7d-5cae-4bc9-8351-3414fff06dc1'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:06Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1746, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:07Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:02:09 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:09.136 2 INFO neutron.agent.securitygroups_rpc [None req-4a4cd2de-ce5b-469c-97bb-90c17373d140 268d02d4288b4ff3a9dab77419bcf96a 23ffb5a89d5d4d8a8900ea750309030f - - default default] Security group member updated ['8eb14703-b106-4f91-b864-8b16a806bee3']#033[00m Nov 23 05:02:09 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:09.292 263258 INFO neutron.agent.dhcp.agent [None req-ebc69321-ccee-4eeb-974c-63c06a80ad1d - - - - - -] DHCP configuration for ports {'c97f425d-0e2f-4212-85de-246b74cc4178', '829fd08c-2cab-4a3b-8d7c-a1d746a46be5', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:02:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:09.301 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:02:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:09.302 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:02:09 localhost ovn_metadata_agent[160434]: 
2025-11-23 10:02:09.303 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:02:09 localhost dnsmasq[323305]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses Nov 23 05:02:09 localhost podman[323323]: 2025-11-23 10:02:09.309875359 +0000 UTC m=+0.066080097 container kill 1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 23 05:02:09 localhost dnsmasq-dhcp[323305]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:02:09 localhost dnsmasq-dhcp[323305]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:02:09 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:09.517 263258 INFO neutron.agent.dhcp.agent [None req-d599d47a-4620-41bc-b0a4-07672c4f0d6b - - - - - -] DHCP configuration for ports {'829fd08c-2cab-4a3b-8d7c-a1d746a46be5'} is completed#033[00m Nov 23 05:02:09 localhost dnsmasq[323305]: exiting on receipt of SIGTERM Nov 23 05:02:09 localhost podman[323360]: 2025-11-23 10:02:09.722372796 +0000 UTC m=+0.065141768 container kill 1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, 
org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 23 05:02:09 localhost systemd[1]: libpod-1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa.scope: Deactivated successfully. Nov 23 05:02:09 localhost podman[323375]: 2025-11-23 10:02:09.804728632 +0000 UTC m=+0.060686650 container died 1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:02:09 localhost podman[323375]: 2025-11-23 10:02:09.847727168 +0000 UTC m=+0.103685156 container remove 1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 23 05:02:09 localhost systemd[1]: libpod-conmon-1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa.scope: Deactivated successfully. 
Nov 23 05:02:10 localhost systemd[1]: var-lib-containers-storage-overlay-6787532a03028bbb9352430e4ee923335f00773afda3bc4b41f6a4b5bdbf779c-merged.mount: Deactivated successfully. Nov 23 05:02:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1e2d679070591b23bd0cbecd7adce368ae0da1f77b57cc01ebf8d1b5871461aa-userdata-shm.mount: Deactivated successfully. Nov 23 05:02:10 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:10.545 263258 INFO neutron.agent.dhcp.agent [None req-16bec703-84f3-4c4a-82cf-01da7f4b94a4 - - - - - -] DHCP configuration for ports {'c97f425d-0e2f-4212-85de-246b74cc4178', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:02:11 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:11.229 263258 INFO neutron.agent.linux.ip_lib [None req-5777096e-cc44-4527-8d97-30310acb1631 - - - - - -] Device tapa448321b-77 cannot be used as it has no MAC address#033[00m Nov 23 05:02:11 localhost nova_compute[281952]: 2025-11-23 10:02:11.284 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:11 localhost kernel: device tapa448321b-77 entered promiscuous mode Nov 23 05:02:11 localhost NetworkManager[5975]: [1763892131.2921] manager: (tapa448321b-77): new Generic device (/org/freedesktop/NetworkManager/Devices/54) Nov 23 05:02:11 localhost ovn_controller[154788]: 2025-11-23T10:02:11Z|00322|binding|INFO|Claiming lport a448321b-7787-4303-b318-0f7b37915029 for this chassis. 
Nov 23 05:02:11 localhost ovn_controller[154788]: 2025-11-23T10:02:11Z|00323|binding|INFO|a448321b-7787-4303-b318-0f7b37915029: Claiming unknown Nov 23 05:02:11 localhost nova_compute[281952]: 2025-11-23 10:02:11.294 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:11 localhost systemd-udevd[323410]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:02:11 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:11.305 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-4267129b-8796-478e-a8a2-d9eb57ec8730', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4267129b-8796-478e-a8a2-d9eb57ec8730', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e3b93ef61044aafb71b30163a32d7ac', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=083064ad-1b8e-4a49-952e-9175ecca48be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a448321b-7787-4303-b318-0f7b37915029) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:11 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:11.307 160439 INFO 
neutron.agent.ovn.metadata.agent [-] Port a448321b-7787-4303-b318-0f7b37915029 in datapath 4267129b-8796-478e-a8a2-d9eb57ec8730 bound to our chassis#033[00m Nov 23 05:02:11 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:11.309 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4267129b-8796-478e-a8a2-d9eb57ec8730 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:02:11 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:11.310 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9e60e5a6-0e2d-427f-b351-a754ecdcc38e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:11 localhost journal[230249]: ethtool ioctl error on tapa448321b-77: No such device Nov 23 05:02:11 localhost journal[230249]: ethtool ioctl error on tapa448321b-77: No such device Nov 23 05:02:11 localhost ovn_controller[154788]: 2025-11-23T10:02:11Z|00324|binding|INFO|Setting lport a448321b-7787-4303-b318-0f7b37915029 ovn-installed in OVS Nov 23 05:02:11 localhost ovn_controller[154788]: 2025-11-23T10:02:11Z|00325|binding|INFO|Setting lport a448321b-7787-4303-b318-0f7b37915029 up in Southbound Nov 23 05:02:11 localhost nova_compute[281952]: 2025-11-23 10:02:11.340 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:11 localhost journal[230249]: ethtool ioctl error on tapa448321b-77: No such device Nov 23 05:02:11 localhost journal[230249]: ethtool ioctl error on tapa448321b-77: No such device Nov 23 05:02:11 localhost journal[230249]: ethtool ioctl error on tapa448321b-77: No such device Nov 23 05:02:11 localhost journal[230249]: ethtool ioctl error on tapa448321b-77: No such device Nov 23 05:02:11 localhost journal[230249]: ethtool ioctl error on tapa448321b-77: 
No such device Nov 23 05:02:11 localhost journal[230249]: ethtool ioctl error on tapa448321b-77: No such device Nov 23 05:02:11 localhost nova_compute[281952]: 2025-11-23 10:02:11.375 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:11 localhost nova_compute[281952]: 2025-11-23 10:02:11.404 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:11 localhost nova_compute[281952]: 2025-11-23 10:02:11.555 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:11 localhost nova_compute[281952]: 2025-11-23 10:02:11.569 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:11 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:11.779 2 INFO neutron.agent.securitygroups_rpc [None req-714530e0-ede7-43a6-b513-9acc4fe9127a 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:11 localhost podman[240668]: time="2025-11-23T10:02:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:02:11 localhost podman[240668]: @ - - [23/Nov/2025:10:02:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1" Nov 23 05:02:11 localhost podman[240668]: @ - - [23/Nov/2025:10:02:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19247 "" "Go-http-client/1.1" Nov 23 05:02:12 localhost podman[323509]: Nov 23 05:02:12 localhost podman[323509]: 2025-11-23 10:02:12.304329474 +0000 UTC m=+0.091553101 
container create b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 05:02:12 localhost systemd[1]: Started libpod-conmon-b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a.scope. Nov 23 05:02:12 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:12.358 2 INFO neutron.agent.securitygroups_rpc [None req-c5ba658c-cffc-41f4-af3e-933fb394a1b2 404dc86aa2d547d1b035a15b21cd31a6 3bcc515473444ea195be635c77c65d0f - - default default] Security group member updated ['742a2d87-87e7-4137-bfc5-9bbcf8237faa']#033[00m Nov 23 05:02:12 localhost podman[323509]: 2025-11-23 10:02:12.260331309 +0000 UTC m=+0.047554956 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:12 localhost systemd[1]: Started libcrun container. 
Nov 23 05:02:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d0c0334387ce4a5254ea7688c110c9fde45e19d5a61b9cda522ce0194ebbeff3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:12 localhost podman[323509]: 2025-11-23 10:02:12.470165773 +0000 UTC m=+0.257389400 container init b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:02:12 localhost podman[323509]: 2025-11-23 10:02:12.478468859 +0000 UTC m=+0.265692476 container start b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 23 05:02:12 localhost dnsmasq[323543]: started, version 2.85 cachesize 150 Nov 23 05:02:12 localhost dnsmasq[323543]: DNS service limited to local subnets Nov 23 05:02:12 localhost dnsmasq[323543]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:12 localhost dnsmasq[323543]: warning: no upstream servers 
configured Nov 23 05:02:12 localhost dnsmasq-dhcp[323543]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:12 localhost dnsmasq[323543]: read /var/lib/neutron/dhcp/4267129b-8796-478e-a8a2-d9eb57ec8730/addn_hosts - 0 addresses Nov 23 05:02:12 localhost dnsmasq-dhcp[323543]: read /var/lib/neutron/dhcp/4267129b-8796-478e-a8a2-d9eb57ec8730/host Nov 23 05:02:12 localhost dnsmasq-dhcp[323543]: read /var/lib/neutron/dhcp/4267129b-8796-478e-a8a2-d9eb57ec8730/opts Nov 23 05:02:12 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:12.540 2 INFO neutron.agent.securitygroups_rpc [None req-c5ba658c-cffc-41f4-af3e-933fb394a1b2 404dc86aa2d547d1b035a15b21cd31a6 3bcc515473444ea195be635c77c65d0f - - default default] Security group member updated ['742a2d87-87e7-4137-bfc5-9bbcf8237faa']#033[00m Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.542 263258 INFO neutron.agent.dhcp.agent [None req-5777096e-cc44-4527-8d97-30310acb1631 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:10Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=36e753ee-ee5c-438f-999a-778d9317f27a, ip_allocation=immediate, mac_address=fa:16:3e:ed:a8:f0, name=tempest-PortsIpV6TestJSON-1794941810, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:02:09Z, description=, dns_domain=, id=4267129b-8796-478e-a8a2-d9eb57ec8730, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-375175599, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=51530, qos_policy_id=None, revision_number=2, 
router:external=False, shared=False, standard_attr_id=1763, status=ACTIVE, subnets=['9657efcc-5829-4691-aba6-3cbf999666ad'], tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:10Z, vlan_transparent=None, network_id=4267129b-8796-478e-a8a2-d9eb57ec8730, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d7ead8f7-80d5-4103-ab91-19b87956485a'], standard_attr_id=1779, status=DOWN, tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:11Z on network 4267129b-8796-478e-a8a2-d9eb57ec8730#033[00m Nov 23 05:02:12 localhost dnsmasq[323543]: exiting on receipt of SIGTERM Nov 23 05:02:12 localhost systemd[1]: libpod-b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a.scope: Deactivated successfully. Nov 23 05:02:12 localhost podman[323550]: 2025-11-23 10:02:12.588639782 +0000 UTC m=+0.080034157 container died b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.588 263258 ERROR neutron.agent.linux.utils [None req-5777096e-cc44-4527-8d97-30310acb1631 - - - - - -] Exit code: 1; Cmd: ['/etc/neutron/kill_scripts/dnsmasq-kill', 'HUP', 323543]; Stdin: ; Stdout: Sun Nov 23 10:02:12 AM UTC 2025 No such PID: 323543 Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: ; Stderr: Cannot open network namespace: No such file or 
directory Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: #033[00m Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent [None req-5777096e-cc44-4527-8d97-30310acb1631 - - - - - -] Unable to reload_allocations dhcp for 4267129b-8796-478e-a8a2-d9eb57ec8730.: neutron_lib.exceptions.ProcessExecutionError: Exit code: 1; Cmd: ['/etc/neutron/kill_scripts/dnsmasq-kill', 'HUP', 323543]; Stdin: ; Stdout: Sun Nov 23 10:02:12 AM UTC 2025 No such PID: 323543 Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: ; Stderr: Cannot open network namespace: No such file or directory Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 671, in reload_allocations Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent self._spawn_or_reload_process(reload_with_HUP=True) Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 603, in _spawn_or_reload_process Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent pm.enable(reload_cfg=reload_with_HUP, ensure_active=True) Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 
2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py", line 108, in enable Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent self.reload_cfg() Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py", line 117, in reload_cfg Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent self.disable('HUP', delete_pid_file=False) Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py", line 132, in disable Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent utils.execute(cmd, addl_env=self.cmd_addl_env, Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py", line 156, in execute Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent raise exceptions.ProcessExecutionError(msg, Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent neutron_lib.exceptions.ProcessExecutionError: Exit code: 1; Cmd: ['/etc/neutron/kill_scripts/dnsmasq-kill', 'HUP', 323543]; Stdin: ; Stdout: Sun Nov 23 10:02:12 AM UTC 2025 No such PID: 323543 Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent ; Stderr: Cannot open network namespace: No such file or directory Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 
263258 ERROR neutron.agent.dhcp.agent Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.589 263258 ERROR neutron.agent.dhcp.agent #033[00m Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.606 263258 INFO neutron.agent.dhcp.agent [None req-3d493ef3-82b2-4169-8cde-5402b06203db - - - - - -] DHCP configuration for ports {'1b960541-5cae-4b07-9a4c-5f405e43807e'} is completed#033[00m Nov 23 05:02:12 localhost podman[323550]: 2025-11-23 10:02:12.618288126 +0000 UTC m=+0.109682431 container cleanup b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:02:12 localhost podman[323566]: 2025-11-23 10:02:12.661322431 +0000 UTC m=+0.066481899 container cleanup b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 23 05:02:12 localhost systemd[1]: libpod-conmon-b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a.scope: Deactivated successfully. 
Nov 23 05:02:12 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:12.695 2 INFO neutron.agent.securitygroups_rpc [None req-972382fb-338a-4217-8905-e67aa90103fe 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:12 localhost podman[323578]: 2025-11-23 10:02:12.71125032 +0000 UTC m=+0.079023366 container remove b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.716 263258 INFO neutron.agent.dhcp.agent [None req-b7d11b5f-b09d-4d01-875f-b7e040b9179f - - - - - -] DHCP configuration for ports {'36e753ee-ee5c-438f-999a-778d9317f27a'} is completed#033[00m Nov 23 05:02:12 localhost podman[323598]: Nov 23 05:02:12 localhost podman[323598]: 2025-11-23 10:02:12.819204165 +0000 UTC m=+0.088748195 container create e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:02:12 localhost 
systemd[1]: Started libpod-conmon-e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339.scope. Nov 23 05:02:12 localhost systemd[1]: Started libcrun container. Nov 23 05:02:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53352129b85bc25f3136ae709db7dcaa40905d3bf7355e014a7309f881e89f02/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:12 localhost podman[323598]: 2025-11-23 10:02:12.778305835 +0000 UTC m=+0.047849905 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:12 localhost podman[323598]: 2025-11-23 10:02:12.881332278 +0000 UTC m=+0.150876308 container init e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 05:02:12 localhost podman[323598]: 2025-11-23 10:02:12.891968166 +0000 UTC m=+0.161512186 container start e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 05:02:12 localhost dnsmasq[323616]: started, version 2.85 cachesize 150 
Nov 23 05:02:12 localhost dnsmasq[323616]: DNS service limited to local subnets Nov 23 05:02:12 localhost dnsmasq[323616]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:12 localhost dnsmasq[323616]: warning: no upstream servers configured Nov 23 05:02:12 localhost dnsmasq-dhcp[323616]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:12 localhost dnsmasq[323616]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:02:12 localhost dnsmasq-dhcp[323616]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:02:12 localhost dnsmasq-dhcp[323616]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:02:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:12.956 263258 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Nov 23 05:02:13 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:13.206 2 INFO neutron.agent.securitygroups_rpc [None req-e36b687d-978f-4757-a141-a3de7329fae8 404dc86aa2d547d1b035a15b21cd31a6 3bcc515473444ea195be635c77c65d0f - - default default] Security group member updated ['742a2d87-87e7-4137-bfc5-9bbcf8237faa']#033[00m Nov 23 05:02:13 localhost nova_compute[281952]: 2025-11-23 10:02:13.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:02:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.295 263258 INFO neutron.agent.dhcp.agent [None req-fdec6d1d-4c7e-426c-ac37-18163735a0c1 - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 23 05:02:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.296 263258 INFO neutron.agent.dhcp.agent [-] 
Starting network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a dhcp configuration#033[00m Nov 23 05:02:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.297 263258 INFO neutron.agent.dhcp.agent [-] Finished network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a dhcp configuration#033[00m Nov 23 05:02:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.297 263258 INFO neutron.agent.dhcp.agent [-] Starting network 4267129b-8796-478e-a8a2-d9eb57ec8730 dhcp configuration#033[00m Nov 23 05:02:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.301 263258 INFO neutron.agent.dhcp.agent [None req-ed93737c-d55e-4174-a55d-940593f9459b - - - - - -] DHCP configuration for ports {'c97f425d-0e2f-4212-85de-246b74cc4178', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:02:13 localhost systemd[1]: var-lib-containers-storage-overlay-d0c0334387ce4a5254ea7688c110c9fde45e19d5a61b9cda522ce0194ebbeff3-merged.mount: Deactivated successfully. Nov 23 05:02:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9f3bedc774515a5f1f9caee6a801e462761f8b05abb9bda76e261abe233fa0a-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:02:13 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:02:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.409 263258 INFO neutron.agent.dhcp.agent [None req-77a37596-627f-4059-9200-c97046b46160 - - - - - -] Finished network 4267129b-8796-478e-a8a2-d9eb57ec8730 dhcp configuration#033[00m Nov 23 05:02:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.409 263258 INFO neutron.agent.dhcp.agent [None req-fdec6d1d-4c7e-426c-ac37-18163735a0c1 - - - - - -] Synchronizing state complete#033[00m Nov 23 05:02:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.545 263258 INFO neutron.agent.dhcp.agent [None req-a300361f-abab-4ba8-85d4-d92bfc23a2a5 - - - - - -] DHCP configuration for ports {'1b960541-5cae-4b07-9a4c-5f405e43807e', '36e753ee-ee5c-438f-999a-778d9317f27a'} is completed#033[00m Nov 23 05:02:13 localhost dnsmasq[323616]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:02:13 localhost dnsmasq-dhcp[323616]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:02:13 localhost dnsmasq-dhcp[323616]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:02:13 localhost podman[323637]: 2025-11-23 10:02:13.585410689 +0000 UTC m=+0.032854494 container kill e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 
05:02:13 localhost podman[323680]: Nov 23 05:02:13 localhost podman[323680]: 2025-11-23 10:02:13.859245014 +0000 UTC m=+0.073161275 container create 0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:02:13 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:13.865 2 INFO neutron.agent.securitygroups_rpc [None req-072dc881-3667-4cc2-b265-d09f039c6880 404dc86aa2d547d1b035a15b21cd31a6 3bcc515473444ea195be635c77c65d0f - - default default] Security group member updated ['742a2d87-87e7-4137-bfc5-9bbcf8237faa']#033[00m Nov 23 05:02:13 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:13.892 2 INFO neutron.agent.securitygroups_rpc [None req-a4d2f133-97c5-4bde-ae86-9705c747c91a 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:02:13 localhost podman[323680]: 2025-11-23 10:02:13.816198498 +0000 UTC m=+0.030114779 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:13 localhost systemd[1]: Started libpod-conmon-0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae.scope. 
Nov 23 05:02:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:13.921 263258 INFO neutron.agent.dhcp.agent [None req-7e1eb57e-69a5-4977-b4bb-223fad13a781 - - - - - -] DHCP configuration for ports {'c97f425d-0e2f-4212-85de-246b74cc4178', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:02:13 localhost systemd[1]: Started libcrun container. Nov 23 05:02:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d911125f48d63f87fc5e495f0553a4d7d8756cf37aca9beb20580dcfaf22988/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:13 localhost ovn_controller[154788]: 2025-11-23T10:02:13Z|00326|binding|INFO|Removing iface tapa448321b-77 ovn-installed in OVS Nov 23 05:02:13 localhost ovn_controller[154788]: 2025-11-23T10:02:13Z|00327|binding|INFO|Removing lport a448321b-7787-4303-b318-0f7b37915029 ovn-installed in OVS Nov 23 05:02:13 localhost nova_compute[281952]: 2025-11-23 10:02:13.972 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:13 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:13.973 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 53a85ccd-a35f-4a4b-800b-d124b2117401 with type ""#033[00m Nov 23 05:02:13 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:13.975 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-4267129b-8796-478e-a8a2-d9eb57ec8730', 
'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4267129b-8796-478e-a8a2-d9eb57ec8730', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e3b93ef61044aafb71b30163a32d7ac', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=083064ad-1b8e-4a49-952e-9175ecca48be, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a448321b-7787-4303-b318-0f7b37915029) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:13 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:13.976 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a448321b-7787-4303-b318-0f7b37915029 in datapath 4267129b-8796-478e-a8a2-d9eb57ec8730 unbound from our chassis#033[00m Nov 23 05:02:13 localhost nova_compute[281952]: 2025-11-23 10:02:13.978 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:13 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:13.978 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4267129b-8796-478e-a8a2-d9eb57ec8730 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:02:13 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:13.979 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[3ddc62a3-fac1-4eb8-ae36-0290cdc36b70]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:14 localhost podman[323680]: 2025-11-23 10:02:14.001718262 +0000 UTC 
m=+0.215634543 container init 0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 23 05:02:14 localhost podman[323680]: 2025-11-23 10:02:14.0145972 +0000 UTC m=+0.228513511 container start 0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:02:14 localhost dnsmasq[323727]: started, version 2.85 cachesize 150 Nov 23 05:02:14 localhost dnsmasq[323727]: DNS service limited to local subnets Nov 23 05:02:14 localhost dnsmasq[323727]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:14 localhost dnsmasq[323727]: warning: no upstream servers configured Nov 23 05:02:14 localhost dnsmasq-dhcp[323727]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:14 localhost dnsmasq[323727]: read /var/lib/neutron/dhcp/4267129b-8796-478e-a8a2-d9eb57ec8730/addn_hosts - 0 addresses Nov 23 05:02:14 localhost dnsmasq-dhcp[323727]: read 
/var/lib/neutron/dhcp/4267129b-8796-478e-a8a2-d9eb57ec8730/host Nov 23 05:02:14 localhost dnsmasq-dhcp[323727]: read /var/lib/neutron/dhcp/4267129b-8796-478e-a8a2-d9eb57ec8730/opts Nov 23 05:02:14 localhost dnsmasq[323616]: exiting on receipt of SIGTERM Nov 23 05:02:14 localhost podman[323714]: 2025-11-23 10:02:14.075132735 +0000 UTC m=+0.091157990 container kill e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 23 05:02:14 localhost systemd[1]: libpod-e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339.scope: Deactivated successfully. 
Nov 23 05:02:14 localhost podman[323733]: 2025-11-23 10:02:14.141344104 +0000 UTC m=+0.050809366 container died e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:14 localhost podman[323733]: 2025-11-23 10:02:14.191591342 +0000 UTC m=+0.101056554 container remove e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 05:02:14 localhost systemd[1]: libpod-conmon-e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339.scope: Deactivated successfully. 
Nov 23 05:02:14 localhost nova_compute[281952]: 2025-11-23 10:02:14.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:02:14 localhost nova_compute[281952]: 2025-11-23 10:02:14.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:02:14 localhost nova_compute[281952]: 2025-11-23 10:02:14.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 05:02:14 localhost nova_compute[281952]: 2025-11-23 10:02:14.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 05:02:14 localhost systemd[1]: var-lib-containers-storage-overlay-53352129b85bc25f3136ae709db7dcaa40905d3bf7355e014a7309f881e89f02-merged.mount: Deactivated successfully. Nov 23 05:02:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e21acd15ac630adc687d47c8fd2826391618cf3a909b08b21fc60e7599ccb339-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:02:14 localhost dnsmasq[323727]: exiting on receipt of SIGTERM Nov 23 05:02:14 localhost podman[323780]: 2025-11-23 10:02:14.482958577 +0000 UTC m=+0.069789871 container kill 0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 05:02:14 localhost systemd[1]: libpod-0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae.scope: Deactivated successfully. Nov 23 05:02:14 localhost podman[323794]: 2025-11-23 10:02:14.559682692 +0000 UTC m=+0.054951665 container died 0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:02:14 localhost podman[323794]: 2025-11-23 10:02:14.664340425 +0000 UTC m=+0.159609418 container remove 0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4267129b-8796-478e-a8a2-d9eb57ec8730, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:14 localhost systemd[1]: libpod-conmon-0ef3c55fba3c259af9e1d37be44221e078bd034f292fa623f2af6ed1e88cdeae.scope: Deactivated successfully. Nov 23 05:02:14 localhost nova_compute[281952]: 2025-11-23 10:02:14.678 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:14 localhost kernel: device tapa448321b-77 left promiscuous mode Nov 23 05:02:14 localhost nova_compute[281952]: 2025-11-23 10:02:14.690 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 05:02:14 localhost nova_compute[281952]: 2025-11-23 10:02:14.691 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 05:02:14 localhost nova_compute[281952]: 2025-11-23 10:02:14.691 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache 
for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 05:02:14 localhost nova_compute[281952]: 2025-11-23 10:02:14.692 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 05:02:14 localhost nova_compute[281952]: 2025-11-23 10:02:14.694 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:14 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:14.741 263258 INFO neutron.agent.dhcp.agent [None req-1d83807c-834f-4f0c-93d7-c7972c1e7f84 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:14 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:14.742 263258 INFO neutron.agent.dhcp.agent [None req-1d83807c-834f-4f0c-93d7-c7972c1e7f84 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:14 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:14.822 2 INFO neutron.agent.securitygroups_rpc [None req-5062901d-b55d-4422-a232-c0dc20b0538f 268d02d4288b4ff3a9dab77419bcf96a 23ffb5a89d5d4d8a8900ea750309030f - - default default] Security group member updated ['8eb14703-b106-4f91-b864-8b16a806bee3']#033[00m Nov 23 05:02:14 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:14.841 2 INFO neutron.agent.securitygroups_rpc [None req-e9c4a6cc-b7de-4f6d-9d17-231261e88eb0 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:02:14 localhost ovn_controller[154788]: 2025-11-23T10:02:14Z|00328|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:02:14 
localhost nova_compute[281952]: 2025-11-23 10:02:14.902 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:15 localhost systemd[1]: var-lib-containers-storage-overlay-3d911125f48d63f87fc5e495f0553a4d7d8756cf37aca9beb20580dcfaf22988-merged.mount: Deactivated successfully. Nov 23 05:02:15 localhost systemd[1]: run-netns-qdhcp\x2d4267129b\x2d8796\x2d478e\x2da8a2\x2dd9eb57ec8730.mount: Deactivated successfully. Nov 23 05:02:15 localhost nova_compute[281952]: 2025-11-23 10:02:15.393 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 05:02:15 localhost 
nova_compute[281952]: 2025-11-23 10:02:15.416 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 05:02:15 localhost nova_compute[281952]: 2025-11-23 10:02:15.417 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 05:02:15 localhost podman[323869]: Nov 23 05:02:15 localhost podman[323869]: 2025-11-23 10:02:15.776235488 +0000 UTC m=+0.085711482 container create 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:02:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:02:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:02:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:02:15 localhost systemd[1]: Started libpod-conmon-23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2.scope. 
Nov 23 05:02:15 localhost podman[323869]: 2025-11-23 10:02:15.727026682 +0000 UTC m=+0.036502686 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:15 localhost systemd[1]: Started libcrun container. Nov 23 05:02:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f6a260f749b17063c012086c3799d50fbbbd0ca0af59a66791b567b9b74ab4b9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:15 localhost podman[323869]: 2025-11-23 10:02:15.858415939 +0000 UTC m=+0.167891913 container init 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:02:15 localhost dnsmasq[323920]: started, version 2.85 cachesize 150 Nov 23 05:02:15 localhost dnsmasq[323920]: DNS service limited to local subnets Nov 23 05:02:15 localhost dnsmasq[323920]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:15 localhost dnsmasq[323920]: warning: no upstream servers configured Nov 23 05:02:15 localhost dnsmasq-dhcp[323920]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 23 05:02:15 localhost dnsmasq-dhcp[323920]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:15 localhost dnsmasq[323920]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:02:15 localhost dnsmasq-dhcp[323920]: read 
/var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:02:15 localhost dnsmasq-dhcp[323920]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:02:15 localhost podman[323885]: 2025-11-23 10:02:15.888691152 +0000 UTC m=+0.061344461 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Nov 23 05:02:15 localhost podman[323885]: 2025-11-23 10:02:15.900223887 +0000 UTC m=+0.072877216 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=edpm, release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git) Nov 23 05:02:15 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:02:15 localhost podman[323884]: 2025-11-23 10:02:15.948300968 +0000 UTC m=+0.123863836 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 05:02:15 localhost podman[323884]: 2025-11-23 10:02:15.977926421 +0000 UTC m=+0.153489319 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Nov 23 05:02:15 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:02:15 localhost podman[323883]: 2025-11-23 10:02:15.991663854 +0000 UTC m=+0.172063862 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:16 localhost podman[323869]: 2025-11-23 10:02:16.018173091 +0000 UTC m=+0.327649095 container start 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:02:16 localhost podman[323883]: 2025-11-23 10:02:16.037434464 +0000 UTC m=+0.217834502 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:02:16 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 05:02:16 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:16.083 263258 INFO neutron.agent.dhcp.agent [None req-a42b5eb9-623e-47d5-ba45-0dfad10fa636 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:13Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=aeaf5260-7927-48ee-b05a-2716bc3b0e31, ip_allocation=immediate, mac_address=fa:16:3e:90:84:2b, name=tempest-NetworksTestDHCPv6-1598434446, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=53, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, 
subnets=['bb1da575-b028-485d-9f9a-eeac7df574a4', 'bf360b77-a47e-40bd-a4e7-951f05df64e6'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:13Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1805, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:13Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:02:16 localhost podman[323967]: 2025-11-23 10:02:16.271137074 +0000 UTC m=+0.047215586 container kill 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 05:02:16 localhost dnsmasq[323920]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses Nov 23 05:02:16 localhost dnsmasq-dhcp[323920]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:02:16 localhost dnsmasq-dhcp[323920]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:02:16 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:16.391 263258 INFO neutron.agent.dhcp.agent [None req-5e056cbf-ae92-4a97-94fc-0533aae81090 - - - - - -] DHCP configuration for ports {'c97f425d-0e2f-4212-85de-246b74cc4178', 'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 05:02:16 
localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:16.513 263258 INFO neutron.agent.linux.ip_lib [None req-24be83e4-27be-47bf-b575-0cc802bea2ef - - - - - -] Device taped2c180f-1a cannot be used as it has no MAC address#033[00m Nov 23 05:02:16 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:16.522 263258 INFO neutron.agent.dhcp.agent [None req-0fd5e769-1010-4017-b655-6bd1904ab3a2 - - - - - -] DHCP configuration for ports {'aeaf5260-7927-48ee-b05a-2716bc3b0e31'} is completed#033[00m Nov 23 05:02:16 localhost nova_compute[281952]: 2025-11-23 10:02:16.538 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:16 localhost kernel: device taped2c180f-1a entered promiscuous mode Nov 23 05:02:16 localhost NetworkManager[5975]: [1763892136.5470] manager: (taped2c180f-1a): new Generic device (/org/freedesktop/NetworkManager/Devices/55) Nov 23 05:02:16 localhost ovn_controller[154788]: 2025-11-23T10:02:16Z|00329|binding|INFO|Claiming lport ed2c180f-1a61-4a88-a761-adcb953abd22 for this chassis. Nov 23 05:02:16 localhost ovn_controller[154788]: 2025-11-23T10:02:16Z|00330|binding|INFO|ed2c180f-1a61-4a88-a761-adcb953abd22: Claiming unknown Nov 23 05:02:16 localhost nova_compute[281952]: 2025-11-23 10:02:16.549 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:16 localhost systemd-udevd[323999]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:02:16 localhost nova_compute[281952]: 2025-11-23 10:02:16.557 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:16.569 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-31b977a7-a37c-42ba-bed9-7b22959f6310', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31b977a7-a37c-42ba-bed9-7b22959f6310', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '10f8dd7c838246c58f1d2c4efc771237', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6bd8c251-3bb7-4e15-9c8d-7fcd2e804fa5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ed2c180f-1a61-4a88-a761-adcb953abd22) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:16.571 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ed2c180f-1a61-4a88-a761-adcb953abd22 in datapath 31b977a7-a37c-42ba-bed9-7b22959f6310 bound to our chassis#033[00m Nov 23 05:02:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:16.572 160439 DEBUG 
neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 31b977a7-a37c-42ba-bed9-7b22959f6310 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:02:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:16.573 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ab29bcf2-a0f9-490c-8ce5-3c3d984df964]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:16 localhost nova_compute[281952]: 2025-11-23 10:02:16.592 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:16 localhost ovn_controller[154788]: 2025-11-23T10:02:16Z|00331|binding|INFO|Setting lport ed2c180f-1a61-4a88-a761-adcb953abd22 ovn-installed in OVS Nov 23 05:02:16 localhost ovn_controller[154788]: 2025-11-23T10:02:16Z|00332|binding|INFO|Setting lport ed2c180f-1a61-4a88-a761-adcb953abd22 up in Southbound Nov 23 05:02:16 localhost nova_compute[281952]: 2025-11-23 10:02:16.599 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:16 localhost nova_compute[281952]: 2025-11-23 10:02:16.645 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:16 localhost nova_compute[281952]: 2025-11-23 10:02:16.682 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:16 localhost dnsmasq[323920]: exiting on receipt of SIGTERM Nov 23 05:02:16 localhost podman[324024]: 2025-11-23 10:02:16.734023703 +0000 UTC m=+0.056417159 container kill 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 23 05:02:16 localhost systemd[1]: libpod-23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2.scope: Deactivated successfully. Nov 23 05:02:16 localhost podman[324040]: 2025-11-23 10:02:16.814666587 +0000 UTC m=+0.068109729 container died 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2) Nov 23 05:02:16 localhost podman[324040]: 2025-11-23 10:02:16.851008667 +0000 UTC m=+0.104451779 container cleanup 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 23 05:02:16 localhost systemd[1]: 
libpod-conmon-23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2.scope: Deactivated successfully. Nov 23 05:02:16 localhost podman[324047]: 2025-11-23 10:02:16.903639638 +0000 UTC m=+0.143095609 container remove 23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 05:02:16 localhost ovn_controller[154788]: 2025-11-23T10:02:16Z|00333|binding|INFO|Releasing lport c97f425d-0e2f-4212-85de-246b74cc4178 from this chassis (sb_readonly=0) Nov 23 05:02:16 localhost kernel: device tapc97f425d-0e left promiscuous mode Nov 23 05:02:16 localhost nova_compute[281952]: 2025-11-23 10:02:16.926 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:16 localhost ovn_controller[154788]: 2025-11-23T10:02:16Z|00334|binding|INFO|Setting lport c97f425d-0e2f-4212-85de-246b74cc4178 down in Southbound Nov 23 05:02:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:16.940 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8::f816:3eff:fe7a:311b/64', 
'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c97f425d-0e2f-4212-85de-246b74cc4178) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:16.942 160439 INFO neutron.agent.ovn.metadata.agent [-] Port c97f425d-0e2f-4212-85de-246b74cc4178 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 05:02:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:16.945 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:02:16 localhost nova_compute[281952]: 2025-11-23 10:02:16.948 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:16 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:16.946 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9a07e739-126c-493c-a315-6c89e51a4b63]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:17 localhost nova_compute[281952]: 2025-11-23 10:02:17.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:02:17 localhost nova_compute[281952]: 2025-11-23 10:02:17.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:02:17 localhost nova_compute[281952]: 2025-11-23 10:02:17.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 05:02:17 localhost systemd[1]: tmp-crun.HZ4mM1.mount: Deactivated successfully. Nov 23 05:02:17 localhost systemd[1]: var-lib-containers-storage-overlay-f6a260f749b17063c012086c3799d50fbbbd0ca0af59a66791b567b9b74ab4b9-merged.mount: Deactivated successfully. Nov 23 05:02:17 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23ecc05a408547bebe6dda988accf070b5267d87b48036856c54a7dd91d1c5f2-userdata-shm.mount: Deactivated successfully. Nov 23 05:02:17 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. 
Nov 23 05:02:17 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e117 e117: 6 total, 6 up, 6 in Nov 23 05:02:17 localhost podman[324114]: Nov 23 05:02:17 localhost podman[324114]: 2025-11-23 10:02:17.667435637 +0000 UTC m=+0.090570401 container create c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 23 05:02:17 localhost podman[324114]: 2025-11-23 10:02:17.625049592 +0000 UTC m=+0.048184356 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:17 localhost systemd[1]: Started libpod-conmon-c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99.scope. Nov 23 05:02:17 localhost systemd[1]: Started libcrun container. 
Nov 23 05:02:17 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d30fb1fb9da3ec81fc09dcff1705485fc67933914b93a576884d8b0f75a2cf79/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:17 localhost podman[324114]: 2025-11-23 10:02:17.757435849 +0000 UTC m=+0.180570573 container init c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 05:02:17 localhost podman[324114]: 2025-11-23 10:02:17.766426936 +0000 UTC m=+0.189561660 container start c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:17 localhost dnsmasq[324132]: started, version 2.85 cachesize 150 Nov 23 05:02:17 localhost dnsmasq[324132]: DNS service limited to local subnets Nov 23 05:02:17 localhost dnsmasq[324132]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:17 localhost dnsmasq[324132]: warning: no upstream servers 
configured Nov 23 05:02:17 localhost dnsmasq-dhcp[324132]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:02:17 localhost dnsmasq[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/addn_hosts - 0 addresses Nov 23 05:02:17 localhost dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/host Nov 23 05:02:17 localhost dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/opts Nov 23 05:02:17 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:17.901 263258 INFO neutron.agent.dhcp.agent [None req-b34f390c-0a7a-43ec-a164-5558df2022e3 - - - - - -] DHCP configuration for ports {'3aead29e-e222-4eec-a2bd-bde9a205e26f'} is completed#033[00m Nov 23 05:02:18 localhost nova_compute[281952]: 2025-11-23 10:02:18.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:02:18 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:18.304 263258 INFO neutron.agent.linux.ip_lib [None req-6dc1892b-fcee-480c-a98c-87ce793d8969 - - - - - -] Device tapd4981345-81 cannot be used as it has no MAC address#033[00m Nov 23 05:02:18 localhost nova_compute[281952]: 2025-11-23 10:02:18.362 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:18 localhost kernel: device tapd4981345-81 entered promiscuous mode Nov 23 05:02:18 localhost NetworkManager[5975]: [1763892138.3658] manager: (tapd4981345-81): new Generic device (/org/freedesktop/NetworkManager/Devices/56) Nov 23 05:02:18 localhost ovn_controller[154788]: 2025-11-23T10:02:18Z|00335|binding|INFO|Claiming lport d4981345-81c9-4678-a4d9-29762c427058 for this chassis. 
Nov 23 05:02:18 localhost ovn_controller[154788]: 2025-11-23T10:02:18Z|00336|binding|INFO|d4981345-81c9-4678-a4d9-29762c427058: Claiming unknown Nov 23 05:02:18 localhost nova_compute[281952]: 2025-11-23 10:02:18.366 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:18 localhost systemd-udevd[324004]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:02:18 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:18.376 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d4981345-81c9-4678-a4d9-29762c427058) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:18 localhost ovn_controller[154788]: 2025-11-23T10:02:18Z|00337|binding|INFO|Setting 
lport d4981345-81c9-4678-a4d9-29762c427058 ovn-installed in OVS Nov 23 05:02:18 localhost ovn_controller[154788]: 2025-11-23T10:02:18Z|00338|binding|INFO|Setting lport d4981345-81c9-4678-a4d9-29762c427058 up in Southbound Nov 23 05:02:18 localhost nova_compute[281952]: 2025-11-23 10:02:18.382 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:18 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:18.378 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d4981345-81c9-4678-a4d9-29762c427058 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis#033[00m Nov 23 05:02:18 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:18.382 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 39a83a1b-6025-4506-bc53-55b3409a5751 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:02:18 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:18.382 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:02:18 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:18.383 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8ef6011e-c209-46ca-8d1e-31845a2c6ac7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:02:18 localhost nova_compute[281952]: 2025-11-23 10:02:18.405 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:18 localhost nova_compute[281952]: 2025-11-23 10:02:18.446 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:18 localhost nova_compute[281952]: 2025-11-23 10:02:18.471 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:19 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e118 e118: 6 total, 6 up, 6 in Nov 23 05:02:19 localhost nova_compute[281952]: 2025-11-23 10:02:19.685 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:19 localhost podman[324195]: Nov 23 05:02:19 localhost podman[324195]: 2025-11-23 10:02:19.917182632 +0000 UTC m=+0.089812018 container create a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:19 localhost systemd[1]: Started libpod-conmon-a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc.scope. Nov 23 05:02:19 localhost podman[324195]: 2025-11-23 10:02:19.873063202 +0000 UTC m=+0.045692608 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:19 localhost systemd[1]: Started libcrun container. 
Nov 23 05:02:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e33c56067c23739cb4aa49806f9afa9d048c85db65aa7d7ecb65ed0e58e4d8d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:19 localhost podman[324195]: 2025-11-23 10:02:19.989065887 +0000 UTC m=+0.161695263 container init a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 05:02:19 localhost podman[324195]: 2025-11-23 10:02:19.996696051 +0000 UTC m=+0.169325437 container start a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 05:02:20 localhost dnsmasq[324213]: started, version 2.85 cachesize 150 Nov 23 05:02:20 localhost dnsmasq[324213]: DNS service limited to local subnets Nov 23 05:02:20 localhost dnsmasq[324213]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:20 localhost dnsmasq[324213]: warning: no upstream servers 
configured Nov 23 05:02:20 localhost dnsmasq-dhcp[324213]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:20 localhost dnsmasq[324213]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:02:20 localhost dnsmasq-dhcp[324213]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:02:20 localhost dnsmasq-dhcp[324213]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.078 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:20 localhost kernel: device tapd4981345-81 left promiscuous mode Nov 23 05:02:20 localhost ovn_controller[154788]: 2025-11-23T10:02:20Z|00339|binding|INFO|Releasing lport d4981345-81c9-4678-a4d9-29762c427058 from this chassis (sb_readonly=0) Nov 23 05:02:20 localhost ovn_controller[154788]: 2025-11-23T10:02:20Z|00340|binding|INFO|Setting lport d4981345-81c9-4678-a4d9-29762c427058 down in Southbound Nov 23 05:02:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:20.088 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fee0:b47d/64 2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 
'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d4981345-81c9-4678-a4d9-29762c427058) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:20.090 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d4981345-81c9-4678-a4d9-29762c427058 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.091 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:20.093 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:02:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:20.093 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[d8489335-1274-4f3d-8f15-a7cb5a866243]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:20 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:20.186 263258 INFO neutron.agent.dhcp.agent [None req-86bfa4b5-a3d3-49f7-8528-1648cb586bcd - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5'} is completed#033[00m Nov 23 
05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.236 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.236 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.237 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for 
np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.238 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:02:20 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:20.362 263258 INFO neutron.agent.linux.ip_lib [None req-918d971d-3b75-4a67-bf2f-71e45f4e5fd6 - - - - - -] Device tapb261f9c2-83 cannot be used as it has no MAC address#033[00m Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.387 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:20 localhost kernel: device tapb261f9c2-83 entered promiscuous mode Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.395 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:20 localhost ovn_controller[154788]: 2025-11-23T10:02:20Z|00341|binding|INFO|Claiming lport b261f9c2-83e7-4a95-9c5e-261d26463cae for this chassis. 
Nov 23 05:02:20 localhost ovn_controller[154788]: 2025-11-23T10:02:20Z|00342|binding|INFO|b261f9c2-83e7-4a95-9c5e-261d26463cae: Claiming unknown
Nov 23 05:02:20 localhost NetworkManager[5975]: [1763892140.3973] manager: (tapb261f9c2-83): new Generic device (/org/freedesktop/NetworkManager/Devices/57)
Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.417 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:20.418 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe6b:a1ef/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6957ce07-4d9c-4d1d-a573-5f6518e6601d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6957ce07-4d9c-4d1d-a573-5f6518e6601d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '23ffb5a89d5d4d8a8900ea750309030f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a73b2c2-bad6-4a3d-a15f-3ef982ba513d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b261f9c2-83e7-4a95-9c5e-261d26463cae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 05:02:20 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e119 e119: 6 total, 6 up, 6 in
Nov 23 05:02:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:20.421 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b261f9c2-83e7-4a95-9c5e-261d26463cae in datapath 6957ce07-4d9c-4d1d-a573-5f6518e6601d bound to our chassis
Nov 23 05:02:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:20.423 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6957ce07-4d9c-4d1d-a573-5f6518e6601d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 05:02:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:20.424 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[5d0e12f8-1003-4282-9f65-bd4587511dc6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 05:02:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:20.432 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:8f:43 2001:db8:0:1:f816:3eff:feec:8f43'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e41a192c-bec5-4e3b-8388-8af6ab7114b5) old=Port_Binding(mac=['fa:16:3e:ec:8f:43 2001:db8::f816:3eff:feec:8f43'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feec:8f43/64', 'neutron:device_id': 'ovnmeta-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 05:02:20 localhost ovn_controller[154788]: 2025-11-23T10:02:20Z|00343|binding|INFO|Setting lport b261f9c2-83e7-4a95-9c5e-261d26463cae ovn-installed in OVS
Nov 23 05:02:20 localhost ovn_controller[154788]: 2025-11-23T10:02:20Z|00344|binding|INFO|Setting lport b261f9c2-83e7-4a95-9c5e-261d26463cae up in Southbound
Nov 23 05:02:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:20.437 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e41a192c-bec5-4e3b-8388-8af6ab7114b5 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 updated
Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.439 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:20.443 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 05:02:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:20.445 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9c9add3d-4c1f-4ed0-886e-f5ba99adb27b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.483 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.506 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:20 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:20.576 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:20Z, description=, device_id=6c96453a-c777-4431-b133-5c4197796c3e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ef47ee52-a1a9-471d-b433-3cfaa989690c, ip_allocation=immediate, mac_address=fa:16:3e:a6:dd:58, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:02:14Z, description=, dns_domain=, id=31b977a7-a37c-42ba-bed9-7b22959f6310, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesSnapshotTestJSON-1570024778-network, port_security_enabled=True, project_id=10f8dd7c838246c58f1d2c4efc771237, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10715, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1812, status=ACTIVE, subnets=['ef80155c-ac2e-4709-9882-d7bc38017108'], tags=[], tenant_id=10f8dd7c838246c58f1d2c4efc771237, updated_at=2025-11-23T10:02:15Z, vlan_transparent=None, network_id=31b977a7-a37c-42ba-bed9-7b22959f6310, port_security_enabled=False, project_id=10f8dd7c838246c58f1d2c4efc771237, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1857, status=DOWN, tags=[], tenant_id=10f8dd7c838246c58f1d2c4efc771237, updated_at=2025-11-23T10:02:20Z on network 31b977a7-a37c-42ba-bed9-7b22959f6310
Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.757 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 05:02:20 localhost dnsmasq[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/addn_hosts - 1 addresses
Nov 23 05:02:20 localhost dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/host
Nov 23 05:02:20 localhost dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/opts
Nov 23 05:02:20 localhost podman[324286]: 2025-11-23 10:02:20.820296552 +0000 UTC m=+0.039096825 container kill c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.828 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 05:02:20 localhost nova_compute[281952]: 2025-11-23 10:02:20.829 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 23 05:02:20 localhost dnsmasq[324213]: exiting on receipt of SIGTERM
Nov 23 05:02:20 localhost podman[324323]: 2025-11-23 10:02:20.96598109 +0000 UTC m=+0.052933820 container kill a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 05:02:20 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:20.965 2 INFO neutron.agent.securitygroups_rpc [None req-da0f81f0-068a-49d7-b6f1-50fa28f5c3fa 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 05:02:20 localhost systemd[1]: libpod-a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc.scope: Deactivated successfully.
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.011 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 05:02:21 localhost podman[324343]: 2025-11-23 10:02:21.012422531 +0000 UTC m=+0.037333160 container died a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.012 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11186MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.013 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.013 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 05:02:21 localhost podman[324343]: 2025-11-23 10:02:21.037885886 +0000 UTC m=+0.062796485 container cleanup a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 23 05:02:21 localhost systemd[1]: libpod-conmon-a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc.scope: Deactivated successfully.
Nov 23 05:02:21 localhost podman[324350]: 2025-11-23 10:02:21.08150646 +0000 UTC m=+0.098327360 container remove a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.084 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.084 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.084 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 05:02:21 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:21.112 263258 INFO neutron.agent.linux.ip_lib [None req-5de1e31a-d4a1-4578-b4dd-c1804f8c816f - - - - - -] Device tapd4981345-81 cannot be used as it has no MAC address
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.125 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 05:02:21 localhost kernel: device tapd4981345-81 entered promiscuous mode
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.158 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost NetworkManager[5975]: [1763892141.1618] manager: (tapd4981345-81): new Generic device (/org/freedesktop/NetworkManager/Devices/58)
Nov 23 05:02:21 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:21.161 263258 INFO neutron.agent.dhcp.agent [None req-27e88670-3a53-453b-aa97-bd110b992645 - - - - - -] DHCP configuration for ports {'ef47ee52-a1a9-471d-b433-3cfaa989690c'} is completed
Nov 23 05:02:21 localhost ovn_controller[154788]: 2025-11-23T10:02:21Z|00345|binding|INFO|Claiming lport d4981345-81c9-4678-a4d9-29762c427058 for this chassis.
Nov 23 05:02:21 localhost ovn_controller[154788]: 2025-11-23T10:02:21Z|00346|binding|INFO|d4981345-81c9-4678-a4d9-29762c427058: Claiming unknown
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.164 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost ovn_controller[154788]: 2025-11-23T10:02:21Z|00347|binding|INFO|Setting lport d4981345-81c9-4678-a4d9-29762c427058 ovn-installed in OVS
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.170 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.175 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost ovn_controller[154788]: 2025-11-23T10:02:21Z|00348|binding|INFO|Setting lport d4981345-81c9-4678-a4d9-29762c427058 up in Southbound
Nov 23 05:02:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:21.177 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fee0:b47d/64 2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d4981345-81c9-4678-a4d9-29762c427058) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 05:02:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:21.180 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d4981345-81c9-4678-a4d9-29762c427058 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 bound to our chassis
Nov 23 05:02:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:21.183 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 39a83a1b-6025-4506-bc53-55b3409a5751 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 23 05:02:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:21.184 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 23 05:02:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:21.184 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[09e1a73a-67db-4c9c-9285-5d72a87a4b59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.203 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.244 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.278 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost podman[324430]:
Nov 23 05:02:21 localhost podman[324430]: 2025-11-23 10:02:21.368556822 +0000 UTC m=+0.084749701 container create 13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6957ce07-4d9c-4d1d-a573-5f6518e6601d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:02:21 localhost systemd[1]: Started libpod-conmon-13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9.scope.
Nov 23 05:02:21 localhost systemd[1]: Started libcrun container.
Nov 23 05:02:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/520562ffe34534266c987bb9177288c00e8eaa7bcdaa0866bacb0e9fffd889e3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 05:02:21 localhost podman[324430]: 2025-11-23 10:02:21.430866921 +0000 UTC m=+0.147059790 container init 13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6957ce07-4d9c-4d1d-a573-5f6518e6601d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 05:02:21 localhost podman[324430]: 2025-11-23 10:02:21.332339826 +0000 UTC m=+0.048532675 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 05:02:21 localhost dnsmasq[324458]: started, version 2.85 cachesize 150
Nov 23 05:02:21 localhost dnsmasq[324458]: DNS service limited to local subnets
Nov 23 05:02:21 localhost dnsmasq[324458]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 05:02:21 localhost dnsmasq[324458]: warning: no upstream servers configured
Nov 23 05:02:21 localhost dnsmasq[324458]: read /var/lib/neutron/dhcp/6957ce07-4d9c-4d1d-a573-5f6518e6601d/addn_hosts - 0 addresses
Nov 23 05:02:21 localhost podman[324430]: 2025-11-23 10:02:21.445284356 +0000 UTC m=+0.161477225 container start 13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6957ce07-4d9c-4d1d-a573-5f6518e6601d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 23 05:02:21 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:21.476 2 INFO neutron.agent.securitygroups_rpc [None req-2c014fe0-b7cd-43b0-aa6e-452db82dfd05 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 05:02:21 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 05:02:21 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4263813190' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 23 05:02:21 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:21.522 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.537 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.545 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.561 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.564 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.567 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.567 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.554s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 05:02:21 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:21.577 263258 INFO neutron.agent.dhcp.agent [None req-66ce061d-76bb-48b7-affd-be9a854de33f - - - - - -] DHCP configuration for ports {'f252ff88-57dd-44b0-956a-6448a32b09e9'} is completed
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.600 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:21.613 263258 INFO neutron.agent.linux.ip_lib [None req-ef2c9ea9-e2d8-4157-8991-9b32ede258bf - - - - - -] Device tapf463970f-33 cannot be used as it has no MAC address
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.644 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost kernel: device tapf463970f-33 entered promiscuous mode
Nov 23 05:02:21 localhost NetworkManager[5975]: [1763892141.6527] manager: (tapf463970f-33): new Generic device (/org/freedesktop/NetworkManager/Devices/59)
Nov 23 05:02:21 localhost ovn_controller[154788]: 2025-11-23T10:02:21Z|00349|binding|INFO|Claiming lport f463970f-3381-45b3-96e6-35969693be91 for this chassis.
Nov 23 05:02:21 localhost ovn_controller[154788]: 2025-11-23T10:02:21Z|00350|binding|INFO|f463970f-3381-45b3-96e6-35969693be91: Claiming unknown
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.660 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:21.667 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::1/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-e77a4286-3801-4220-989b-d56ef685e3b6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e77a4286-3801-4220-989b-d56ef685e3b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3bcc515473444ea195be635c77c65d0f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11a468ee-c41b-44bf-983d-07802e5afd68, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f463970f-3381-45b3-96e6-35969693be91) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 05:02:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:21.670 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f463970f-3381-45b3-96e6-35969693be91 in datapath e77a4286-3801-4220-989b-d56ef685e3b6 bound to our chassis
Nov 23 05:02:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:21.672 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e77a4286-3801-4220-989b-d56ef685e3b6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 05:02:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:21.673 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f5a266-f793-471f-8399-ec1a246e7fe2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.689 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost ovn_controller[154788]: 2025-11-23T10:02:21Z|00351|binding|INFO|Setting lport f463970f-3381-45b3-96e6-35969693be91 ovn-installed in OVS
Nov 23 05:02:21 localhost ovn_controller[154788]: 2025-11-23T10:02:21Z|00352|binding|INFO|Setting lport f463970f-3381-45b3-96e6-35969693be91 up in Southbound
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.694 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.697 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.740 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:21 localhost dnsmasq[324458]: exiting on receipt of SIGTERM
Nov 23 05:02:21 localhost podman[324498]: 2025-11-23 10:02:21.750380824 +0000 UTC m=+0.059427031 container kill 13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6957ce07-4d9c-4d1d-a573-5f6518e6601d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 05:02:21 localhost systemd[1]: libpod-13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9.scope: Deactivated successfully.
Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.766 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:21 localhost podman[324519]: 2025-11-23 10:02:21.804596355 +0000 UTC m=+0.041379166 container died 13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6957ce07-4d9c-4d1d-a573-5f6518e6601d, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 05:02:21 localhost podman[324519]: 2025-11-23 10:02:21.825823428 +0000 UTC m=+0.062606219 container cleanup 13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6957ce07-4d9c-4d1d-a573-5f6518e6601d, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 05:02:21 localhost systemd[1]: libpod-conmon-13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9.scope: Deactivated successfully. 
Nov 23 05:02:21 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:21.896 2 INFO neutron.agent.securitygroups_rpc [None req-7716ba44-8c0f-4db3-b436-de264dd9940d 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:02:21 localhost podman[324521]: 2025-11-23 10:02:21.904230824 +0000 UTC m=+0.128361975 container remove 13f752cfb641d6b605222415b721540232a8c3d9ec729910f7d8e78d2c4a6ec9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6957ce07-4d9c-4d1d-a573-5f6518e6601d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:02:21 localhost ovn_controller[154788]: 2025-11-23T10:02:21Z|00353|binding|INFO|Releasing lport b261f9c2-83e7-4a95-9c5e-261d26463cae from this chassis (sb_readonly=0) Nov 23 05:02:21 localhost ovn_controller[154788]: 2025-11-23T10:02:21Z|00354|binding|INFO|Setting lport b261f9c2-83e7-4a95-9c5e-261d26463cae down in Southbound Nov 23 05:02:21 localhost kernel: device tapb261f9c2-83 left promiscuous mode Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.914 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:21 localhost systemd[1]: var-lib-containers-storage-overlay-e33c56067c23739cb4aa49806f9afa9d048c85db65aa7d7ecb65ed0e58e4d8d9-merged.mount: Deactivated successfully. Nov 23 05:02:21 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a02ed1d514645e5946fb1e55449705a404bc51e494bacee014ca286cc5a9b4bc-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:02:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:21.924 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe6b:a1ef/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6957ce07-4d9c-4d1d-a573-5f6518e6601d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6957ce07-4d9c-4d1d-a573-5f6518e6601d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '23ffb5a89d5d4d8a8900ea750309030f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a73b2c2-bad6-4a3d-a15f-3ef982ba513d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b261f9c2-83e7-4a95-9c5e-261d26463cae) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:21.928 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b261f9c2-83e7-4a95-9c5e-261d26463cae in datapath 6957ce07-4d9c-4d1d-a573-5f6518e6601d unbound from our chassis#033[00m Nov 23 05:02:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:21.929 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6957ce07-4d9c-4d1d-a573-5f6518e6601d or it has no MAC or IP addresses configured, tearing the namespace down if needed 
_get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:02:21 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:21.930 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[17d395e2-30e0-405e-870f-2cea24bddd0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:21 localhost nova_compute[281952]: 2025-11-23 10:02:21.933 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:22 localhost podman[324574]: Nov 23 05:02:22 localhost podman[324574]: 2025-11-23 10:02:22.098568391 +0000 UTC m=+0.091572622 container create 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:02:22 localhost systemd[1]: Started libpod-conmon-9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624.scope. Nov 23 05:02:22 localhost podman[324574]: 2025-11-23 10:02:22.045989601 +0000 UTC m=+0.038993852 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:22 localhost systemd[1]: Started libcrun container. 
Nov 23 05:02:22 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/600a00c93f290b4d32413ef130a9b7a40a8e3f96000c136065f555c66a3949e5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:22 localhost podman[324574]: 2025-11-23 10:02:22.174738567 +0000 UTC m=+0.167742798 container init 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 05:02:22 localhost podman[324574]: 2025-11-23 10:02:22.183850858 +0000 UTC m=+0.176855079 container start 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 05:02:22 localhost dnsmasq[324603]: started, version 2.85 cachesize 150 Nov 23 05:02:22 localhost dnsmasq[324603]: DNS service limited to local subnets Nov 23 05:02:22 localhost dnsmasq[324603]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:22 localhost dnsmasq[324603]: warning: no upstream servers 
configured Nov 23 05:02:22 localhost dnsmasq-dhcp[324603]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:22 localhost dnsmasq[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:02:22 localhost dnsmasq-dhcp[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:02:22 localhost dnsmasq-dhcp[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:02:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.189 263258 INFO neutron.agent.dhcp.agent [None req-6d34abed-43fb-4b42-9a1c-904f2cac2ceb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.190 263258 INFO neutron.agent.dhcp.agent [None req-6d34abed-43fb-4b42-9a1c-904f2cac2ceb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.191 263258 INFO neutron.agent.dhcp.agent [None req-6d34abed-43fb-4b42-9a1c-904f2cac2ceb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.234 263258 INFO neutron.agent.dhcp.agent [None req-5de1e31a-d4a1-4578-b4dd-c1804f8c816f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:21Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=cd81cb44-73bf-4b94-82f0-6b35c126ee07, ip_allocation=immediate, mac_address=fa:16:3e:aa:26:53, name=tempest-NetworksTestDHCPv6-1288803817, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, 
description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=57, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['2fe6c6ad-ce38-42e7-bef0-0b0aa5fbe80c', '8eea5134-173e-4854-98d8-93b4b79ba9cd'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:17Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1866, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:21Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237#033[00m Nov 23 05:02:22 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:22.262 2 INFO neutron.agent.securitygroups_rpc [None req-2377c616-101f-418f-a8e9-4c617ed1b658 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.411 263258 INFO neutron.agent.dhcp.agent [None req-2a64f720-2fc6-49b7-a426-43cdb09c966f - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5', 'd4981345-81c9-4678-a4d9-29762c427058'} is completed#033[00m Nov 23 05:02:22 localhost dnsmasq[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses Nov 23 05:02:22 localhost podman[324626]: 2025-11-23 10:02:22.421489298 +0000 UTC m=+0.064609771 
container kill 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:22 localhost dnsmasq-dhcp[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:02:22 localhost dnsmasq-dhcp[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:02:22 localhost nova_compute[281952]: 2025-11-23 10:02:22.568 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:02:22 localhost podman[324666]: Nov 23 05:02:22 localhost podman[324666]: 2025-11-23 10:02:22.634407377 +0000 UTC m=+0.073193325 container create e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e77a4286-3801-4220-989b-d56ef685e3b6, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118) Nov 23 05:02:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.652 263258 INFO neutron.agent.dhcp.agent [None req-c9769175-7ee6-4cb3-b783-ef98c873929e - - - - - -] 
DHCP configuration for ports {'cd81cb44-73bf-4b94-82f0-6b35c126ee07'} is completed#033[00m Nov 23 05:02:22 localhost ovn_controller[154788]: 2025-11-23T10:02:22Z|00355|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:02:22 localhost systemd[1]: Started libpod-conmon-e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3.scope. Nov 23 05:02:22 localhost systemd[1]: Started libcrun container. Nov 23 05:02:22 localhost nova_compute[281952]: 2025-11-23 10:02:22.692 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:22 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27046c8dc740f22d06124834ed035857bd6d551c21ad129433b752c7a3c7ddc6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:22 localhost podman[324666]: 2025-11-23 10:02:22.603439353 +0000 UTC m=+0.042225281 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:22 localhost podman[324666]: 2025-11-23 10:02:22.705715063 +0000 UTC m=+0.144501001 container init e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e77a4286-3801-4220-989b-d56ef685e3b6, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 23 05:02:22 localhost podman[324666]: 2025-11-23 10:02:22.713971898 +0000 UTC m=+0.152757836 container start e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e77a4286-3801-4220-989b-d56ef685e3b6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 23 05:02:22 localhost dnsmasq[324687]: started, version 2.85 cachesize 150 Nov 23 05:02:22 localhost dnsmasq[324687]: DNS service limited to local subnets Nov 23 05:02:22 localhost dnsmasq[324687]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:22 localhost dnsmasq[324687]: warning: no upstream servers configured Nov 23 05:02:22 localhost dnsmasq-dhcp[324687]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 23 05:02:22 localhost dnsmasq[324687]: read /var/lib/neutron/dhcp/e77a4286-3801-4220-989b-d56ef685e3b6/addn_hosts - 0 addresses Nov 23 05:02:22 localhost dnsmasq-dhcp[324687]: read /var/lib/neutron/dhcp/e77a4286-3801-4220-989b-d56ef685e3b6/host Nov 23 05:02:22 localhost dnsmasq-dhcp[324687]: read /var/lib/neutron/dhcp/e77a4286-3801-4220-989b-d56ef685e3b6/opts Nov 23 05:02:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.789 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:20Z, description=, device_id=6c96453a-c777-4431-b133-5c4197796c3e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ef47ee52-a1a9-471d-b433-3cfaa989690c, ip_allocation=immediate, 
mac_address=fa:16:3e:a6:dd:58, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:02:14Z, description=, dns_domain=, id=31b977a7-a37c-42ba-bed9-7b22959f6310, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesSnapshotTestJSON-1570024778-network, port_security_enabled=True, project_id=10f8dd7c838246c58f1d2c4efc771237, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10715, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1812, status=ACTIVE, subnets=['ef80155c-ac2e-4709-9882-d7bc38017108'], tags=[], tenant_id=10f8dd7c838246c58f1d2c4efc771237, updated_at=2025-11-23T10:02:15Z, vlan_transparent=None, network_id=31b977a7-a37c-42ba-bed9-7b22959f6310, port_security_enabled=False, project_id=10f8dd7c838246c58f1d2c4efc771237, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1857, status=DOWN, tags=[], tenant_id=10f8dd7c838246c58f1d2c4efc771237, updated_at=2025-11-23T10:02:20Z on network 31b977a7-a37c-42ba-bed9-7b22959f6310#033[00m Nov 23 05:02:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.811 263258 INFO neutron.agent.dhcp.agent [None req-d56509cd-7d81-4efc-9b74-f76c5145f3ee - - - - - -] DHCP configuration for ports {'87b1e486-66c6-450b-a88a-205b42e4c756'} is completed#033[00m Nov 23 05:02:22 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:22.852 2 INFO neutron.agent.securitygroups_rpc [None req-5aab4fba-63d5-4959-8f5e-411b7878d60b 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:22.910 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 
05:02:22 localhost systemd[1]: run-netns-qdhcp\x2d6957ce07\x2d4d9c\x2d4d1d\x2da573\x2d5f6518e6601d.mount: Deactivated successfully. Nov 23 05:02:22 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:22.951 2 INFO neutron.agent.securitygroups_rpc [None req-7111b2ef-5dc8-48f5-9c8c-9b4c6e4004a1 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']#033[00m Nov 23 05:02:23 localhost podman[324719]: 2025-11-23 10:02:23.074220446 +0000 UTC m=+0.070541394 container kill c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:02:23 localhost dnsmasq[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/addn_hosts - 1 addresses Nov 23 05:02:23 localhost dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/host Nov 23 05:02:23 localhost dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/opts Nov 23 05:02:23 localhost podman[324730]: 2025-11-23 10:02:23.092655774 +0000 UTC m=+0.061695871 container kill e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e77a4286-3801-4220-989b-d56ef685e3b6, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:02:23 localhost dnsmasq[324687]: exiting on receipt of SIGTERM Nov 23 05:02:23 localhost systemd[1]: libpod-e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3.scope: Deactivated successfully. Nov 23 05:02:23 localhost podman[324762]: 2025-11-23 10:02:23.172572776 +0000 UTC m=+0.056913894 container died e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e77a4286-3801-4220-989b-d56ef685e3b6, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:23 localhost podman[324762]: 2025-11-23 10:02:23.216614223 +0000 UTC m=+0.100955261 container remove e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e77a4286-3801-4220-989b-d56ef685e3b6, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:23 localhost nova_compute[281952]: 2025-11-23 10:02:23.230 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:23 localhost kernel: device tapf463970f-33 left promiscuous 
mode Nov 23 05:02:23 localhost ovn_controller[154788]: 2025-11-23T10:02:23Z|00356|binding|INFO|Releasing lport f463970f-3381-45b3-96e6-35969693be91 from this chassis (sb_readonly=0) Nov 23 05:02:23 localhost ovn_controller[154788]: 2025-11-23T10:02:23Z|00357|binding|INFO|Setting lport f463970f-3381-45b3-96e6-35969693be91 down in Southbound Nov 23 05:02:23 localhost dnsmasq[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:02:23 localhost dnsmasq-dhcp[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:02:23 localhost dnsmasq-dhcp[324603]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:02:23 localhost podman[324787]: 2025-11-23 10:02:23.237371522 +0000 UTC m=+0.069787761 container kill 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:23 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:23.238 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::1/64', 'neutron:device_id': 
'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-e77a4286-3801-4220-989b-d56ef685e3b6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e77a4286-3801-4220-989b-d56ef685e3b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3bcc515473444ea195be635c77c65d0f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11a468ee-c41b-44bf-983d-07802e5afd68, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f463970f-3381-45b3-96e6-35969693be91) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:23 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:23.240 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f463970f-3381-45b3-96e6-35969693be91 in datapath e77a4286-3801-4220-989b-d56ef685e3b6 unbound from our chassis#033[00m Nov 23 05:02:23 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:23.240 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e77a4286-3801-4220-989b-d56ef685e3b6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:02:23 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:23.241 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[31f1a420-1643-4270-ac13-ff327bc306bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:23 localhost nova_compute[281952]: 2025-11-23 10:02:23.247 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:23 
localhost systemd[1]: libpod-conmon-e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3.scope: Deactivated successfully. Nov 23 05:02:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:02:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:23.410 263258 INFO neutron.agent.dhcp.agent [None req-1eec1afe-c435-4703-85b6-9fd8f2098226 - - - - - -] DHCP configuration for ports {'ef47ee52-a1a9-471d-b433-3cfaa989690c'} is completed#033[00m Nov 23 05:02:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:23.458 263258 INFO neutron.agent.dhcp.agent [None req-5cf4e7f9-7943-4d45-aa17-9727e51f6d3b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:23.459 263258 INFO neutron.agent.dhcp.agent [None req-5cf4e7f9-7943-4d45-aa17-9727e51f6d3b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:23 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:23.606 2 INFO neutron.agent.securitygroups_rpc [None req-192c090e-5c5a-4cbc-acb5-d1edd0c7e4bb 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:23 localhost systemd[1]: var-lib-containers-storage-overlay-27046c8dc740f22d06124834ed035857bd6d551c21ad129433b752c7a3c7ddc6-merged.mount: Deactivated successfully. Nov 23 05:02:23 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8b866f45361e673e86eb990935960557df6e2e390139f88d7cdbb7a68d726a3-userdata-shm.mount: Deactivated successfully. Nov 23 05:02:23 localhost systemd[1]: run-netns-qdhcp\x2de77a4286\x2d3801\x2d4220\x2d989b\x2dd56ef685e3b6.mount: Deactivated successfully. 
Nov 23 05:02:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:23.975 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 05:02:24 localhost ovn_controller[154788]: 2025-11-23T10:02:24Z|00358|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 05:02:24 localhost nova_compute[281952]: 2025-11-23 10:02:24.151 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:24 localhost dnsmasq[324603]: exiting on receipt of SIGTERM
Nov 23 05:02:24 localhost podman[324834]: 2025-11-23 10:02:24.2030589 +0000 UTC m=+0.058020068 container kill 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 05:02:24 localhost systemd[1]: libpod-9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624.scope: Deactivated successfully.
Nov 23 05:02:24 localhost podman[324846]: 2025-11-23 10:02:24.274506131 +0000 UTC m=+0.059743711 container died 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 23 05:02:24 localhost systemd[1]: tmp-crun.AWtw87.mount: Deactivated successfully.
Nov 23 05:02:24 localhost podman[324846]: 2025-11-23 10:02:24.315292948 +0000 UTC m=+0.100530488 container cleanup 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 05:02:24 localhost systemd[1]: libpod-conmon-9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624.scope: Deactivated successfully.
Nov 23 05:02:24 localhost podman[324851]: 2025-11-23 10:02:24.350946306 +0000 UTC m=+0.125813866 container remove 9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 05:02:24 localhost systemd[1]: var-lib-containers-storage-overlay-600a00c93f290b4d32413ef130a9b7a40a8e3f96000c136065f555c66a3949e5-merged.mount: Deactivated successfully.
Nov 23 05:02:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9dc4e156f4d95b9d7f202f40a6a870d6edb983d4487a31030be3a9ac94a35624-userdata-shm.mount: Deactivated successfully.
Nov 23 05:02:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e120 e120: 6 total, 6 up, 6 in
Nov 23 05:02:25 localhost podman[324927]:
Nov 23 05:02:25 localhost podman[324927]: 2025-11-23 10:02:25.239509859 +0000 UTC m=+0.101288381 container create b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 05:02:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.
Nov 23 05:02:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.
Nov 23 05:02:25 localhost systemd[1]: Started libpod-conmon-b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d.scope.
Nov 23 05:02:25 localhost systemd[1]: Started libcrun container.
Nov 23 05:02:25 localhost podman[324927]: 2025-11-23 10:02:25.192835461 +0000 UTC m=+0.054614023 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 05:02:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8614f07940d831ba536275ae2fdfc52851f2aedbb4cf8d25398005eb1e57659f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 05:02:25 localhost podman[324927]: 2025-11-23 10:02:25.300589021 +0000 UTC m=+0.162367563 container init b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:02:25 localhost podman[324927]: 2025-11-23 10:02:25.310154515 +0000 UTC m=+0.171933067 container start b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 05:02:25 localhost dnsmasq[324964]: started, version 2.85 cachesize 150
Nov 23 05:02:25 localhost dnsmasq[324964]: DNS service limited to local subnets
Nov 23 05:02:25 localhost dnsmasq[324964]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 05:02:25 localhost dnsmasq[324964]: warning: no upstream servers configured
Nov 23 05:02:25 localhost dnsmasq[324964]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 05:02:25 localhost podman[324942]: 2025-11-23 10:02:25.353962925 +0000 UTC m=+0.075291451 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 23 05:02:25 localhost podman[324942]: 2025-11-23 10:02:25.361460705 +0000 UTC m=+0.082789231 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 05:02:25 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 05:02:25 localhost podman[324943]: 2025-11-23 10:02:25.404730869 +0000 UTC m=+0.120318678 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 23 05:02:25 localhost podman[324943]: 2025-11-23 10:02:25.4119381 +0000 UTC m=+0.127525929 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Nov 23 05:02:25 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 05:02:25 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:25.606 263258 INFO neutron.agent.dhcp.agent [None req-90741188-f0e1-4c52-ba7f-d1d78c99dd0a - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5', 'd4981345-81c9-4678-a4d9-29762c427058'} is completed
Nov 23 05:02:25 localhost dnsmasq[324964]: exiting on receipt of SIGTERM
Nov 23 05:02:25 localhost podman[325001]: 2025-11-23 10:02:25.709984092 +0000 UTC m=+0.063536179 container kill b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 05:02:25 localhost systemd[1]: libpod-b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d.scope: Deactivated successfully.
Nov 23 05:02:25 localhost podman[325014]: 2025-11-23 10:02:25.779684769 +0000 UTC m=+0.054799089 container died b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 23 05:02:25 localhost podman[325014]: 2025-11-23 10:02:25.810845999 +0000 UTC m=+0.085960279 container cleanup b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 05:02:25 localhost systemd[1]: libpod-conmon-b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d.scope: Deactivated successfully.
Nov 23 05:02:25 localhost podman[325015]: 2025-11-23 10:02:25.866155883 +0000 UTC m=+0.132361139 container remove b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 23 05:02:25 localhost systemd[1]: var-lib-containers-storage-overlay-8614f07940d831ba536275ae2fdfc52851f2aedbb4cf8d25398005eb1e57659f-merged.mount: Deactivated successfully.
Nov 23 05:02:25 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7f48189aefc4525956b1fdccc51e4a8f1f1e0504a54c53413e5573356fcf77d-userdata-shm.mount: Deactivated successfully.
Nov 23 05:02:26 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e121 e121: 6 total, 6 up, 6 in
Nov 23 05:02:26 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:26.346 2 INFO neutron.agent.securitygroups_rpc [None req-d166c481-1e29-4957-bae4-8d46f816d4e6 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 05:02:26 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:26.381 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 05:02:26 localhost nova_compute[281952]: 2025-11-23 10:02:26.564 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:26 localhost nova_compute[281952]: 2025-11-23 10:02:26.602 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e122 e122: 6 total, 6 up, 6 in
Nov 23 05:02:27 localhost podman[325094]:
Nov 23 05:02:27 localhost podman[325094]: 2025-11-23 10:02:27.530768053 +0000 UTC m=+0.088808017 container create 0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 23 05:02:27 localhost systemd[1]: Started libpod-conmon-0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd.scope.
Nov 23 05:02:27 localhost systemd[1]: tmp-crun.fQpSvg.mount: Deactivated successfully.
Nov 23 05:02:27 localhost podman[325094]: 2025-11-23 10:02:27.487259632 +0000 UTC m=+0.045299636 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 05:02:27 localhost systemd[1]: Started libcrun container.
Nov 23 05:02:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bf7835d0e90b38389a369dac93f71b03b61e893f5c0cbb4569c76a06c953709/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 05:02:27 localhost podman[325094]: 2025-11-23 10:02:27.615671488 +0000 UTC m=+0.173711452 container init 0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 23 05:02:27 localhost podman[325094]: 2025-11-23 10:02:27.625749819 +0000 UTC m=+0.183789783 container start 0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118)
Nov 23 05:02:27 localhost dnsmasq[325112]: started, version 2.85 cachesize 150
Nov 23 05:02:27 localhost dnsmasq[325112]: DNS service limited to local subnets
Nov 23 05:02:27 localhost dnsmasq[325112]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 05:02:27 localhost dnsmasq[325112]: warning: no upstream servers configured
Nov 23 05:02:27 localhost dnsmasq-dhcp[325112]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 05:02:27 localhost dnsmasq[325112]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 05:02:27 localhost dnsmasq-dhcp[325112]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 05:02:27 localhost dnsmasq-dhcp[325112]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 05:02:27 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:27.868 263258 INFO neutron.agent.dhcp.agent [None req-b1486ae4-3c8b-4d00-a2a4-94af2405fc70 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5', 'd4981345-81c9-4678-a4d9-29762c427058'} is completed
Nov 23 05:02:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e123 e123: 6 total, 6 up, 6 in
Nov 23 05:02:27 localhost dnsmasq[325112]: exiting on receipt of SIGTERM
Nov 23 05:02:27 localhost podman[325130]: 2025-11-23 10:02:27.992379543 +0000 UTC m=+0.064715556 container kill 0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 05:02:27 localhost systemd[1]: libpod-0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd.scope: Deactivated successfully.
Nov 23 05:02:28 localhost podman[325144]: 2025-11-23 10:02:28.067672782 +0000 UTC m=+0.058959628 container died 0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 05:02:28 localhost podman[325144]: 2025-11-23 10:02:28.098635146 +0000 UTC m=+0.089921942 container cleanup 0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 05:02:28 localhost systemd[1]: libpod-conmon-0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd.scope: Deactivated successfully.
Nov 23 05:02:28 localhost podman[325145]: 2025-11-23 10:02:28.137216494 +0000 UTC m=+0.123856006 container remove 0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:02:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 05:02:28 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:28.418 2 INFO neutron.agent.securitygroups_rpc [None req-accb9679-f798-499a-bac7-ef6b44f5ac25 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 05:02:28 localhost systemd[1]: var-lib-containers-storage-overlay-0bf7835d0e90b38389a369dac93f71b03b61e893f5c0cbb4569c76a06c953709-merged.mount: Deactivated successfully.
Nov 23 05:02:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ac37d25dd1df1f0b0a63d7e124c6126b6a242ecdf51dbb37e62ff18296178fd-userdata-shm.mount: Deactivated successfully.
Nov 23 05:02:28 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:28.920 2 INFO neutron.agent.securitygroups_rpc [None req-cba9b202-7a98-4e4d-911c-f57572c47e81 173936a8ad3f4f56a3e8a901d82c6886 2a12890982534f9f8dfb103ad294ca1f - - default default] Security group member updated ['23b6f795-fce0-46ba-a7cc-e8d055195822']
Nov 23 05:02:29 localhost podman[325223]:
Nov 23 05:02:29 localhost podman[325223]: 2025-11-23 10:02:29.045603808 +0000 UTC m=+0.092910574 container create 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 05:02:29 localhost systemd[1]: Started libpod-conmon-603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd.scope.
Nov 23 05:02:29 localhost systemd[1]: Started libcrun container.
Nov 23 05:02:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/934bd1a7d08cf6a4a8ef94b1e4c9333eeb001c1a039964b7962f004e9e6810f4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 05:02:29 localhost podman[325223]: 2025-11-23 10:02:29.002801089 +0000 UTC m=+0.050107885 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 05:02:29 localhost podman[325223]: 2025-11-23 10:02:29.106696149 +0000 UTC m=+0.154002925 container init 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 05:02:29 localhost podman[325223]: 2025-11-23 10:02:29.113051546 +0000 UTC m=+0.160358312 container start 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 05:02:29 localhost dnsmasq[325241]: started, version 2.85 cachesize 150
Nov 23 05:02:29 localhost dnsmasq[325241]: DNS service limited to local subnets
Nov 23 05:02:29 localhost dnsmasq[325241]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 05:02:29 localhost dnsmasq[325241]: warning: no upstream servers configured
Nov 23 05:02:29 localhost dnsmasq-dhcp[325241]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 05:02:29 localhost dnsmasq-dhcp[325241]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 23 05:02:29 localhost dnsmasq[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 05:02:29 localhost dnsmasq-dhcp[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 05:02:29 localhost dnsmasq-dhcp[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 05:02:29 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:29.168 263258 INFO neutron.agent.dhcp.agent [None req-fd203e9b-1f2c-470a-be85-24176465be87 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:27Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=a72e710a-e848-4366-84b4-63006f20c66f, ip_allocation=immediate, mac_address=fa:16:3e:d7:d9:99, name=tempest-NetworksTestDHCPv6-353818491, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T09:59:45Z, description=, dns_domain=, id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1593059932, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37403, qos_policy_id=None, revision_number=61, router:external=False, shared=False, standard_attr_id=1104, status=ACTIVE, subnets=['274eb987-216a-4bde-ad9a-f70582643a22', 'b16500ec-aaf4-434d-abb8-bd690dfe4c20'], tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:26Z, vlan_transparent=None, network_id=6db04e65-3a65-4ecd-9d7c-1b518aa8c237, port_security_enabled=True, project_id=2a12890982534f9f8dfb103ad294ca1f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['23b6f795-fce0-46ba-a7cc-e8d055195822'], standard_attr_id=1900, status=DOWN, tags=[], tenant_id=2a12890982534f9f8dfb103ad294ca1f, updated_at=2025-11-23T10:02:28Z on network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237
Nov 23 05:02:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e124 e124: 6 total, 6 up, 6 in
Nov 23 05:02:29 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:29.300 263258 INFO neutron.agent.dhcp.agent [None req-5c2b92a6-e799-4223-a264-0022711823cb - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5', 'd4981345-81c9-4678-a4d9-29762c427058'} is completed
Nov 23 05:02:29 localhost dnsmasq[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 2 addresses
Nov 23 05:02:29 localhost dnsmasq-dhcp[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 05:02:29 localhost dnsmasq-dhcp[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 05:02:29 localhost podman[325261]: 2025-11-23 10:02:29.3680232 +0000 UTC m=+0.059963689 container kill 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 05:02:29 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:29.621 263258 INFO neutron.agent.dhcp.agent [None req-54d3920b-5b80-4605-9c8d-480703b7e810 - - - - - -] DHCP configuration for ports {'a72e710a-e848-4366-84b4-63006f20c66f'} is completed
Nov 23 05:02:29 localhost dnsmasq[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses
Nov 23 05:02:29 localhost dnsmasq-dhcp[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host
Nov 23 05:02:29 localhost podman[325300]: 2025-11-23 10:02:29.707309752 +0000 UTC m=+0.054695056 container kill 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 05:02:29 localhost dnsmasq-dhcp[325241]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts
Nov 23 05:02:29 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:29.891 2 INFO neutron.agent.securitygroups_rpc [None req-71451af8-2b84-4ee8-885e-297c3854d4d1 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 05:02:29 localhost openstack_network_exporter[242668]: ERROR 10:02:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 23 05:02:29 localhost openstack_network_exporter[242668]: ERROR 10:02:29
appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:02:29 localhost openstack_network_exporter[242668]: ERROR 10:02:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:02:29 localhost openstack_network_exporter[242668]: ERROR 10:02:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:02:29 localhost openstack_network_exporter[242668]: Nov 23 05:02:29 localhost openstack_network_exporter[242668]: ERROR 10:02:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:02:29 localhost openstack_network_exporter[242668]: Nov 23 05:02:30 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e125 e125: 6 total, 6 up, 6 in Nov 23 05:02:30 localhost systemd[1]: tmp-crun.05dwbq.mount: Deactivated successfully. Nov 23 05:02:30 localhost dnsmasq[325241]: exiting on receipt of SIGTERM Nov 23 05:02:30 localhost podman[325337]: 2025-11-23 10:02:30.314962181 +0000 UTC m=+0.072744232 container kill 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:02:30 localhost systemd[1]: libpod-603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd.scope: Deactivated successfully. 
Nov 23 05:02:30 localhost podman[325358]: 2025-11-23 10:02:30.399524656 +0000 UTC m=+0.061784125 container died 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:02:30 localhost podman[325358]: 2025-11-23 10:02:30.490367354 +0000 UTC m=+0.152626783 container remove 603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:02:30 localhost systemd[1]: libpod-conmon-603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd.scope: Deactivated successfully. Nov 23 05:02:30 localhost systemd[1]: var-lib-containers-storage-overlay-934bd1a7d08cf6a4a8ef94b1e4c9333eeb001c1a039964b7962f004e9e6810f4-merged.mount: Deactivated successfully. Nov 23 05:02:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-603f4f502e6393a4aaa99441e6ab8846d2eb15dbb009979ea4157454e49963dd-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:02:30 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:30.790 2 INFO neutron.agent.securitygroups_rpc [None req-1ba66ce5-b42f-4fad-9876-b27f14456f6a 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:31 localhost podman[325430]: Nov 23 05:02:31 localhost podman[325430]: 2025-11-23 10:02:31.389958776 +0000 UTC m=+0.073652790 container create aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 23 05:02:31 localhost systemd[1]: Started libpod-conmon-aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d.scope. Nov 23 05:02:31 localhost systemd[1]: tmp-crun.0nAWDn.mount: Deactivated successfully. Nov 23 05:02:31 localhost systemd[1]: Started libcrun container. 
Nov 23 05:02:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd17c590309b55ead84c30c1c43935a15235ea5459b9de3ba9300a684a6178a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:31 localhost podman[325430]: 2025-11-23 10:02:31.455406183 +0000 UTC m=+0.139100197 container init aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 23 05:02:31 localhost podman[325430]: 2025-11-23 10:02:31.361352565 +0000 UTC m=+0.045046569 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:31 localhost podman[325430]: 2025-11-23 10:02:31.463834732 +0000 UTC m=+0.147528746 container start aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 05:02:31 localhost dnsmasq[325448]: started, version 2.85 cachesize 150 Nov 23 05:02:31 localhost dnsmasq[325448]: DNS service limited to local subnets Nov 23 05:02:31 localhost dnsmasq[325448]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:31 localhost dnsmasq[325448]: warning: no upstream servers configured Nov 23 05:02:31 localhost dnsmasq-dhcp[325448]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:31 localhost dnsmasq[325448]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/addn_hosts - 0 addresses Nov 23 05:02:31 localhost dnsmasq-dhcp[325448]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/host Nov 23 05:02:31 localhost dnsmasq-dhcp[325448]: read /var/lib/neutron/dhcp/6db04e65-3a65-4ecd-9d7c-1b518aa8c237/opts Nov 23 05:02:31 localhost nova_compute[281952]: 2025-11-23 10:02:31.568 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:31 localhost nova_compute[281952]: 2025-11-23 10:02:31.604 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:31 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:31.691 263258 INFO neutron.agent.dhcp.agent [None req-05320843-34ef-410e-9f64-81cc22df8da5 - - - - - -] DHCP configuration for ports {'e41a192c-bec5-4e3b-8388-8af6ab7114b5', 'd4981345-81c9-4678-a4d9-29762c427058'} is completed#033[00m Nov 23 05:02:31 localhost dnsmasq[325448]: exiting on receipt of SIGTERM Nov 23 05:02:31 localhost podman[325466]: 2025-11-23 10:02:31.789528285 +0000 UTC m=+0.037821415 container kill aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:31 localhost systemd[1]: libpod-aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d.scope: Deactivated successfully. Nov 23 05:02:31 localhost podman[325482]: 2025-11-23 10:02:31.842056374 +0000 UTC m=+0.038695944 container died aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:02:31 localhost systemd[1]: tmp-crun.ZQFgup.mount: Deactivated successfully. 
Nov 23 05:02:31 localhost podman[325482]: 2025-11-23 10:02:31.909403848 +0000 UTC m=+0.106043388 container remove aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6db04e65-3a65-4ecd-9d7c-1b518aa8c237, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:31 localhost ovn_controller[154788]: 2025-11-23T10:02:31Z|00359|binding|INFO|Releasing lport d4981345-81c9-4678-a4d9-29762c427058 from this chassis (sb_readonly=0) Nov 23 05:02:31 localhost kernel: device tapd4981345-81 left promiscuous mode Nov 23 05:02:31 localhost ovn_controller[154788]: 2025-11-23T10:02:31Z|00360|binding|INFO|Setting lport d4981345-81c9-4678-a4d9-29762c427058 down in Southbound Nov 23 05:02:31 localhost nova_compute[281952]: 2025-11-23 10:02:31.917 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:31 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:31.926 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fee0:b47d/64 2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:device_owner': 
'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6db04e65-3a65-4ecd-9d7c-1b518aa8c237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a12890982534f9f8dfb103ad294ca1f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cb3a0b8-b546-4270-a81e-4e29b4aaeadf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d4981345-81c9-4678-a4d9-29762c427058) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:31 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:31.927 160439 INFO neutron.agent.ovn.metadata.agent [-] Port d4981345-81c9-4678-a4d9-29762c427058 in datapath 6db04e65-3a65-4ecd-9d7c-1b518aa8c237 unbound from our chassis#033[00m Nov 23 05:02:31 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:31.929 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6db04e65-3a65-4ecd-9d7c-1b518aa8c237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:02:31 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:31.930 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[d56f822f-84bc-42b3-ad54-b0fc38cb3b96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:31 localhost nova_compute[281952]: 2025-11-23 10:02:31.937 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:31 localhost systemd[1]: 
libpod-conmon-aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d.scope: Deactivated successfully. Nov 23 05:02:32 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:32.007 2 INFO neutron.agent.securitygroups_rpc [None req-fe8c6fcc-6e02-46f0-a4be-4363defcaae3 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:32 localhost sshd[325509]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:02:32 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:32.114 2 INFO neutron.agent.securitygroups_rpc [None req-5d5ccae7-720c-44b0-bb02-9faf01da722a 268d02d4288b4ff3a9dab77419bcf96a 23ffb5a89d5d4d8a8900ea750309030f - - default default] Security group member updated ['8eb14703-b106-4f91-b864-8b16a806bee3']#033[00m Nov 23 05:02:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:32.255 263258 INFO neutron.agent.dhcp.agent [None req-e55fbbd9-0b6f-41b7-952b-ca167e6862e2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:32 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e126 e126: 6 total, 6 up, 6 in Nov 23 05:02:32 localhost systemd[1]: var-lib-containers-storage-overlay-dd17c590309b55ead84c30c1c43935a15235ea5459b9de3ba9300a684a6178a8-merged.mount: Deactivated successfully. Nov 23 05:02:32 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa36de6ff5e85b4add64d8095d6ff443ca8e755beb0d0a9bb66f27a82c4f080d-userdata-shm.mount: Deactivated successfully. Nov 23 05:02:32 localhost systemd[1]: run-netns-qdhcp\x2d6db04e65\x2d3a65\x2d4ecd\x2d9d7c\x2d1b518aa8c237.mount: Deactivated successfully. 
Nov 23 05:02:32 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:32.696 2 INFO neutron.agent.securitygroups_rpc [None req-07562697-3af0-478b-ab22-4ac8b6836e01 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:33 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:33.363 2 INFO neutron.agent.securitygroups_rpc [None req-f8a04e39-0876-44f3-a37b-8d86b234eb13 268d02d4288b4ff3a9dab77419bcf96a 23ffb5a89d5d4d8a8900ea750309030f - - default default] Security group member updated ['8eb14703-b106-4f91-b864-8b16a806bee3']#033[00m Nov 23 05:02:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:02:33 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:33.558 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e127 e127: 6 total, 6 up, 6 in Nov 23 05:02:34 localhost ovn_controller[154788]: 2025-11-23T10:02:34Z|00361|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:02:34 localhost nova_compute[281952]: 2025-11-23 10:02:34.489 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e128 e128: 6 total, 6 up, 6 in Nov 23 05:02:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:35.392 263258 INFO neutron.agent.linux.ip_lib [None req-c5c7b97d-9dc9-444d-a85f-a4c6fc884a8f - - - - - -] Device tape9b240b4-dd cannot be used as it has no MAC address#033[00m Nov 23 05:02:35 localhost nova_compute[281952]: 2025-11-23 10:02:35.419 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:35 localhost kernel: device tape9b240b4-dd entered promiscuous mode Nov 23 05:02:35 localhost NetworkManager[5975]: [1763892155.4272] manager: (tape9b240b4-dd): new Generic device (/org/freedesktop/NetworkManager/Devices/60) Nov 23 05:02:35 localhost nova_compute[281952]: 2025-11-23 10:02:35.428 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:35 localhost ovn_controller[154788]: 2025-11-23T10:02:35Z|00362|binding|INFO|Claiming lport e9b240b4-dda7-48fc-a63a-d3fd91217a97 for this chassis. Nov 23 05:02:35 localhost ovn_controller[154788]: 2025-11-23T10:02:35Z|00363|binding|INFO|e9b240b4-dda7-48fc-a63a-d3fd91217a97: Claiming unknown Nov 23 05:02:35 localhost systemd-udevd[325521]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:02:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:35.442 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-319f7ca3-1c18-4436-8178-bfc17a98eb45', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-319f7ca3-1c18-4436-8178-bfc17a98eb45', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87fb399b-8c32-4da7-b979-46b50b5b7dd8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e9b240b4-dda7-48fc-a63a-d3fd91217a97) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:35.444 160439 INFO neutron.agent.ovn.metadata.agent [-] Port e9b240b4-dda7-48fc-a63a-d3fd91217a97 in datapath 319f7ca3-1c18-4436-8178-bfc17a98eb45 bound to our chassis#033[00m Nov 23 05:02:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:35.446 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 319f7ca3-1c18-4436-8178-bfc17a98eb45 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:02:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:35.448 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[6a744685-56f5-4bb9-ac6f-f8c6ce202820]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:35 localhost journal[230249]: ethtool ioctl error on tape9b240b4-dd: No such device Nov 23 05:02:35 localhost ovn_controller[154788]: 2025-11-23T10:02:35Z|00364|binding|INFO|Setting lport e9b240b4-dda7-48fc-a63a-d3fd91217a97 ovn-installed in OVS Nov 23 05:02:35 localhost ovn_controller[154788]: 2025-11-23T10:02:35Z|00365|binding|INFO|Setting lport e9b240b4-dda7-48fc-a63a-d3fd91217a97 up in Southbound Nov 23 05:02:35 localhost journal[230249]: ethtool ioctl error on tape9b240b4-dd: No such device Nov 23 05:02:35 localhost nova_compute[281952]: 2025-11-23 10:02:35.462 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:35 localhost journal[230249]: ethtool ioctl error on tape9b240b4-dd: No such device Nov 23 05:02:35 localhost journal[230249]: ethtool ioctl error on tape9b240b4-dd: No such device Nov 23 05:02:35 localhost journal[230249]: ethtool ioctl error on tape9b240b4-dd: No such device Nov 23 05:02:35 localhost journal[230249]: ethtool ioctl error on tape9b240b4-dd: No such device Nov 23 05:02:35 localhost journal[230249]: ethtool ioctl error on tape9b240b4-dd: No such device Nov 23 05:02:35 localhost journal[230249]: ethtool ioctl error on tape9b240b4-dd: No such device Nov 23 05:02:35 localhost nova_compute[281952]: 2025-11-23 10:02:35.507 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:35 localhost nova_compute[281952]: 2025-11-23 10:02:35.538 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:35 localhost dnsmasq[323235]: exiting on receipt of SIGTERM Nov 23 05:02:35 localhost podman[325570]: 2025-11-23 10:02:35.782261373 +0000 UTC m=+0.071215274 container kill 6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e392c0bc-bd43-40a4-a7d7-6e0130e48060, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:35 localhost systemd[1]: libpod-6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d.scope: Deactivated successfully. 
Nov 23 05:02:35 localhost podman[325590]: 2025-11-23 10:02:35.872017918 +0000 UTC m=+0.069576533 container died 6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e392c0bc-bd43-40a4-a7d7-6e0130e48060, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:02:35 localhost systemd[1]: tmp-crun.y8KVnG.mount: Deactivated successfully. Nov 23 05:02:35 localhost podman[325590]: 2025-11-23 10:02:35.954011635 +0000 UTC m=+0.151570250 container remove 6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e392c0bc-bd43-40a4-a7d7-6e0130e48060, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:02:35 localhost systemd[1]: libpod-conmon-6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d.scope: Deactivated successfully. 
Nov 23 05:02:35 localhost nova_compute[281952]: 2025-11-23 10:02:35.970 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:35 localhost ovn_controller[154788]: 2025-11-23T10:02:35Z|00366|binding|INFO|Releasing lport 4888f66e-2a7b-4114-aa4a-94f38d09c793 from this chassis (sb_readonly=0) Nov 23 05:02:35 localhost kernel: device tap4888f66e-2a left promiscuous mode Nov 23 05:02:35 localhost ovn_controller[154788]: 2025-11-23T10:02:35Z|00367|binding|INFO|Setting lport 4888f66e-2a7b-4114-aa4a-94f38d09c793 down in Southbound Nov 23 05:02:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:35.986 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-e392c0bc-bd43-40a4-a7d7-6e0130e48060', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e392c0bc-bd43-40a4-a7d7-6e0130e48060', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '23ffb5a89d5d4d8a8900ea750309030f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6cf4fe09-32ab-44a4-bc92-e7a5f2b64203, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4888f66e-2a7b-4114-aa4a-94f38d09c793) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:35.990 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 4888f66e-2a7b-4114-aa4a-94f38d09c793 in datapath e392c0bc-bd43-40a4-a7d7-6e0130e48060 unbound from our chassis#033[00m Nov 23 05:02:35 localhost nova_compute[281952]: 2025-11-23 10:02:35.992 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:35.997 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e392c0bc-bd43-40a4-a7d7-6e0130e48060 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:02:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:35.998 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[49571d1b-4171-40c2-b031-2967a719d949]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:02:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 05:02:36 localhost podman[325629]: 2025-11-23 10:02:36.253179329 +0000 UTC m=+0.057338006 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 05:02:36 localhost podman[325629]: 2025-11-23 10:02:36.257940276 +0000 UTC m=+0.062099023 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 05:02:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:36.261 263258 INFO neutron.agent.dhcp.agent [None req-6508a959-80b6-47ac-9f8f-23d2437b76a0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 05:02:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:36.262 263258 INFO neutron.agent.dhcp.agent [None req-6508a959-80b6-47ac-9f8f-23d2437b76a0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 05:02:36 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 05:02:36 localhost systemd[1]: var-lib-containers-storage-overlay-85794c23db926913628cec6db6b73789ec462daef8098b4cddf986aad56e49c9-merged.mount: Deactivated successfully.
Nov 23 05:02:36 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6986e6cbbf42291e0cc5ec8f9d53f8d21f711b7f8715b0c7eb7c2930396e549d-userdata-shm.mount: Deactivated successfully.
Nov 23 05:02:36 localhost systemd[1]: run-netns-qdhcp\x2de392c0bc\x2dbd43\x2d40a4\x2da7d7\x2d6e0130e48060.mount: Deactivated successfully.
Nov 23 05:02:36 localhost podman[325630]: 2025-11-23 10:02:36.316702677 +0000 UTC m=+0.119509353 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 05:02:36 localhost podman[325630]: 2025-11-23 10:02:36.333224806 +0000 UTC m=+0.136031472 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 23 05:02:36 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 05:02:36 localhost podman[325690]:
Nov 23 05:02:36 localhost podman[325690]: 2025-11-23 10:02:36.503363497 +0000 UTC m=+0.078839219 container create 017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-319f7ca3-1c18-4436-8178-bfc17a98eb45, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 05:02:36 localhost systemd[1]: Started libpod-conmon-017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49.scope.
Nov 23 05:02:36 localhost systemd[1]: Started libcrun container.
Nov 23 05:02:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ed3cccae87a2cec8679f8b781e7106fb3edbfdb329ca120981a36091e9dbff3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 05:02:36 localhost podman[325690]: 2025-11-23 10:02:36.463053315 +0000 UTC m=+0.038529057 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 05:02:36 localhost podman[325690]: 2025-11-23 10:02:36.565399178 +0000 UTC m=+0.140874890 container init 017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-319f7ca3-1c18-4436-8178-bfc17a98eb45, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 05:02:36 localhost podman[325690]: 2025-11-23 10:02:36.571367972 +0000 UTC m=+0.146843684 container start 017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-319f7ca3-1c18-4436-8178-bfc17a98eb45, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 05:02:36 localhost dnsmasq[325708]: started, version 2.85 cachesize 150
Nov 23 05:02:36 localhost dnsmasq[325708]: DNS service limited to local subnets
Nov 23 05:02:36 localhost dnsmasq[325708]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 05:02:36 localhost dnsmasq[325708]: warning: no upstream servers configured
Nov 23 05:02:36 localhost dnsmasq-dhcp[325708]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d
Nov 23 05:02:36 localhost dnsmasq[325708]: read /var/lib/neutron/dhcp/319f7ca3-1c18-4436-8178-bfc17a98eb45/addn_hosts - 0 addresses
Nov 23 05:02:36 localhost dnsmasq-dhcp[325708]: read /var/lib/neutron/dhcp/319f7ca3-1c18-4436-8178-bfc17a98eb45/host
Nov 23 05:02:36 localhost dnsmasq-dhcp[325708]: read /var/lib/neutron/dhcp/319f7ca3-1c18-4436-8178-bfc17a98eb45/opts
Nov 23 05:02:36 localhost nova_compute[281952]: 2025-11-23 10:02:36.592 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:36 localhost nova_compute[281952]: 2025-11-23 10:02:36.607 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:36.735 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 05:02:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:36.809 263258 INFO neutron.agent.dhcp.agent [None req-ee8db2d2-f23d-47f1-be52-dbf00df2cc7a - - - - - -] DHCP configuration for ports {'2d9b3930-b9d4-4c3f-a3e2-b427dd3184a4'} is completed
Nov 23 05:02:37 localhost ovn_controller[154788]: 2025-11-23T10:02:37Z|00368|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 05:02:37 localhost nova_compute[281952]: 2025-11-23 10:02:37.043 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e129 e129: 6 total, 6 up, 6 in
Nov 23 05:02:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e130 e130: 6 total, 6 up, 6 in
Nov 23 05:02:38 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:38.092 2 INFO neutron.agent.securitygroups_rpc [None req-5e89ea53-7b2c-469d-be9d-51deeecc82b0 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 05:02:38 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Nov 23 05:02:38 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 05:02:38 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:38.661 2 INFO neutron.agent.securitygroups_rpc [None req-64ec2b21-8187-4ac4-a9ab-6a7a717b7a77 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 05:02:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e131 e131: 6 total, 6 up, 6 in
Nov 23 05:02:40 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 05:02:40 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 05:02:40 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:40.892 2 INFO neutron.agent.securitygroups_rpc [None req-abfc9448-e887-481e-8dd9-d14e7060c10b 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 05:02:41 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:41.111 263258 INFO neutron.agent.linux.ip_lib [None req-3bd5f328-01ff-4a4c-b780-349c7b2100a4 - - - - - -] Device tap020b8cf6-ea cannot be used as it has no MAC address
Nov 23 05:02:41 localhost nova_compute[281952]: 2025-11-23 10:02:41.134 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:41 localhost kernel: device tap020b8cf6-ea entered promiscuous mode
Nov 23 05:02:41 localhost NetworkManager[5975]: [1763892161.1410] manager: (tap020b8cf6-ea): new Generic device (/org/freedesktop/NetworkManager/Devices/61)
Nov 23 05:02:41 localhost systemd-udevd[325805]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 05:02:41 localhost ovn_controller[154788]: 2025-11-23T10:02:41Z|00369|binding|INFO|Claiming lport 020b8cf6-ea76-4e97-97d4-9364e1402e7a for this chassis.
Nov 23 05:02:41 localhost ovn_controller[154788]: 2025-11-23T10:02:41Z|00370|binding|INFO|020b8cf6-ea76-4e97-97d4-9364e1402e7a: Claiming unknown
Nov 23 05:02:41 localhost nova_compute[281952]: 2025-11-23 10:02:41.148 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:41 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:41.167 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eddb78cd-66e7-4087-9a5b-55327ec8df75, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=020b8cf6-ea76-4e97-97d4-9364e1402e7a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 05:02:41 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:41.168 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 020b8cf6-ea76-4e97-97d4-9364e1402e7a in datapath ae5e7bf6-ef74-47f3-9aa4-c47f1574f753 bound to our chassis
Nov 23 05:02:41 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:41.168 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ae5e7bf6-ef74-47f3-9aa4-c47f1574f753 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 05:02:41 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:41.169 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2c695947-0a13-4be2-8014-192000a36946]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 05:02:41 localhost ovn_controller[154788]: 2025-11-23T10:02:41Z|00371|binding|INFO|Setting lport 020b8cf6-ea76-4e97-97d4-9364e1402e7a ovn-installed in OVS
Nov 23 05:02:41 localhost ovn_controller[154788]: 2025-11-23T10:02:41Z|00372|binding|INFO|Setting lport 020b8cf6-ea76-4e97-97d4-9364e1402e7a up in Southbound
Nov 23 05:02:41 localhost nova_compute[281952]: 2025-11-23 10:02:41.190 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:41 localhost nova_compute[281952]: 2025-11-23 10:02:41.223 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:41 localhost nova_compute[281952]: 2025-11-23 10:02:41.247 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:41 localhost ovn_controller[154788]: 2025-11-23T10:02:41Z|00373|binding|INFO|Removing iface tap020b8cf6-ea ovn-installed in OVS
Nov 23 05:02:41 localhost ovn_controller[154788]: 2025-11-23T10:02:41Z|00374|binding|INFO|Removing lport 020b8cf6-ea76-4e97-97d4-9364e1402e7a ovn-installed in OVS
Nov 23 05:02:41 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:41.306 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 26f1c99f-631d-416a-8ce9-39f42a019f9a with type ""
Nov 23 05:02:41 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:41.307 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eddb78cd-66e7-4087-9a5b-55327ec8df75, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=020b8cf6-ea76-4e97-97d4-9364e1402e7a) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 05:02:41 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:41.308 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 020b8cf6-ea76-4e97-97d4-9364e1402e7a in datapath ae5e7bf6-ef74-47f3-9aa4-c47f1574f753 unbound from our chassis
Nov 23 05:02:41 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:41.308 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ae5e7bf6-ef74-47f3-9aa4-c47f1574f753 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 05:02:41 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:41.309 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[793b7a83-74bd-4cd8-a603-e579f64da57c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 05:02:41 localhost nova_compute[281952]: 2025-11-23 10:02:41.351 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:41 localhost nova_compute[281952]: 2025-11-23 10:02:41.358 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:41 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e132 e132: 6 total, 6 up, 6 in
Nov 23 05:02:41 localhost nova_compute[281952]: 2025-11-23 10:02:41.594 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:41 localhost nova_compute[281952]: 2025-11-23 10:02:41.611 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:41 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:41.822 2 INFO neutron.agent.securitygroups_rpc [None req-1e146831-4c05-4284-9c5f-ec90349e3dfc 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']
Nov 23 05:02:41 localhost podman[240668]: time="2025-11-23T10:02:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 05:02:41 localhost podman[240668]: @ - - [23/Nov/2025:10:02:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157508 "" "Go-http-client/1.1"
Nov 23 05:02:42 localhost podman[240668]: @ - - [23/Nov/2025:10:02:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19713 "" "Go-http-client/1.1"
Nov 23 05:02:42 localhost podman[325859]:
Nov 23 05:02:42 localhost podman[325859]: 2025-11-23 10:02:42.067634568 +0000 UTC m=+0.095059580 container create 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 05:02:42 localhost systemd[1]: Started libpod-conmon-1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9.scope.
Nov 23 05:02:42 localhost podman[325859]: 2025-11-23 10:02:42.023353263 +0000 UTC m=+0.050778315 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 05:02:42 localhost systemd[1]: tmp-crun.aCbdRt.mount: Deactivated successfully.
Nov 23 05:02:42 localhost systemd[1]: Started libcrun container.
Nov 23 05:02:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e31e5055d0d197213df95c02fe47a3d397ca129ef887435c02f1f8b6de8338e2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 05:02:42 localhost podman[325859]: 2025-11-23 10:02:42.160961722 +0000 UTC m=+0.188386734 container init 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 05:02:42 localhost podman[325859]: 2025-11-23 10:02:42.171991652 +0000 UTC m=+0.199416664 container start 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 05:02:42 localhost dnsmasq[325877]: started, version 2.85 cachesize 150
Nov 23 05:02:42 localhost dnsmasq[325877]: DNS service limited to local subnets
Nov 23 05:02:42 localhost dnsmasq[325877]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 05:02:42 localhost dnsmasq[325877]: warning: no upstream servers configured
Nov 23 05:02:42 localhost dnsmasq-dhcp[325877]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 23 05:02:42 localhost dnsmasq[325877]: read /var/lib/neutron/dhcp/ae5e7bf6-ef74-47f3-9aa4-c47f1574f753/addn_hosts - 0 addresses
Nov 23 05:02:42 localhost dnsmasq-dhcp[325877]: read /var/lib/neutron/dhcp/ae5e7bf6-ef74-47f3-9aa4-c47f1574f753/host
Nov 23 05:02:42 localhost dnsmasq-dhcp[325877]: read /var/lib/neutron/dhcp/ae5e7bf6-ef74-47f3-9aa4-c47f1574f753/opts
Nov 23 05:02:42 localhost kernel: device tap020b8cf6-ea left promiscuous mode
Nov 23 05:02:42 localhost nova_compute[281952]: 2025-11-23 10:02:42.267 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.269 263258 INFO neutron.agent.dhcp.agent [None req-e50c8df0-d731-49e9-8230-00c704e50710 - - - - - -] DHCP configuration for ports {'85355cdf-63bc-4add-bb48-9440c5028be9'} is completed
Nov 23 05:02:42 localhost nova_compute[281952]: 2025-11-23 10:02:42.281 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:02:42 localhost dnsmasq[325877]: read /var/lib/neutron/dhcp/ae5e7bf6-ef74-47f3-9aa4-c47f1574f753/addn_hosts - 0 addresses
Nov 23 05:02:42 localhost dnsmasq-dhcp[325877]: read /var/lib/neutron/dhcp/ae5e7bf6-ef74-47f3-9aa4-c47f1574f753/host
Nov 23 05:02:42 localhost dnsmasq-dhcp[325877]: read /var/lib/neutron/dhcp/ae5e7bf6-ef74-47f3-9aa4-c47f1574f753/opts
Nov 23 05:02:42 localhost podman[325896]: 2025-11-23 10:02:42.474199472 +0000 UTC m=+0.069052309 container kill 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent [None req-3bd5f328-01ff-4a4c-b780-349c7b2100a4 - - - - - -] Unable to reload_allocations dhcp for ae5e7bf6-ef74-47f3-9aa4-c47f1574f753.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap020b8cf6-ea not found in namespace qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753.
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap020b8cf6-ea not found in namespace qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753.
Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.534 263258 ERROR neutron.agent.dhcp.agent #033[00m Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.540 263258 INFO neutron.agent.dhcp.agent [None req-fdec6d1d-4c7e-426c-ac37-18163735a0c1 - - - - - -] Synchronizing state#033[00m Nov 23 05:02:42 localhost ovn_controller[154788]: 2025-11-23T10:02:42Z|00375|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:02:42 localhost nova_compute[281952]: 2025-11-23 10:02:42.668 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:42.826 263258 INFO neutron.agent.dhcp.agent [None req-de0abd54-a01f-4d36-8e28-7895c77a337f - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 23 05:02:42 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:42.930 2 INFO neutron.agent.securitygroups_rpc [None req-1cdbc8e1-ca0b-46de-ad57-ff34c29d4e3c fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['7d97459a-8496-4621-8cc6-1521c3f526b4']#033[00m Nov 23 05:02:42 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e133 e133: 6 total, 6 up, 6 in Nov 23 05:02:43 localhost dnsmasq[325877]: exiting on receipt of SIGTERM Nov 23 05:02:43 localhost podman[325928]: 2025-11-23 10:02:43.039485116 +0000 UTC m=+0.078492119 container kill 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:02:43 localhost systemd[1]: libpod-1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9.scope: Deactivated successfully. Nov 23 05:02:43 localhost podman[325941]: 2025-11-23 10:02:43.121925875 +0000 UTC m=+0.069494841 container died 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:02:43 localhost systemd[1]: tmp-crun.lEB8Rn.mount: Deactivated successfully. Nov 23 05:02:43 localhost podman[325941]: 2025-11-23 10:02:43.173555916 +0000 UTC m=+0.121124842 container cleanup 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:02:43 localhost systemd[1]: libpod-conmon-1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9.scope: Deactivated successfully. 
Nov 23 05:02:43 localhost podman[325948]: 2025-11-23 10:02:43.22498416 +0000 UTC m=+0.155300764 container remove 1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae5e7bf6-ef74-47f3-9aa4-c47f1574f753, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 05:02:43 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:43.273 263258 INFO neutron.agent.dhcp.agent [-] Starting network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a dhcp configuration#033[00m Nov 23 05:02:43 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:43.275 263258 INFO neutron.agent.dhcp.agent [-] Finished network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a dhcp configuration#033[00m Nov 23 05:02:43 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:43.275 263258 INFO neutron.agent.dhcp.agent [None req-54274fa7-429b-40de-9645-e5a1e1ac12d0 - - - - - -] Synchronizing state complete#033[00m Nov 23 05:02:43 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:43.345 2 INFO neutron.agent.securitygroups_rpc [None req-9fd2c3cf-c790-41eb-bd9f-c36dc00051cd fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['7d97459a-8496-4621-8cc6-1521c3f526b4']#033[00m Nov 23 05:02:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:02:43 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:02:44 localhost systemd[1]: 
var-lib-containers-storage-overlay-e31e5055d0d197213df95c02fe47a3d397ca129ef887435c02f1f8b6de8338e2-merged.mount: Deactivated successfully. Nov 23 05:02:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1e0954e448b71772913c546af59159e7cfd34df680bcf59d062181359f8e8db9-userdata-shm.mount: Deactivated successfully. Nov 23 05:02:44 localhost systemd[1]: run-netns-qdhcp\x2dae5e7bf6\x2def74\x2d47f3\x2d9aa4\x2dc47f1574f753.mount: Deactivated successfully. Nov 23 05:02:44 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:44.917 2 INFO neutron.agent.securitygroups_rpc [None req-c88be53d-2b25-4bbf-93ce-69a72dd56cf0 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']#033[00m Nov 23 05:02:45 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:45.226 2 INFO neutron.agent.securitygroups_rpc [None req-3637ef8e-b231-4ebc-b0ad-62e8972d9361 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']#033[00m Nov 23 05:02:45 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:45.386 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:45 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:45.386 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table 
for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 05:02:45 localhost nova_compute[281952]: 2025-11-23 10:02:45.389 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:02:46 localhost systemd[1]: tmp-crun.XuoVkk.mount: Deactivated successfully. Nov 23 05:02:46 localhost podman[325966]: 2025-11-23 10:02:46.058618371 +0000 UTC m=+0.110486105 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Nov 23 05:02:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:02:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 05:02:46 localhost podman[325966]: 2025-11-23 10:02:46.10044709 +0000 UTC m=+0.152314834 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, config_id=edpm, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 23 05:02:46 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:02:46 localhost systemd[1]: tmp-crun.ukJ7ks.mount: Deactivated successfully. Nov 23 05:02:46 localhost podman[325984]: 2025-11-23 10:02:46.170696764 +0000 UTC m=+0.085557507 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible) Nov 23 05:02:46 localhost podman[325984]: 2025-11-23 10:02:46.176443071 +0000 UTC m=+0.091303844 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 23 05:02:46 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:02:46 localhost podman[325983]: 2025-11-23 10:02:46.239423841 +0000 UTC m=+0.159645989 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 05:02:46 localhost podman[325983]: 2025-11-23 10:02:46.287311586 +0000 UTC m=+0.207533674 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:02:46 localhost 
systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 05:02:46 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:46.342 263258 INFO neutron.agent.linux.ip_lib [None req-4bbb4871-c239-4103-a3fc-3f5ce119039b - - - - - -] Device tap7f7bd95f-df cannot be used as it has no MAC address#033[00m Nov 23 05:02:46 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:46.346 2 INFO neutron.agent.securitygroups_rpc [None req-c48e4a17-f990-4551-a604-7a1f5baad78f fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']#033[00m Nov 23 05:02:46 localhost nova_compute[281952]: 2025-11-23 10:02:46.403 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:46 localhost kernel: device tap7f7bd95f-df entered promiscuous mode Nov 23 05:02:46 localhost NetworkManager[5975]: [1763892166.4092] manager: (tap7f7bd95f-df): new Generic device (/org/freedesktop/NetworkManager/Devices/62) Nov 23 05:02:46 localhost nova_compute[281952]: 2025-11-23 10:02:46.409 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:46 localhost systemd-udevd[326031]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:02:46 localhost ovn_controller[154788]: 2025-11-23T10:02:46Z|00376|binding|INFO|Claiming lport 7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad for this chassis. 
Nov 23 05:02:46 localhost ovn_controller[154788]: 2025-11-23T10:02:46Z|00377|binding|INFO|7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad: Claiming unknown Nov 23 05:02:46 localhost nova_compute[281952]: 2025-11-23 10:02:46.443 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:46 localhost ovn_controller[154788]: 2025-11-23T10:02:46Z|00378|binding|INFO|Setting lport 7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad ovn-installed in OVS Nov 23 05:02:46 localhost ovn_controller[154788]: 2025-11-23T10:02:46Z|00379|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:02:46 localhost nova_compute[281952]: 2025-11-23 10:02:46.469 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:46 localhost ovn_controller[154788]: 2025-11-23T10:02:46Z|00380|binding|INFO|Setting lport 7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad up in Southbound Nov 23 05:02:46 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:46.483 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e3b93ef61044aafb71b30163a32d7ac', 'neutron:revision_number': '1', 
'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=983e35fe-65e3-4c0c-9ba9-9421db3b9faf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:46 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:46.484 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad in datapath 5facb45a-bb59-481c-b7e6-dbeb21aaf8b4 bound to our chassis#033[00m Nov 23 05:02:46 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:46.486 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5facb45a-bb59-481c-b7e6-dbeb21aaf8b4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:02:46 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:46.487 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a42538-08ad-440a-941f-5e911d21eca9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:46 localhost nova_compute[281952]: 2025-11-23 10:02:46.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:46 localhost nova_compute[281952]: 2025-11-23 10:02:46.595 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:46 localhost nova_compute[281952]: 2025-11-23 10:02:46.613 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:46 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:46.897 2 INFO neutron.agent.securitygroups_rpc [None req-69aab5f7-414f-4919-8a4c-b4fcd82ff545 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']#033[00m Nov 23 05:02:46 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:46.953 2 INFO neutron.agent.securitygroups_rpc [None req-19349ae1-ccba-4615-bdb3-c092a3aa234c 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:47 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:47.192 2 INFO neutron.agent.securitygroups_rpc [None req-a4966cd3-a78e-4723-9b2c-0105f97058c4 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']#033[00m Nov 23 05:02:47 localhost podman[326086]: Nov 23 05:02:47 localhost podman[326086]: 2025-11-23 10:02:47.324730155 +0000 UTC m=+0.088255851 container create 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:02:47 localhost systemd[1]: Started libpod-conmon-37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f.scope. 
Nov 23 05:02:47 localhost podman[326086]: 2025-11-23 10:02:47.277929133 +0000 UTC m=+0.041454829 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:47 localhost systemd[1]: Started libcrun container. Nov 23 05:02:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c3fea95cfcf9b4849df2e2cc958865fbf3b563efeef0cb306d35e7e353e4421/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:47 localhost podman[326086]: 2025-11-23 10:02:47.396760663 +0000 UTC m=+0.160286319 container init 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 05:02:47 localhost podman[326086]: 2025-11-23 10:02:47.40314853 +0000 UTC m=+0.166674186 container start 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:02:47 localhost dnsmasq[326102]: started, version 2.85 cachesize 150 Nov 23 05:02:47 localhost dnsmasq[326102]: DNS service limited to local subnets Nov 23 05:02:47 localhost 
dnsmasq[326102]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:47 localhost dnsmasq[326102]: warning: no upstream servers configured Nov 23 05:02:47 localhost dnsmasq-dhcp[326102]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:47 localhost dnsmasq[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/addn_hosts - 0 addresses Nov 23 05:02:47 localhost dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/host Nov 23 05:02:47 localhost dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/opts Nov 23 05:02:47 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:47.451 263258 INFO neutron.agent.dhcp.agent [None req-4bbb4871-c239-4103-a3fc-3f5ce119039b - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:46Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c964b4e6-771f-468b-9ae9-3eb34c34ed00, ip_allocation=immediate, mac_address=fa:16:3e:5c:5b:d7, name=tempest-PortsIpV6TestJSON-1450374320, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:02:43Z, description=, dns_domain=, id=5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-73862936, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11500, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1991, status=ACTIVE, 
subnets=['8a70a731-5cef-4632-a106-5587ddd28431'], tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:45Z, vlan_transparent=None, network_id=5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d7ead8f7-80d5-4103-ab91-19b87956485a'], standard_attr_id=2012, status=DOWN, tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:46Z on network 5facb45a-bb59-481c-b7e6-dbeb21aaf8b4#033[00m Nov 23 05:02:47 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:47.555 2 INFO neutron.agent.securitygroups_rpc [None req-e2113fc4-8297-4e83-8bfc-91b8d2f12d0b fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']#033[00m Nov 23 05:02:47 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:47.566 263258 INFO neutron.agent.dhcp.agent [None req-dfc703c3-43ca-4430-b00e-1537333d7d9f - - - - - -] DHCP configuration for ports {'dcb77374-8f77-4c3a-9585-ee35bed6f389'} is completed#033[00m Nov 23 05:02:47 localhost dnsmasq[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/addn_hosts - 1 addresses Nov 23 05:02:47 localhost dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/host Nov 23 05:02:47 localhost dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/opts Nov 23 05:02:47 localhost podman[326118]: 2025-11-23 10:02:47.621986982 +0000 UTC m=+0.043273415 container kill 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:02:47 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:47.954 263258 INFO neutron.agent.dhcp.agent [None req-0d9872b0-fd4d-4d03-9556-59f812560fcd - - - - - -] DHCP configuration for ports {'c964b4e6-771f-468b-9ae9-3eb34c34ed00'} is completed#033[00m Nov 23 05:02:47 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e134 e134: 6 total, 6 up, 6 in Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0. Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.984827) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43 Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892167984972, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1210, "num_deletes": 259, "total_data_size": 1566141, "memory_usage": 1591808, "flush_reason": "Manual Compaction"} Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892167992229, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 813419, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24308, "largest_seqno": 25513, "table_properties": {"data_size": 
808882, "index_size": 2072, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11666, "raw_average_key_size": 21, "raw_value_size": 799296, "raw_average_value_size": 1482, "num_data_blocks": 90, "num_entries": 539, "num_filter_entries": 539, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892107, "oldest_key_time": 1763892107, "file_creation_time": 1763892167, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}} Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 7458 microseconds, and 3491 cpu microseconds. Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.992282) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 813419 bytes OK Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.992304) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.995277) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.995299) EVENT_LOG_v1 {"time_micros": 1763892167995292, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.995320) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 1560160, prev total WAL file size 1560160, number of live WAL files 2. Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.996013) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303037' seq:72057594037927935, type:22 .. 
'6D6772737461740034323539' seq:0, type:0; will stop at (end) Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(794KB)], [42(15MB)] Nov 23 05:02:47 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892167996070, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 16843888, "oldest_snapshot_seqno": -1} Nov 23 05:02:48 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:48.008 2 INFO neutron.agent.securitygroups_rpc [None req-11137031-a253-41b4-b5b7-0c84953c1da3 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']#033[00m Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 12451 keys, 14927692 bytes, temperature: kUnknown Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892168091411, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 14927692, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14860152, "index_size": 35311, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31173, "raw_key_size": 337081, "raw_average_key_size": 27, "raw_value_size": 14651174, "raw_average_value_size": 1176, "num_data_blocks": 1309, "num_entries": 12451, "num_filter_entries": 12451, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, 
"fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892167, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}} Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.091758) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 14927692 bytes Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.093979) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 176.4 rd, 156.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 15.3 +0.0 blob) out(14.2 +0.0 blob), read-write-amplify(39.1) write-amplify(18.4) OK, records in: 12958, records dropped: 507 output_compression: NoCompression Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.094007) EVENT_LOG_v1 {"time_micros": 1763892168093995, "job": 24, "event": "compaction_finished", "compaction_time_micros": 95460, "compaction_time_cpu_micros": 45948, "output_level": 6, "num_output_files": 1, "total_output_size": 14927692, "num_input_records": 12958, 
"num_output_records": 12451, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892168094234, "job": 24, "event": "table_file_deletion", "file_number": 44} Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892168096578, "job": 24, "event": "table_file_deletion", "file_number": 42} Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:47.995870) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.096710) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.096719) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.096722) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:02:48.096725) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:02:48 localhost ceph-mon[300199]: rocksdb: (Original Log Time 
2025/11/23-10:02:48.096727) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:02:48 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:48.388 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 05:02:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:02:48 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:48.641 2 INFO neutron.agent.securitygroups_rpc [None req-edba85f2-3d5d-4fd2-9504-24d71c7cf655 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']#033[00m Nov 23 05:02:49 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:49.209 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:46Z, description=, device_id=629023cb-7ed5-4aab-8f47-b87316f2c0e8, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c964b4e6-771f-468b-9ae9-3eb34c34ed00, ip_allocation=immediate, mac_address=fa:16:3e:5c:5b:d7, name=tempest-PortsIpV6TestJSON-1450374320, network_id=5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['d7ead8f7-80d5-4103-ab91-19b87956485a'], 
standard_attr_id=2012, status=ACTIVE, tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:47Z on network 5facb45a-bb59-481c-b7e6-dbeb21aaf8b4#033[00m Nov 23 05:02:49 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:49.211 2 INFO neutron.agent.securitygroups_rpc [None req-d908f8b2-2302-4d45-805c-4d35b6de9b08 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']#033[00m Nov 23 05:02:49 localhost dnsmasq[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/addn_hosts - 1 addresses Nov 23 05:02:49 localhost podman[326157]: 2025-11-23 10:02:49.405300198 +0000 UTC m=+0.060225336 container kill 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 05:02:49 localhost dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/host Nov 23 05:02:49 localhost dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/opts Nov 23 05:02:49 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:49.636 263258 INFO neutron.agent.dhcp.agent [None req-ed607fe6-b388-435a-a837-eee2463de4b1 - - - - - -] DHCP configuration for ports {'c964b4e6-771f-468b-9ae9-3eb34c34ed00'} is completed#033[00m Nov 23 05:02:49 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:49.955 2 INFO neutron.agent.securitygroups_rpc [None req-765ea533-9294-4729-95dc-8b0b8371a481 
80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:02:50 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:50.135 2 INFO neutron.agent.securitygroups_rpc [None req-838089d2-b106-4cd8-a2d9-7d5c71329675 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['ff801f98-3c32-488d-a0ec-adbb05a31d18']#033[00m Nov 23 05:02:50 localhost dnsmasq[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/addn_hosts - 0 addresses Nov 23 05:02:50 localhost dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/host Nov 23 05:02:50 localhost dnsmasq-dhcp[326102]: read /var/lib/neutron/dhcp/5facb45a-bb59-481c-b7e6-dbeb21aaf8b4/opts Nov 23 05:02:50 localhost podman[326194]: 2025-11-23 10:02:50.164676531 +0000 UTC m=+0.057047328 container kill 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 05:02:50 localhost ovn_controller[154788]: 2025-11-23T10:02:50Z|00381|binding|INFO|Releasing lport 7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad from this chassis (sb_readonly=0) Nov 23 05:02:50 localhost kernel: device tap7f7bd95f-df left promiscuous mode Nov 23 05:02:50 localhost nova_compute[281952]: 2025-11-23 10:02:50.349 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 
05:02:50 localhost ovn_controller[154788]: 2025-11-23T10:02:50Z|00382|binding|INFO|Setting lport 7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad down in Southbound Nov 23 05:02:50 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:50.362 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e3b93ef61044aafb71b30163a32d7ac', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=983e35fe-65e3-4c0c-9ba9-9421db3b9faf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:50 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:50.363 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 7f7bd95f-dfd2-4d83-89d3-e2ffcae09bad in datapath 5facb45a-bb59-481c-b7e6-dbeb21aaf8b4 unbound from our chassis#033[00m Nov 23 05:02:50 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:50.365 160439 DEBUG 
neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5facb45a-bb59-481c-b7e6-dbeb21aaf8b4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:02:50 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:50.369 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[0fdf78eb-f3c4-4913-998e-4b587a7991b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:50 localhost nova_compute[281952]: 2025-11-23 10:02:50.371 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:51 localhost dnsmasq[326102]: exiting on receipt of SIGTERM Nov 23 05:02:51 localhost systemd[1]: libpod-37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f.scope: Deactivated successfully. Nov 23 05:02:51 localhost podman[326233]: 2025-11-23 10:02:51.495697944 +0000 UTC m=+0.053481049 container kill 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:51 localhost podman[326247]: 2025-11-23 10:02:51.548711697 +0000 UTC m=+0.039930841 container died 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 23 05:02:51 localhost systemd[1]: tmp-crun.W8YJHW.mount: Deactivated successfully. Nov 23 05:02:51 localhost nova_compute[281952]: 2025-11-23 10:02:51.640 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:51 localhost podman[326247]: 2025-11-23 10:02:51.643728604 +0000 UTC m=+0.134947728 container cleanup 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 23 05:02:51 localhost systemd[1]: libpod-conmon-37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f.scope: Deactivated successfully. 
Nov 23 05:02:51 localhost podman[326249]: 2025-11-23 10:02:51.707803458 +0000 UTC m=+0.193174682 container remove 37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5facb45a-bb59-481c-b7e6-dbeb21aaf8b4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118) Nov 23 05:02:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:51.734 263258 INFO neutron.agent.dhcp.agent [None req-2a991f53-4758-4e67-afd2-53a3ed94172e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:51.736 263258 INFO neutron.agent.dhcp.agent [None req-2a991f53-4758-4e67-afd2-53a3ed94172e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:51 localhost ovn_controller[154788]: 2025-11-23T10:02:51Z|00383|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:02:51 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:51.859 2 INFO neutron.agent.securitygroups_rpc [None req-8ada9fea-3f3e-487c-9d7c-c67e94466d69 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['2de10e3b-e1e6-47ac-8eeb-13eb3642fef8']#033[00m Nov 23 05:02:51 localhost nova_compute[281952]: 2025-11-23 10:02:51.871 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:52 localhost systemd[1]: 
var-lib-containers-storage-overlay-5c3fea95cfcf9b4849df2e2cc958865fbf3b563efeef0cb306d35e7e353e4421-merged.mount: Deactivated successfully. Nov 23 05:02:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-37b0a6b86ad649967fc69c40c5d099f79bd6e18b1029397a537c3351b24d160f-userdata-shm.mount: Deactivated successfully. Nov 23 05:02:52 localhost systemd[1]: run-netns-qdhcp\x2d5facb45a\x2dbb59\x2d481c\x2db7e6\x2ddbeb21aaf8b4.mount: Deactivated successfully. Nov 23 05:02:53 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:53.223 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:02:53 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:53.927 263258 INFO neutron.agent.linux.ip_lib [None req-79818a7e-6681-46d6-95ae-9b562351bf83 - - - - - -] Device tap924d747d-30 cannot be used as it has no MAC address#033[00m Nov 23 05:02:53 localhost nova_compute[281952]: 2025-11-23 10:02:53.973 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:53 localhost kernel: device tap924d747d-30 entered promiscuous mode Nov 23 05:02:53 localhost NetworkManager[5975]: [1763892173.9811] manager: (tap924d747d-30): new Generic device (/org/freedesktop/NetworkManager/Devices/63) Nov 23 05:02:53 localhost ovn_controller[154788]: 2025-11-23T10:02:53Z|00384|binding|INFO|Claiming lport 924d747d-3069-493d-890d-d22289f6cb63 for this chassis. 
Nov 23 05:02:53 localhost ovn_controller[154788]: 2025-11-23T10:02:53Z|00385|binding|INFO|924d747d-3069-493d-890d-d22289f6cb63: Claiming unknown Nov 23 05:02:53 localhost nova_compute[281952]: 2025-11-23 10:02:53.982 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:53 localhost systemd-udevd[326286]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:02:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:53.993 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-0d2bb8b4-9b3e-41c7-b595-54664cfb433a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d2bb8b4-9b3e-41c7-b595-54664cfb433a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e3b93ef61044aafb71b30163a32d7ac', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c46eb8d9-f4c4-4e34-a4e0-3a70a48cb8ea, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=924d747d-3069-493d-890d-d22289f6cb63) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:53.995 160439 INFO 
neutron.agent.ovn.metadata.agent [-] Port 924d747d-3069-493d-890d-d22289f6cb63 in datapath 0d2bb8b4-9b3e-41c7-b595-54664cfb433a bound to our chassis#033[00m Nov 23 05:02:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:53.996 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:02:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:53.997 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f07d5b66-bcdc-4172-8630-51d1052c3365]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e135 e135: 6 total, 6 up, 6 in Nov 23 05:02:54 localhost journal[230249]: ethtool ioctl error on tap924d747d-30: No such device Nov 23 05:02:54 localhost nova_compute[281952]: 2025-11-23 10:02:54.012 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:54 localhost ovn_controller[154788]: 2025-11-23T10:02:54Z|00386|binding|INFO|Setting lport 924d747d-3069-493d-890d-d22289f6cb63 ovn-installed in OVS Nov 23 05:02:54 localhost ovn_controller[154788]: 2025-11-23T10:02:54Z|00387|binding|INFO|Setting lport 924d747d-3069-493d-890d-d22289f6cb63 up in Southbound Nov 23 05:02:54 localhost journal[230249]: ethtool ioctl error on tap924d747d-30: No such device Nov 23 05:02:54 localhost nova_compute[281952]: 2025-11-23 10:02:54.017 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:54 localhost journal[230249]: ethtool ioctl error on tap924d747d-30: No such device Nov 23 05:02:54 localhost journal[230249]: ethtool ioctl 
error on tap924d747d-30: No such device Nov 23 05:02:54 localhost journal[230249]: ethtool ioctl error on tap924d747d-30: No such device Nov 23 05:02:54 localhost journal[230249]: ethtool ioctl error on tap924d747d-30: No such device Nov 23 05:02:54 localhost journal[230249]: ethtool ioctl error on tap924d747d-30: No such device Nov 23 05:02:54 localhost journal[230249]: ethtool ioctl error on tap924d747d-30: No such device Nov 23 05:02:54 localhost nova_compute[281952]: 2025-11-23 10:02:54.057 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:54 localhost nova_compute[281952]: 2025-11-23 10:02:54.095 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:54 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:54.181 263258 INFO neutron.agent.linux.ip_lib [None req-5eec097b-deb4-454c-8c91-216fc47067b4 - - - - - -] Device tap6f155903-a3 cannot be used as it has no MAC address#033[00m Nov 23 05:02:54 localhost nova_compute[281952]: 2025-11-23 10:02:54.203 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:54 localhost kernel: device tap6f155903-a3 entered promiscuous mode Nov 23 05:02:54 localhost NetworkManager[5975]: [1763892174.2085] manager: (tap6f155903-a3): new Generic device (/org/freedesktop/NetworkManager/Devices/64) Nov 23 05:02:54 localhost nova_compute[281952]: 2025-11-23 10:02:54.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:54 localhost ovn_controller[154788]: 2025-11-23T10:02:54Z|00388|binding|INFO|Claiming lport 6f155903-a394-40bc-9c4e-04010e974788 for this chassis. 
Nov 23 05:02:54 localhost ovn_controller[154788]: 2025-11-23T10:02:54Z|00389|binding|INFO|6f155903-a394-40bc-9c4e-04010e974788: Claiming unknown Nov 23 05:02:54 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:54.227 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-725f5f75-c3ef-4a36-ba95-e1cd3131878c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-725f5f75-c3ef-4a36-ba95-e1cd3131878c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d782672f-ba9a-4b1f-9286-2b53b24a21c0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6f155903-a394-40bc-9c4e-04010e974788) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:54 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:54.228 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 6f155903-a394-40bc-9c4e-04010e974788 in datapath 725f5f75-c3ef-4a36-ba95-e1cd3131878c bound to our chassis#033[00m Nov 23 05:02:54 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:54.229 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
725f5f75-c3ef-4a36-ba95-e1cd3131878c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:02:54 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:54.229 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[51e82432-45aa-478f-b12f-4e40b93fcaea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:54 localhost ovn_controller[154788]: 2025-11-23T10:02:54Z|00390|binding|INFO|Setting lport 6f155903-a394-40bc-9c4e-04010e974788 ovn-installed in OVS Nov 23 05:02:54 localhost ovn_controller[154788]: 2025-11-23T10:02:54Z|00391|binding|INFO|Setting lport 6f155903-a394-40bc-9c4e-04010e974788 up in Southbound Nov 23 05:02:54 localhost nova_compute[281952]: 2025-11-23 10:02:54.260 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:54 localhost nova_compute[281952]: 2025-11-23 10:02:54.262 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:54 localhost nova_compute[281952]: 2025-11-23 10:02:54.307 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:54 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:54.314 2 INFO neutron.agent.securitygroups_rpc [None req-70f0f8e2-9250-4661-ae6f-f58e5f669345 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['dc864c5c-de53-475b-960e-083ffe4e3e6b']#033[00m Nov 23 05:02:54 localhost nova_compute[281952]: 2025-11-23 10:02:54.340 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:54 
localhost podman[326398]: Nov 23 05:02:54 localhost podman[326398]: 2025-11-23 10:02:54.98521649 +0000 UTC m=+0.085532405 container create 0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true) Nov 23 05:02:55 localhost systemd[1]: Started libpod-conmon-0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495.scope. Nov 23 05:02:55 localhost podman[326398]: 2025-11-23 10:02:54.937557092 +0000 UTC m=+0.037873017 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:55 localhost systemd[1]: Started libcrun container. 
Nov 23 05:02:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6e4c512250f0e320013f079dacd19dfb69dd5096921c83fb30670bcf3ae1a91/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:55 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:55.062 2 INFO neutron.agent.securitygroups_rpc [None req-ebacb72f-43a4-4e7f-beed-d94b86f532ab fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['dc864c5c-de53-475b-960e-083ffe4e3e6b']#033[00m Nov 23 05:02:55 localhost podman[326398]: 2025-11-23 10:02:55.109288732 +0000 UTC m=+0.209604607 container init 0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118) Nov 23 05:02:55 localhost podman[326398]: 2025-11-23 10:02:55.119092024 +0000 UTC m=+0.219407919 container start 0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 23 05:02:55 localhost dnsmasq[326434]: started, version 2.85 cachesize 150 Nov 23 05:02:55 localhost 
dnsmasq[326434]: DNS service limited to local subnets Nov 23 05:02:55 localhost dnsmasq[326434]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:55 localhost dnsmasq[326434]: warning: no upstream servers configured Nov 23 05:02:55 localhost dnsmasq-dhcp[326434]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:55 localhost dnsmasq[326434]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 0 addresses Nov 23 05:02:55 localhost dnsmasq-dhcp[326434]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host Nov 23 05:02:55 localhost dnsmasq-dhcp[326434]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts Nov 23 05:02:55 localhost dnsmasq[326434]: exiting on receipt of SIGTERM Nov 23 05:02:55 localhost systemd[1]: libpod-0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495.scope: Deactivated successfully. 
Nov 23 05:02:55 localhost podman[326441]: 2025-11-23 10:02:55.219640692 +0000 UTC m=+0.069508752 container died 0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 23 05:02:55 localhost podman[326441]: 2025-11-23 10:02:55.247019926 +0000 UTC m=+0.096887976 container cleanup 0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:02:55 localhost podman[326454]: 2025-11-23 10:02:55.267471405 +0000 UTC m=+0.046684939 container cleanup 0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2) Nov 23 
05:02:55 localhost systemd[1]: libpod-conmon-0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495.scope: Deactivated successfully. Nov 23 05:02:55 localhost podman[326469]: 2025-11-23 10:02:55.339641799 +0000 UTC m=+0.075643251 container remove 0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:02:55 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:55.395 263258 INFO neutron.agent.dhcp.agent [None req-d5e2aec0-a188-4402-9b7e-a9bf1d6a375a - - - - - -] DHCP configuration for ports {'c4a920d0-ea35-484e-b16d-855b3c409327', 'eae593c4-b892-466d-a341-fdc33951a395'} is completed#033[00m Nov 23 05:02:55 localhost podman[326485]: Nov 23 05:02:55 localhost podman[326485]: 2025-11-23 10:02:55.435081489 +0000 UTC m=+0.078369315 container create 9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-725f5f75-c3ef-4a36-ba95-e1cd3131878c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 05:02:55 localhost systemd[1]: Started libpod-conmon-9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf.scope. 
Nov 23 05:02:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:02:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 05:02:55 localhost systemd[1]: Started libcrun container. Nov 23 05:02:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b5117719bced5fdac21fc3bb297cfd580c490322e10e374d90cdd191e13b6d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:55 localhost podman[326485]: 2025-11-23 10:02:55.401618468 +0000 UTC m=+0.044906284 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:55 localhost podman[326485]: 2025-11-23 10:02:55.512600606 +0000 UTC m=+0.155888422 container init 9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-725f5f75-c3ef-4a36-ba95-e1cd3131878c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2) Nov 23 05:02:55 localhost podman[326485]: 2025-11-23 10:02:55.521727218 +0000 UTC m=+0.165015034 container start 9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-725f5f75-c3ef-4a36-ba95-e1cd3131878c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 05:02:55 localhost dnsmasq[326522]: started, version 2.85 cachesize 150 Nov 23 05:02:55 localhost dnsmasq[326522]: DNS service limited to local subnets Nov 23 05:02:55 localhost dnsmasq[326522]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:55 localhost dnsmasq[326522]: warning: no upstream servers configured Nov 23 05:02:55 localhost dnsmasq-dhcp[326522]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:55 localhost dnsmasq[326522]: read /var/lib/neutron/dhcp/725f5f75-c3ef-4a36-ba95-e1cd3131878c/addn_hosts - 0 addresses Nov 23 05:02:55 localhost dnsmasq-dhcp[326522]: read /var/lib/neutron/dhcp/725f5f75-c3ef-4a36-ba95-e1cd3131878c/host Nov 23 05:02:55 localhost dnsmasq-dhcp[326522]: read /var/lib/neutron/dhcp/725f5f75-c3ef-4a36-ba95-e1cd3131878c/opts Nov 23 05:02:55 localhost podman[326502]: 2025-11-23 10:02:55.59096827 +0000 UTC m=+0.094787441 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', 
'--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 05:02:55 localhost podman[326502]: 2025-11-23 10:02:55.59874294 +0000 UTC m=+0.102562071 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 05:02:55 localhost podman[326501]: 2025-11-23 10:02:55.631653894 +0000 UTC m=+0.139479678 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 23 05:02:55 localhost podman[326501]: 2025-11-23 10:02:55.642062535 
+0000 UTC m=+0.149888329 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 23 05:02:55 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 05:02:55 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 05:02:55 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:55.743 263258 INFO neutron.agent.dhcp.agent [None req-3a164df1-a840-4abe-9000-4148e2e6778d - - - - - -] DHCP configuration for ports {'02b7bafb-bf4b-46dd-b7b7-250c2ecb1918'} is completed#033[00m Nov 23 05:02:55 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:55.749 2 INFO neutron.agent.securitygroups_rpc [None req-bb083733-91fb-4356-a91b-0dce7c35cb96 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d9259f0b-7c30-4dce-b81e-e0f698e442c7']#033[00m Nov 23 05:02:55 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:55.802 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:55Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5f5b1b33-9dfa-4cc9-8666-90efeaefde4c, ip_allocation=immediate, mac_address=fa:16:3e:8c:3c:b5, name=tempest-PortsIpV6TestJSON-439196546, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:02:03Z, description=, dns_domain=, id=0d2bb8b4-9b3e-41c7-b595-54664cfb433a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-test-network-274315444, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8521, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1706, status=ACTIVE, subnets=['c0c35879-4b70-4336-bd77-6c177763c3a9'], tags=[], 
tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:52Z, vlan_transparent=None, network_id=0d2bb8b4-9b3e-41c7-b595-54664cfb433a, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d9259f0b-7c30-4dce-b81e-e0f698e442c7'], standard_attr_id=2074, status=DOWN, tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:55Z on network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a#033[00m Nov 23 05:02:55 localhost systemd[1]: tmp-crun.BYAoSg.mount: Deactivated successfully. Nov 23 05:02:55 localhost systemd[1]: var-lib-containers-storage-overlay-c6e4c512250f0e320013f079dacd19dfb69dd5096921c83fb30670bcf3ae1a91-merged.mount: Deactivated successfully. Nov 23 05:02:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0f9d51c91e6e5048463eced161988ba893ff99511e1aacce49796b709c4ff495-userdata-shm.mount: Deactivated successfully. Nov 23 05:02:56 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e136 e136: 6 total, 6 up, 6 in Nov 23 05:02:56 localhost podman[326569]: Nov 23 05:02:56 localhost podman[326569]: 2025-11-23 10:02:56.354839892 +0000 UTC m=+0.089167638 container create 2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Nov 23 05:02:56 localhost systemd[1]: Started libpod-conmon-2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b.scope. 
Nov 23 05:02:56 localhost podman[326569]: 2025-11-23 10:02:56.314653745 +0000 UTC m=+0.048981521 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:56 localhost systemd[1]: Started libcrun container. Nov 23 05:02:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09f567e55101099cb00d6486a716f6879b35e236f2b8a7ee5f08c552a21d3b24/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:56 localhost podman[326569]: 2025-11-23 10:02:56.438631173 +0000 UTC m=+0.172958919 container init 2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 05:02:56 localhost podman[326569]: 2025-11-23 10:02:56.447594829 +0000 UTC m=+0.181922575 container start 2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:56 localhost dnsmasq[326587]: started, version 2.85 cachesize 150 Nov 23 05:02:56 localhost dnsmasq[326587]: DNS service limited to local subnets Nov 23 05:02:56 localhost 
dnsmasq[326587]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:56 localhost dnsmasq[326587]: warning: no upstream servers configured Nov 23 05:02:56 localhost dnsmasq-dhcp[326587]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:56 localhost dnsmasq[326587]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 1 addresses Nov 23 05:02:56 localhost dnsmasq-dhcp[326587]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host Nov 23 05:02:56 localhost dnsmasq-dhcp[326587]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts Nov 23 05:02:56 localhost sshd[326588]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:02:56 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:56.621 2 INFO neutron.agent.securitygroups_rpc [None req-79d36424-581f-4db8-8c43-da9cef64debb fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['2b80d24e-7954-4669-8041-3d535b2f9be2']#033[00m Nov 23 05:02:56 localhost nova_compute[281952]: 2025-11-23 10:02:56.682 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:56 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:56.761 2 INFO neutron.agent.securitygroups_rpc [None req-aea3005f-d734-4d7a-a8c4-fa6242ccbee5 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['2b80d24e-7954-4669-8041-3d535b2f9be2']#033[00m Nov 23 05:02:56 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:56.830 263258 INFO neutron.agent.dhcp.agent [None req-1778e3b2-e5ea-4ffa-b427-fb26a234b807 - - - - - -] DHCP configuration for ports {'5f5b1b33-9dfa-4cc9-8666-90efeaefde4c'} is completed#033[00m Nov 23 05:02:57 localhost 
neutron_dhcp_agent[263254]: 2025-11-23 10:02:57.330 263258 INFO neutron.agent.linux.ip_lib [None req-ff26edd0-4855-4b74-b7cf-72c484335e81 - - - - - -] Device tapa3af0bf2-76 cannot be used as it has no MAC address#033[00m Nov 23 05:02:57 localhost nova_compute[281952]: 2025-11-23 10:02:57.354 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:57 localhost kernel: device tapa3af0bf2-76 entered promiscuous mode Nov 23 05:02:57 localhost NetworkManager[5975]: [1763892177.3620] manager: (tapa3af0bf2-76): new Generic device (/org/freedesktop/NetworkManager/Devices/65) Nov 23 05:02:57 localhost nova_compute[281952]: 2025-11-23 10:02:57.365 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:57 localhost ovn_controller[154788]: 2025-11-23T10:02:57Z|00392|binding|INFO|Claiming lport a3af0bf2-7636-468b-88ec-a6aa42638a50 for this chassis. Nov 23 05:02:57 localhost ovn_controller[154788]: 2025-11-23T10:02:57Z|00393|binding|INFO|a3af0bf2-7636-468b-88ec-a6aa42638a50: Claiming unknown Nov 23 05:02:57 localhost systemd-udevd[326635]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:02:57 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:57.376 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-3f022321-7f06-4d92-8d47-80c086661f24', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f022321-7f06-4d92-8d47-80c086661f24', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08aa4c09-13b3-4d24-b03c-d932cc7d0bac, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a3af0bf2-7636-468b-88ec-a6aa42638a50) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:57 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:57.377 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a3af0bf2-7636-468b-88ec-a6aa42638a50 in datapath 3f022321-7f06-4d92-8d47-80c086661f24 bound to our chassis#033[00m Nov 23 05:02:57 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:57.377 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3f022321-7f06-4d92-8d47-80c086661f24 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:02:57 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:57.378 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9c50b0a9-3b45-48f1-b9e5-c42dcd113dc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:57 localhost ovn_controller[154788]: 2025-11-23T10:02:57Z|00394|binding|INFO|Setting lport a3af0bf2-7636-468b-88ec-a6aa42638a50 ovn-installed in OVS Nov 23 05:02:57 localhost ovn_controller[154788]: 2025-11-23T10:02:57Z|00395|binding|INFO|Setting lport a3af0bf2-7636-468b-88ec-a6aa42638a50 up in Southbound Nov 23 05:02:57 localhost nova_compute[281952]: 2025-11-23 10:02:57.417 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:57 localhost systemd[1]: tmp-crun.QoC8l1.mount: Deactivated successfully. Nov 23 05:02:57 localhost dnsmasq[326587]: exiting on receipt of SIGTERM Nov 23 05:02:57 localhost podman[326621]: 2025-11-23 10:02:57.429060224 +0000 UTC m=+0.094467621 container kill 2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 05:02:57 localhost systemd[1]: libpod-2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b.scope: Deactivated successfully. 
Nov 23 05:02:57 localhost nova_compute[281952]: 2025-11-23 10:02:57.462 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:57 localhost podman[326647]: 2025-11-23 10:02:57.53339065 +0000 UTC m=+0.085324722 container died 2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 23 05:02:57 localhost nova_compute[281952]: 2025-11-23 10:02:57.538 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:57 localhost podman[326647]: 2025-11-23 10:02:57.584021518 +0000 UTC m=+0.135955580 container remove 2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:02:57 localhost systemd[1]: libpod-conmon-2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b.scope: Deactivated successfully. 
Nov 23 05:02:57 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e137 e137: 6 total, 6 up, 6 in Nov 23 05:02:57 localhost systemd[1]: var-lib-containers-storage-overlay-09f567e55101099cb00d6486a716f6879b35e236f2b8a7ee5f08c552a21d3b24-merged.mount: Deactivated successfully. Nov 23 05:02:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2bce0ced24a7d91c22e9715133f665034fafa8f555da004b9e045614998f5d5b-userdata-shm.mount: Deactivated successfully. Nov 23 05:02:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:02:58 localhost podman[326744]: Nov 23 05:02:58 localhost podman[326744]: 2025-11-23 10:02:58.48847234 +0000 UTC m=+0.083974768 container create 9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f022321-7f06-4d92-8d47-80c086661f24, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:02:58 localhost systemd[1]: Started libpod-conmon-9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2.scope. Nov 23 05:02:58 localhost podman[326744]: 2025-11-23 10:02:58.443789273 +0000 UTC m=+0.039291721 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:58 localhost systemd[1]: Started libcrun container. 
Nov 23 05:02:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/faeca6c7b92d4743d8ee7c6a15ec1a553cb548aadde1156e887861619142f300/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:58 localhost podman[326744]: 2025-11-23 10:02:58.568315669 +0000 UTC m=+0.163818087 container init 9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f022321-7f06-4d92-8d47-80c086661f24, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 05:02:58 localhost podman[326744]: 2025-11-23 10:02:58.582334431 +0000 UTC m=+0.177836849 container start 9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f022321-7f06-4d92-8d47-80c086661f24, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:02:58 localhost dnsmasq[326765]: started, version 2.85 cachesize 150 Nov 23 05:02:58 localhost dnsmasq[326765]: DNS service limited to local subnets Nov 23 05:02:58 localhost dnsmasq[326765]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:58 localhost dnsmasq[326765]: warning: no upstream servers 
configured Nov 23 05:02:58 localhost dnsmasq-dhcp[326765]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:58 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:58.588 2 INFO neutron.agent.securitygroups_rpc [None req-f9017110-ec89-4835-91db-9aa9b814a12e 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d9259f0b-7c30-4dce-b81e-e0f698e442c7', '469976a2-fa36-45e6-842e-95bc93db1438']#033[00m Nov 23 05:02:58 localhost dnsmasq[326765]: read /var/lib/neutron/dhcp/3f022321-7f06-4d92-8d47-80c086661f24/addn_hosts - 0 addresses Nov 23 05:02:58 localhost dnsmasq-dhcp[326765]: read /var/lib/neutron/dhcp/3f022321-7f06-4d92-8d47-80c086661f24/host Nov 23 05:02:58 localhost dnsmasq-dhcp[326765]: read /var/lib/neutron/dhcp/3f022321-7f06-4d92-8d47-80c086661f24/opts Nov 23 05:02:58 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:58.756 263258 INFO neutron.agent.dhcp.agent [None req-aeaa0fdd-97d9-4ed2-922c-af644ee6c37c - - - - - -] DHCP configuration for ports {'93ccaa6b-1471-4b5e-9c29-91c9cd633700'} is completed#033[00m Nov 23 05:02:58 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:58.768 2 INFO neutron.agent.securitygroups_rpc [None req-17703c58-f506-4a75-8387-af7c0c3c8d74 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['da2a8078-9813-4e58-b587-8e4d75d37f47']#033[00m Nov 23 05:02:58 localhost podman[326797]: Nov 23 05:02:58 localhost dnsmasq[326765]: exiting on receipt of SIGTERM Nov 23 05:02:58 localhost podman[326816]: 2025-11-23 10:02:58.903201706 +0000 UTC m=+0.066953534 container kill 9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f022321-7f06-4d92-8d47-80c086661f24, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 05:02:58 localhost systemd[1]: libpod-9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2.scope: Deactivated successfully. Nov 23 05:02:58 localhost podman[326797]: 2025-11-23 10:02:58.946990994 +0000 UTC m=+0.174215137 container create c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:02:58 localhost podman[326797]: 2025-11-23 10:02:58.864080161 +0000 UTC m=+0.091304334 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:02:58 localhost podman[326837]: 2025-11-23 10:02:58.976839464 +0000 UTC m=+0.052601012 container died 9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f022321-7f06-4d92-8d47-80c086661f24, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:02:58 localhost systemd[1]: Started 
libpod-conmon-c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97.scope. Nov 23 05:02:58 localhost systemd[1]: tmp-crun.5EqRcj.mount: Deactivated successfully. Nov 23 05:02:59 localhost systemd[1]: Started libcrun container. Nov 23 05:02:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecece6215fcae68135ad77444848607179a7db1f65cb11807dd870c2d1e6781b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:02:59 localhost podman[326797]: 2025-11-23 10:02:59.025432291 +0000 UTC m=+0.252656444 container init c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 05:02:59 localhost systemd[1]: tmp-crun.woFOec.mount: Deactivated successfully. 
Nov 23 05:02:59 localhost podman[326797]: 2025-11-23 10:02:59.042199807 +0000 UTC m=+0.269423950 container start c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS) Nov 23 05:02:59 localhost dnsmasq[326863]: started, version 2.85 cachesize 150 Nov 23 05:02:59 localhost dnsmasq[326863]: DNS service limited to local subnets Nov 23 05:02:59 localhost dnsmasq[326863]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:02:59 localhost dnsmasq[326863]: warning: no upstream servers configured Nov 23 05:02:59 localhost dnsmasq-dhcp[326863]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:02:59 localhost dnsmasq-dhcp[326863]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 23 05:02:59 localhost dnsmasq[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 1 addresses Nov 23 05:02:59 localhost dnsmasq-dhcp[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host Nov 23 05:02:59 localhost dnsmasq-dhcp[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts Nov 23 05:02:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:02:59 localhost podman[326837]: 2025-11-23 10:02:59.082413617 +0000 UTC m=+0.158175155 container remove 9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f022321-7f06-4d92-8d47-80c086661f24, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:02:59 localhost systemd[1]: libpod-conmon-9d1c7fc765411f8fd9041639c190f6d78f17796e8839abb8290318caf4fc34d2.scope: Deactivated successfully. Nov 23 05:02:59 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:59.091 263258 INFO neutron.agent.dhcp.agent [None req-fe5ae86c-31e0-4af7-9f5f-9f08814d9ce1 - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:02:55Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5f5b1b33-9dfa-4cc9-8666-90efeaefde4c, ip_allocation=immediate, mac_address=fa:16:3e:8c:3c:b5, name=tempest-PortsIpV6TestJSON-1148844861, network_id=0d2bb8b4-9b3e-41c7-b595-54664cfb433a, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['469976a2-fa36-45e6-842e-95bc93db1438'], standard_attr_id=2074, status=DOWN, tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:02:58Z on network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a#033[00m Nov 23 05:02:59 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:59.122 
160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 666e00e5-826b-452a-afb5-82b9c58af16a with type ""#033[00m Nov 23 05:02:59 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:59.123 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-3f022321-7f06-4d92-8d47-80c086661f24', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f022321-7f06-4d92-8d47-80c086661f24', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08aa4c09-13b3-4d24-b03c-d932cc7d0bac, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a3af0bf2-7636-468b-88ec-a6aa42638a50) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:02:59 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:59.125 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a3af0bf2-7636-468b-88ec-a6aa42638a50 in datapath 3f022321-7f06-4d92-8d47-80c086661f24 unbound from our chassis#033[00m Nov 23 05:02:59 localhost ovn_controller[154788]: 2025-11-23T10:02:59Z|00396|binding|INFO|Removing iface tapa3af0bf2-76 ovn-installed in OVS Nov 23 05:02:59 
localhost ovn_controller[154788]: 2025-11-23T10:02:59Z|00397|binding|INFO|Removing lport a3af0bf2-7636-468b-88ec-a6aa42638a50 ovn-installed in OVS Nov 23 05:02:59 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:59.125 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3f022321-7f06-4d92-8d47-80c086661f24 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:02:59 localhost ovn_metadata_agent[160434]: 2025-11-23 10:02:59.126 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[3816a667-0639-42fa-a9c6-7852d39477f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:02:59 localhost nova_compute[281952]: 2025-11-23 10:02:59.148 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:59 localhost nova_compute[281952]: 2025-11-23 10:02:59.149 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:59 localhost kernel: device tapa3af0bf2-76 left promiscuous mode Nov 23 05:02:59 localhost nova_compute[281952]: 2025-11-23 10:02:59.165 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:59 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:59.188 263258 INFO neutron.agent.dhcp.agent [None req-3e44c0cc-7fc2-423c-9111-62e2572a2fa5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:02:59 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:59.189 263258 INFO neutron.agent.dhcp.agent [None req-3e44c0cc-7fc2-423c-9111-62e2572a2fa5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 
23 05:02:59 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:59.196 2 INFO neutron.agent.securitygroups_rpc [None req-133cea40-228c-4af6-8f6b-d4d2d5a2eb51 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['da2a8078-9813-4e58-b587-8e4d75d37f47']#033[00m Nov 23 05:02:59 localhost dnsmasq[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 1 addresses Nov 23 05:02:59 localhost dnsmasq-dhcp[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host Nov 23 05:02:59 localhost dnsmasq-dhcp[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts Nov 23 05:02:59 localhost podman[326882]: 2025-11-23 10:02:59.302355271 +0000 UTC m=+0.065672613 container kill c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:02:59 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:59.350 2 INFO neutron.agent.securitygroups_rpc [None req-75de0287-b544-4fab-ad82-848c9edeaf4f fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['da2a8078-9813-4e58-b587-8e4d75d37f47']#033[00m Nov 23 05:02:59 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:59.504 263258 INFO neutron.agent.dhcp.agent [None req-34e9f92e-d170-472c-ac70-7faa9e9aef37 - - - - - -] DHCP configuration for ports {'5f5b1b33-9dfa-4cc9-8666-90efeaefde4c', '924d747d-3069-493d-890d-d22289f6cb63', 
'c4a920d0-ea35-484e-b16d-855b3c409327', 'eae593c4-b892-466d-a341-fdc33951a395'} is completed#033[00m Nov 23 05:02:59 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:59.540 2 INFO neutron.agent.securitygroups_rpc [None req-94b4dea1-0a52-41d4-90a7-1f1aefba76c5 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['469976a2-fa36-45e6-842e-95bc93db1438']#033[00m Nov 23 05:02:59 localhost ovn_controller[154788]: 2025-11-23T10:02:59Z|00398|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:02:59 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:02:59.630 263258 INFO neutron.agent.dhcp.agent [None req-5070c634-78d7-43c8-ad22-2dbd05c7cbc5 - - - - - -] DHCP configuration for ports {'5f5b1b33-9dfa-4cc9-8666-90efeaefde4c'} is completed#033[00m Nov 23 05:02:59 localhost nova_compute[281952]: 2025-11-23 10:02:59.647 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:02:59 localhost neutron_sriov_agent[256124]: 2025-11-23 10:02:59.735 2 INFO neutron.agent.securitygroups_rpc [None req-d3ac2373-b722-49eb-839d-a87ede7d08ac fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['da2a8078-9813-4e58-b587-8e4d75d37f47']#033[00m Nov 23 05:02:59 localhost dnsmasq[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 0 addresses Nov 23 05:02:59 localhost dnsmasq-dhcp[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host Nov 23 05:02:59 localhost dnsmasq-dhcp[326863]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts Nov 23 05:02:59 localhost podman[326922]: 2025-11-23 10:02:59.756724218 +0000 UTC m=+0.058015127 container kill c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 23 05:02:59 localhost openstack_network_exporter[242668]: ERROR 10:02:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:02:59 localhost openstack_network_exporter[242668]: ERROR 10:02:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:02:59 localhost openstack_network_exporter[242668]: ERROR 10:02:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:02:59 localhost openstack_network_exporter[242668]: ERROR 10:02:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:02:59 localhost openstack_network_exporter[242668]: Nov 23 05:02:59 localhost systemd[1]: var-lib-containers-storage-overlay-faeca6c7b92d4743d8ee7c6a15ec1a553cb548aadde1156e887861619142f300-merged.mount: Deactivated successfully. Nov 23 05:02:59 localhost systemd[1]: run-netns-qdhcp\x2d3f022321\x2d7f06\x2d4d92\x2d8d47\x2d80c086661f24.mount: Deactivated successfully. 
Nov 23 05:02:59 localhost openstack_network_exporter[242668]: ERROR 10:02:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:02:59 localhost openstack_network_exporter[242668]: Nov 23 05:03:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e138 e138: 6 total, 6 up, 6 in Nov 23 05:03:00 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:00.563 2 INFO neutron.agent.securitygroups_rpc [None req-6ed9d977-5486-448d-8f02-a426fdb94759 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['da2a8078-9813-4e58-b587-8e4d75d37f47']#033[00m Nov 23 05:03:01 localhost dnsmasq[326863]: exiting on receipt of SIGTERM Nov 23 05:03:01 localhost systemd[1]: libpod-c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97.scope: Deactivated successfully. Nov 23 05:03:01 localhost podman[326958]: 2025-11-23 10:03:01.07601416 +0000 UTC m=+0.059675339 container kill c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 05:03:01 localhost podman[326970]: 2025-11-23 10:03:01.141984223 +0000 UTC m=+0.053623123 container died c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 05:03:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97-userdata-shm.mount: Deactivated successfully. Nov 23 05:03:01 localhost podman[326970]: 2025-11-23 10:03:01.172749891 +0000 UTC m=+0.084388751 container cleanup c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 23 05:03:01 localhost systemd[1]: libpod-conmon-c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97.scope: Deactivated successfully. 
Nov 23 05:03:01 localhost podman[326972]: 2025-11-23 10:03:01.227627951 +0000 UTC m=+0.131132950 container remove c84a31bd49c05e159f89da2816dd28afeff7b8d8aa8fe0eed567ed3598ebad97 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 05:03:01 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:01.564 2 INFO neutron.agent.securitygroups_rpc [None req-d75c0dda-e48d-4259-9ad6-e58217c5f7b4 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['da2a8078-9813-4e58-b587-8e4d75d37f47']#033[00m Nov 23 05:03:01 localhost nova_compute[281952]: 2025-11-23 10:03:01.729 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:01 localhost nova_compute[281952]: 2025-11-23 10:03:01.733 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:02 localhost podman[327049]: Nov 23 05:03:02 localhost podman[327049]: 2025-11-23 10:03:02.060541309 +0000 UTC m=+0.064272051 container create f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:02 localhost systemd[1]: var-lib-containers-storage-overlay-ecece6215fcae68135ad77444848607179a7db1f65cb11807dd870c2d1e6781b-merged.mount: Deactivated successfully. Nov 23 05:03:02 localhost systemd[1]: Started libpod-conmon-f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c.scope. Nov 23 05:03:02 localhost systemd[1]: tmp-crun.aHO68g.mount: Deactivated successfully. Nov 23 05:03:02 localhost systemd[1]: Started libcrun container. Nov 23 05:03:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9d4bc8fd932f615977866e4cf411b544a5d942d899a713116fb40691af4610c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:03:02 localhost podman[327049]: 2025-11-23 10:03:02.122167308 +0000 UTC m=+0.125897820 container init f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 23 05:03:02 localhost podman[327049]: 2025-11-23 10:03:02.029874635 +0000 UTC m=+0.033605187 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:03:02 localhost podman[327049]: 2025-11-23 10:03:02.130122753 +0000 UTC m=+0.133853265 container start f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:02 localhost dnsmasq[327068]: started, version 2.85 cachesize 150 Nov 23 05:03:02 localhost dnsmasq[327068]: DNS service limited to local subnets Nov 23 05:03:02 localhost dnsmasq[327068]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:03:02 localhost dnsmasq[327068]: warning: no upstream servers configured Nov 23 05:03:02 localhost dnsmasq-dhcp[327068]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 23 05:03:02 localhost dnsmasq[327068]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 0 addresses Nov 23 05:03:02 localhost dnsmasq-dhcp[327068]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host Nov 23 05:03:02 localhost dnsmasq-dhcp[327068]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts Nov 23 05:03:02 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:02.507 263258 INFO neutron.agent.dhcp.agent [None req-a291d9a4-9639-41e7-a6e4-11c4541309a3 - - - - - -] DHCP configuration for ports {'924d747d-3069-493d-890d-d22289f6cb63', 'c4a920d0-ea35-484e-b16d-855b3c409327', 'eae593c4-b892-466d-a341-fdc33951a395'} is completed#033[00m Nov 23 05:03:02 localhost dnsmasq[327068]: exiting on receipt of SIGTERM Nov 23 05:03:02 localhost podman[327085]: 2025-11-23 10:03:02.547144609 +0000 UTC m=+0.058023468 container kill f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:03:02 localhost systemd[1]: libpod-f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c.scope: Deactivated successfully. Nov 23 05:03:02 localhost podman[327101]: 2025-11-23 10:03:02.623839452 +0000 UTC m=+0.053915762 container died f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 23 05:03:02 localhost podman[327101]: 2025-11-23 10:03:02.666274209 +0000 UTC m=+0.096350479 container remove f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 05:03:02 localhost systemd[1]: libpod-conmon-f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c.scope: Deactivated 
successfully. Nov 23 05:03:02 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:02.696 263258 INFO neutron.agent.linux.ip_lib [None req-8e2a35fc-7046-46e3-b20c-8ebe7ccbce4a - - - - - -] Device tapf178a34a-d3 cannot be used as it has no MAC address#033[00m Nov 23 05:03:02 localhost nova_compute[281952]: 2025-11-23 10:03:02.716 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:02 localhost kernel: device tapf178a34a-d3 entered promiscuous mode Nov 23 05:03:02 localhost NetworkManager[5975]: [1763892182.7234] manager: (tapf178a34a-d3): new Generic device (/org/freedesktop/NetworkManager/Devices/66) Nov 23 05:03:02 localhost nova_compute[281952]: 2025-11-23 10:03:02.724 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:02 localhost systemd-udevd[327136]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:03:02 localhost ovn_controller[154788]: 2025-11-23T10:03:02Z|00399|binding|INFO|Claiming lport f178a34a-d388-4d80-b907-596688f93fe4 for this chassis. 
Nov 23 05:03:02 localhost ovn_controller[154788]: 2025-11-23T10:03:02Z|00400|binding|INFO|f178a34a-d388-4d80-b907-596688f93fe4: Claiming unknown Nov 23 05:03:02 localhost ovn_controller[154788]: 2025-11-23T10:03:02Z|00401|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:03:02 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:02.738 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c36fb98c-d1bf-4015-90da-0f0d5f6a670b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f178a34a-d388-4d80-b907-596688f93fe4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:02 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:02.739 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f178a34a-d388-4d80-b907-596688f93fe4 in datapath 83a5d156-32a8-4c4d-8a44-164aa1c3b9d5 bound to our 
chassis#033[00m Nov 23 05:03:02 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:02.740 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 83a5d156-32a8-4c4d-8a44-164aa1c3b9d5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:02 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:02.741 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[18227b7b-e9bf-4e19-9dc9-47cabd27eabb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:02 localhost ovn_controller[154788]: 2025-11-23T10:03:02Z|00402|binding|INFO|Setting lport f178a34a-d388-4d80-b907-596688f93fe4 ovn-installed in OVS Nov 23 05:03:02 localhost ovn_controller[154788]: 2025-11-23T10:03:02Z|00403|binding|INFO|Setting lport f178a34a-d388-4d80-b907-596688f93fe4 up in Southbound Nov 23 05:03:02 localhost nova_compute[281952]: 2025-11-23 10:03:02.808 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:02 localhost nova_compute[281952]: 2025-11-23 10:03:02.814 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:02 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:02.824 2 INFO neutron.agent.securitygroups_rpc [None req-1875a888-d52f-4402-8541-5b1512187260 fc7a3bc93bd3430296903212121be4cd d68a0c4a9a444afd973c16a38020089b - - default default] Security group rule updated ['c4aad9b2-b8cd-4803-b28f-3e773406a427']#033[00m Nov 23 05:03:02 localhost nova_compute[281952]: 2025-11-23 10:03:02.842 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:02 localhost ceph-mon[300199]: 
mon.np0005532585@1(peon).osd e139 e139: 6 total, 6 up, 6 in Nov 23 05:03:03 localhost systemd[1]: var-lib-containers-storage-overlay-a9d4bc8fd932f615977866e4cf411b544a5d942d899a713116fb40691af4610c-merged.mount: Deactivated successfully. Nov 23 05:03:03 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f6ac65794602f0f7603f6f7af406a539a29bd1ce62f44653328e1fb049ae1c1c-userdata-shm.mount: Deactivated successfully. Nov 23 05:03:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:03:03 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:03.493 2 INFO neutron.agent.securitygroups_rpc [None req-d59d1e53-d1c4-451d-80ab-ba4648fa7a20 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['cb577a71-e41d-409b-b673-d883cbdab535']#033[00m Nov 23 05:03:03 localhost podman[327198]: Nov 23 05:03:03 localhost podman[327198]: 2025-11-23 10:03:03.652758698 +0000 UTC m=+0.092186521 container create 4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:03 localhost systemd[1]: Started libpod-conmon-4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a.scope. 
Nov 23 05:03:03 localhost podman[327198]: 2025-11-23 10:03:03.606930366 +0000 UTC m=+0.046358229 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:03:03 localhost systemd[1]: Started libcrun container. Nov 23 05:03:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f1a9a6b5db597e2d43b68f915d07dc93135e7537aca6fb565346ec09a7ab2f4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:03:03 localhost podman[327198]: 2025-11-23 10:03:03.719013339 +0000 UTC m=+0.158441142 container init 4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 05:03:03 localhost podman[327198]: 2025-11-23 10:03:03.724170818 +0000 UTC m=+0.163598631 container start 4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:03:03 localhost dnsmasq[327226]: started, version 2.85 cachesize 150 Nov 23 05:03:03 localhost dnsmasq[327226]: DNS service limited to local subnets Nov 23 05:03:03 localhost 
dnsmasq[327226]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:03:03 localhost dnsmasq[327226]: warning: no upstream servers configured Nov 23 05:03:03 localhost dnsmasq-dhcp[327226]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:03:03 localhost dnsmasq[327226]: read /var/lib/neutron/dhcp/83a5d156-32a8-4c4d-8a44-164aa1c3b9d5/addn_hosts - 0 addresses Nov 23 05:03:03 localhost dnsmasq-dhcp[327226]: read /var/lib/neutron/dhcp/83a5d156-32a8-4c4d-8a44-164aa1c3b9d5/host Nov 23 05:03:03 localhost dnsmasq-dhcp[327226]: read /var/lib/neutron/dhcp/83a5d156-32a8-4c4d-8a44-164aa1c3b9d5/opts Nov 23 05:03:03 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:03.815 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 2eb60cfd-51cf-472d-b9e9-d6a48f126e7a with type ""#033[00m Nov 23 05:03:03 localhost ovn_controller[154788]: 2025-11-23T10:03:03Z|00404|binding|INFO|Removing iface tapf178a34a-d3 ovn-installed in OVS Nov 23 05:03:03 localhost ovn_controller[154788]: 2025-11-23T10:03:03Z|00405|binding|INFO|Removing lport f178a34a-d388-4d80-b907-596688f93fe4 ovn-installed in OVS Nov 23 05:03:03 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:03.817 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 
'neutron-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c36fb98c-d1bf-4015-90da-0f0d5f6a670b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f178a34a-d388-4d80-b907-596688f93fe4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:03 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:03.819 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f178a34a-d388-4d80-b907-596688f93fe4 in datapath 83a5d156-32a8-4c4d-8a44-164aa1c3b9d5 unbound from our chassis#033[00m Nov 23 05:03:03 localhost nova_compute[281952]: 2025-11-23 10:03:03.820 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:03 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:03.822 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 83a5d156-32a8-4c4d-8a44-164aa1c3b9d5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:03 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:03.823 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c14796ea-d423-46db-9b09-13869acaba6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:03 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:03.950 263258 INFO neutron.agent.dhcp.agent [None req-e9b077a5-7afd-47d8-b5f5-92d5dc34fa91 - - - - - -] DHCP configuration for ports 
{'eb1ed841-526e-4097-b1c7-63ca46f0e711'} is completed#033[00m Nov 23 05:03:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e140 e140: 6 total, 6 up, 6 in Nov 23 05:03:04 localhost ovn_controller[154788]: 2025-11-23T10:03:04Z|00406|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:03:04 localhost nova_compute[281952]: 2025-11-23 10:03:04.181 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:04 localhost systemd[1]: tmp-crun.eabhyS.mount: Deactivated successfully. Nov 23 05:03:04 localhost dnsmasq[327226]: exiting on receipt of SIGTERM Nov 23 05:03:04 localhost podman[327268]: 2025-11-23 10:03:04.184441447 +0000 UTC m=+0.094282596 container kill 4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:03:04 localhost systemd[1]: libpod-4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a.scope: Deactivated successfully. 
Nov 23 05:03:04 localhost podman[327284]: Nov 23 05:03:04 localhost podman[327284]: 2025-11-23 10:03:04.240136943 +0000 UTC m=+0.104928214 container create 5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 23 05:03:04 localhost podman[327298]: 2025-11-23 10:03:04.267766794 +0000 UTC m=+0.058228995 container died 4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:03:04 localhost systemd[1]: Started libpod-conmon-5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2.scope. Nov 23 05:03:04 localhost systemd[1]: Started libcrun container. 
Nov 23 05:03:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55cdf0f78a503799b5700a918cf48dfcff386165f14819e636e88511fba01e53/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:03:04 localhost podman[327284]: 2025-11-23 10:03:04.209750947 +0000 UTC m=+0.074542258 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:03:04 localhost podman[327298]: 2025-11-23 10:03:04.311537922 +0000 UTC m=+0.102000063 container remove 4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-83a5d156-32a8-4c4d-8a44-164aa1c3b9d5, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 05:03:04 localhost systemd[1]: libpod-conmon-4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a.scope: Deactivated successfully. 
Nov 23 05:03:04 localhost nova_compute[281952]: 2025-11-23 10:03:04.327 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:04 localhost kernel: device tapf178a34a-d3 left promiscuous mode Nov 23 05:03:04 localhost nova_compute[281952]: 2025-11-23 10:03:04.347 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:04 localhost podman[327284]: 2025-11-23 10:03:04.353289858 +0000 UTC m=+0.218081159 container init 5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:04 localhost podman[327284]: 2025-11-23 10:03:04.361860732 +0000 UTC m=+0.226652033 container start 5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:03:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:04.363 263258 INFO neutron.agent.dhcp.agent [None req-9cc8ecab-e579-4f06-a478-17f4a501dcc1 - - - - - -] 
Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:04.364 263258 INFO neutron.agent.dhcp.agent [None req-9cc8ecab-e579-4f06-a478-17f4a501dcc1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:04 localhost dnsmasq[327332]: started, version 2.85 cachesize 150 Nov 23 05:03:04 localhost dnsmasq[327332]: DNS service limited to local subnets Nov 23 05:03:04 localhost dnsmasq[327332]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:03:04 localhost dnsmasq[327332]: warning: no upstream servers configured Nov 23 05:03:04 localhost dnsmasq-dhcp[327332]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:03:04 localhost dnsmasq-dhcp[327332]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 23 05:03:04 localhost dnsmasq[327332]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 0 addresses Nov 23 05:03:04 localhost dnsmasq-dhcp[327332]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host Nov 23 05:03:04 localhost dnsmasq-dhcp[327332]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts Nov 23 05:03:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:04.431 263258 INFO neutron.agent.dhcp.agent [None req-2fd858b7-7255-49f3-9187-bc2e5925323e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:03:02Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2464e9e2-44d8-4196-918c-708e8d6859f0, ip_allocation=immediate, mac_address=fa:16:3e:c6:53:18, name=tempest-PortsIpV6TestJSON-426008471, 
network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:02:03Z, description=, dns_domain=, id=0d2bb8b4-9b3e-41c7-b595-54664cfb433a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-test-network-274315444, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8521, qos_policy_id=None, revision_number=5, router:external=False, shared=False, standard_attr_id=1706, status=ACTIVE, subnets=['4e9cd968-5f62-4975-b489-a919561db40e', 'd6862539-34a2-478e-9e57-9c8909b1626a'], tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:03:00Z, vlan_transparent=None, network_id=0d2bb8b4-9b3e-41c7-b595-54664cfb433a, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['cb577a71-e41d-409b-b673-d883cbdab535'], standard_attr_id=2145, status=DOWN, tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:03:03Z on network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a#033[00m Nov 23 05:03:04 localhost dnsmasq[327332]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 1 addresses Nov 23 05:03:04 localhost dnsmasq-dhcp[327332]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host Nov 23 05:03:04 localhost dnsmasq-dhcp[327332]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts Nov 23 05:03:04 localhost podman[327351]: 2025-11-23 10:03:04.618037224 +0000 UTC m=+0.056686158 container kill 5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:03:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:04.788 263258 INFO neutron.agent.dhcp.agent [None req-6b0254d4-fb3c-43dd-b68a-b1edb6646469 - - - - - -] DHCP configuration for ports {'924d747d-3069-493d-890d-d22289f6cb63', 'c4a920d0-ea35-484e-b16d-855b3c409327', 'eae593c4-b892-466d-a341-fdc33951a395'} is completed#033[00m Nov 23 05:03:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:04.801 263258 INFO neutron.agent.linux.ip_lib [None req-c99f2a0f-b28f-4d20-970a-5732e585295c - - - - - -] Device tap061fe319-39 cannot be used as it has no MAC address#033[00m Nov 23 05:03:04 localhost nova_compute[281952]: 2025-11-23 10:03:04.826 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:04 localhost kernel: device tap061fe319-39 entered promiscuous mode Nov 23 05:03:04 localhost systemd-udevd[327138]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:03:04 localhost NetworkManager[5975]: [1763892184.8344] manager: (tap061fe319-39): new Generic device (/org/freedesktop/NetworkManager/Devices/67) Nov 23 05:03:04 localhost nova_compute[281952]: 2025-11-23 10:03:04.835 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:04 localhost ovn_controller[154788]: 2025-11-23T10:03:04Z|00407|binding|INFO|Claiming lport 061fe319-398b-4b80-ab34-399e616693fc for this chassis. 
Nov 23 05:03:04 localhost ovn_controller[154788]: 2025-11-23T10:03:04Z|00408|binding|INFO|061fe319-398b-4b80-ab34-399e616693fc: Claiming unknown Nov 23 05:03:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:04.852 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6e10a0ab-0466-4df2-91a1-22e4b25912c9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e10a0ab-0466-4df2-91a1-22e4b25912c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a2286c65-ff8f-4cd6-91b0-87853206477a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=061fe319-398b-4b80-ab34-399e616693fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:04.855 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 061fe319-398b-4b80-ab34-399e616693fc in datapath 6e10a0ab-0466-4df2-91a1-22e4b25912c9 bound to our chassis#033[00m Nov 23 05:03:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:04.858 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
6e10a0ab-0466-4df2-91a1-22e4b25912c9 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:04.859 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e3051371-e50a-45b2-933c-68dd00dd46d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:04 localhost journal[230249]: ethtool ioctl error on tap061fe319-39: No such device Nov 23 05:03:04 localhost journal[230249]: ethtool ioctl error on tap061fe319-39: No such device Nov 23 05:03:04 localhost ovn_controller[154788]: 2025-11-23T10:03:04Z|00409|binding|INFO|Setting lport 061fe319-398b-4b80-ab34-399e616693fc ovn-installed in OVS Nov 23 05:03:04 localhost ovn_controller[154788]: 2025-11-23T10:03:04Z|00410|binding|INFO|Setting lport 061fe319-398b-4b80-ab34-399e616693fc up in Southbound Nov 23 05:03:04 localhost journal[230249]: ethtool ioctl error on tap061fe319-39: No such device Nov 23 05:03:04 localhost nova_compute[281952]: 2025-11-23 10:03:04.877 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:04 localhost journal[230249]: ethtool ioctl error on tap061fe319-39: No such device Nov 23 05:03:04 localhost journal[230249]: ethtool ioctl error on tap061fe319-39: No such device Nov 23 05:03:04 localhost journal[230249]: ethtool ioctl error on tap061fe319-39: No such device Nov 23 05:03:04 localhost journal[230249]: ethtool ioctl error on tap061fe319-39: No such device Nov 23 05:03:04 localhost journal[230249]: ethtool ioctl error on tap061fe319-39: No such device Nov 23 05:03:04 localhost nova_compute[281952]: 2025-11-23 10:03:04.923 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Nov 23 05:03:04 localhost nova_compute[281952]: 2025-11-23 10:03:04.953 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:04.956 263258 INFO neutron.agent.dhcp.agent [None req-610ffc88-92d6-4276-9494-fdc9efff803c - - - - - -] DHCP configuration for ports {'2464e9e2-44d8-4196-918c-708e8d6859f0'} is completed#033[00m Nov 23 05:03:05 localhost systemd[1]: var-lib-containers-storage-overlay-6f1a9a6b5db597e2d43b68f915d07dc93135e7537aca6fb565346ec09a7ab2f4-merged.mount: Deactivated successfully. Nov 23 05:03:05 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4fdcc0ee848400a6f952e6873d720165540ae0d288f2a48fa729fb2fd0137c2a-userdata-shm.mount: Deactivated successfully. Nov 23 05:03:05 localhost systemd[1]: run-netns-qdhcp\x2d83a5d156\x2d32a8\x2d4c4d\x2d8a44\x2d164aa1c3b9d5.mount: Deactivated successfully. Nov 23 05:03:05 localhost podman[327448]: Nov 23 05:03:05 localhost podman[327448]: 2025-11-23 10:03:05.816672229 +0000 UTC m=+0.075371203 container create 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 23 05:03:05 localhost systemd[1]: Started libpod-conmon-4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543.scope. Nov 23 05:03:05 localhost systemd[1]: tmp-crun.ZayxQd.mount: Deactivated successfully. Nov 23 05:03:05 localhost systemd[1]: Started libcrun container. 
Nov 23 05:03:05 localhost podman[327448]: 2025-11-23 10:03:05.77616448 +0000 UTC m=+0.034863524 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:03:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ba971e870817c0c1c87f8a7ba62f53f6dd751eec21c91b2626ace24c9cd01064/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:03:05 localhost podman[327448]: 2025-11-23 10:03:05.886589953 +0000 UTC m=+0.145288937 container init 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:05 localhost podman[327448]: 2025-11-23 10:03:05.895437625 +0000 UTC m=+0.154136569 container start 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 05:03:05 localhost dnsmasq[327466]: started, version 2.85 cachesize 150 Nov 23 05:03:05 localhost dnsmasq[327466]: DNS service limited to local subnets Nov 23 05:03:05 localhost dnsmasq[327466]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:03:05 localhost dnsmasq[327466]: warning: no upstream servers configured Nov 23 05:03:05 localhost dnsmasq-dhcp[327466]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:03:05 localhost dnsmasq[327466]: read /var/lib/neutron/dhcp/6e10a0ab-0466-4df2-91a1-22e4b25912c9/addn_hosts - 0 addresses Nov 23 05:03:05 localhost dnsmasq-dhcp[327466]: read /var/lib/neutron/dhcp/6e10a0ab-0466-4df2-91a1-22e4b25912c9/host Nov 23 05:03:05 localhost dnsmasq-dhcp[327466]: read /var/lib/neutron/dhcp/6e10a0ab-0466-4df2-91a1-22e4b25912c9/opts Nov 23 05:03:06 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:06.062 263258 INFO neutron.agent.dhcp.agent [None req-a1ab9fb7-8fbc-4132-8373-86c8c13f656e - - - - - -] DHCP configuration for ports {'945b3511-28ea-44ed-9bdc-8df44b5d381d'} is completed#033[00m Nov 23 05:03:06 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:06.167 263258 INFO neutron.agent.linux.ip_lib [None req-51664d74-6d4e-4769-b7da-c8cf858cc7a5 - - - - - -] Device tapb877a41e-a1 cannot be used as it has no MAC address#033[00m Nov 23 05:03:06 localhost nova_compute[281952]: 2025-11-23 10:03:06.188 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:06 localhost kernel: device tapb877a41e-a1 entered promiscuous mode Nov 23 05:03:06 localhost NetworkManager[5975]: [1763892186.1946] manager: (tapb877a41e-a1): new Generic device (/org/freedesktop/NetworkManager/Devices/68) Nov 23 05:03:06 localhost nova_compute[281952]: 2025-11-23 10:03:06.198 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:06 localhost ovn_controller[154788]: 2025-11-23T10:03:06Z|00411|binding|INFO|Claiming lport b877a41e-a15e-4b64-bd71-6cac8fa9c439 for this chassis. 
Nov 23 05:03:06 localhost ovn_controller[154788]: 2025-11-23T10:03:06Z|00412|binding|INFO|b877a41e-a15e-4b64-bd71-6cac8fa9c439: Claiming unknown Nov 23 05:03:06 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:06.210 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-8f0b7507-4636-45b3-a4b2-9fdd55176a18', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f0b7507-4636-45b3-a4b2-9fdd55176a18', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e3ed04a-a453-47aa-85a6-639016eec1c2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b877a41e-a15e-4b64-bd71-6cac8fa9c439) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:06 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:06.213 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b877a41e-a15e-4b64-bd71-6cac8fa9c439 in datapath 8f0b7507-4636-45b3-a4b2-9fdd55176a18 bound to our chassis#033[00m Nov 23 05:03:06 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:06.214 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
8f0b7507-4636-45b3-a4b2-9fdd55176a18 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:06 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:06.215 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d40215-2eca-4c21-adc2-4601ea229b61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:06 localhost ovn_controller[154788]: 2025-11-23T10:03:06Z|00413|binding|INFO|Setting lport b877a41e-a15e-4b64-bd71-6cac8fa9c439 ovn-installed in OVS Nov 23 05:03:06 localhost ovn_controller[154788]: 2025-11-23T10:03:06Z|00414|binding|INFO|Setting lport b877a41e-a15e-4b64-bd71-6cac8fa9c439 up in Southbound Nov 23 05:03:06 localhost nova_compute[281952]: 2025-11-23 10:03:06.243 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:06 localhost dnsmasq[327332]: exiting on receipt of SIGTERM Nov 23 05:03:06 localhost podman[327491]: 2025-11-23 10:03:06.250914596 +0000 UTC m=+0.075633831 container kill 5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 05:03:06 localhost systemd[1]: libpod-5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2.scope: Deactivated successfully. 
Nov 23 05:03:06 localhost nova_compute[281952]: 2025-11-23 10:03:06.284 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:06 localhost nova_compute[281952]: 2025-11-23 10:03:06.320 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:06 localhost podman[327514]: 2025-11-23 10:03:06.32832651 +0000 UTC m=+0.057148001 container died 5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:03:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 05:03:06 localhost podman[327514]: 2025-11-23 10:03:06.399500573 +0000 UTC m=+0.128322004 container remove 5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 23 05:03:06 localhost podman[327540]: 2025-11-23 10:03:06.455979253 +0000 UTC m=+0.090386106 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 05:03:06 localhost systemd[1]: libpod-conmon-5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2.scope: Deactivated successfully. 
Nov 23 05:03:06 localhost podman[327542]: 2025-11-23 10:03:06.435050978 +0000 UTC m=+0.065543340 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3) Nov 23 05:03:06 localhost podman[327542]: 2025-11-23 10:03:06.518270941 +0000 UTC m=+0.148763303 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 05:03:06 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 05:03:06 localhost podman[327540]: 2025-11-23 10:03:06.539707272 +0000 UTC m=+0.174114155 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 05:03:06 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:03:06 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:06.640 2 INFO neutron.agent.securitygroups_rpc [None req-3cc8ae58-6898-4e12-983b-8f9645bd7e63 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['a9134bcb-5194-43f8-ac1f-875c59af23f5', '736c2f34-1be4-42fe-9283-c00aaa4f421b', 'cb577a71-e41d-409b-b673-d883cbdab535']#033[00m Nov 23 05:03:06 localhost nova_compute[281952]: 2025-11-23 10:03:06.755 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:07 localhost systemd[1]: var-lib-containers-storage-overlay-55cdf0f78a503799b5700a918cf48dfcff386165f14819e636e88511fba01e53-merged.mount: Deactivated successfully. 
Nov 23 05:03:07 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b4698f7fc521d49639d15f13e02daa5446d6ac66d7c9fcc0273e6b4f11850d2-userdata-shm.mount: Deactivated successfully. Nov 23 05:03:07 localhost ovn_controller[154788]: 2025-11-23T10:03:07Z|00415|binding|INFO|Releasing lport 061fe319-398b-4b80-ab34-399e616693fc from this chassis (sb_readonly=0) Nov 23 05:03:07 localhost kernel: device tap061fe319-39 left promiscuous mode Nov 23 05:03:07 localhost nova_compute[281952]: 2025-11-23 10:03:07.137 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:07 localhost ovn_controller[154788]: 2025-11-23T10:03:07Z|00416|binding|INFO|Setting lport 061fe319-398b-4b80-ab34-399e616693fc down in Southbound Nov 23 05:03:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:07.146 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port fc6931e9-753f-4431-a2fb-7a1d4a7833f7 with type ""#033[00m Nov 23 05:03:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:07.148 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-6e10a0ab-0466-4df2-91a1-22e4b25912c9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6e10a0ab-0466-4df2-91a1-22e4b25912c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '3', 
'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a2286c65-ff8f-4cd6-91b0-87853206477a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=061fe319-398b-4b80-ab34-399e616693fc) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:07.150 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 061fe319-398b-4b80-ab34-399e616693fc in datapath 6e10a0ab-0466-4df2-91a1-22e4b25912c9 unbound from our chassis#033[00m Nov 23 05:03:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:07.153 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6e10a0ab-0466-4df2-91a1-22e4b25912c9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:03:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:07.154 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[04afa2ed-8717-48eb-8ba0-c1958230ae5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:07 localhost nova_compute[281952]: 2025-11-23 10:03:07.162 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:07 localhost podman[327653]: Nov 23 05:03:07 localhost podman[327653]: 2025-11-23 10:03:07.280850553 +0000 UTC m=+0.096622897 container create 35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f0b7507-4636-45b3-a4b2-9fdd55176a18, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 23 05:03:07 localhost systemd[1]: Started libpod-conmon-35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0.scope. Nov 23 05:03:07 localhost podman[327653]: 2025-11-23 10:03:07.23401199 +0000 UTC m=+0.049784374 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:03:07 localhost systemd[1]: Started libcrun container. Nov 23 05:03:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2fcce9fbacd628dae8de270527f13d43879fe58249a6413d8bf69abd83505308/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:03:07 localhost podman[327653]: 2025-11-23 10:03:07.363946713 +0000 UTC m=+0.179719057 container init 35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f0b7507-4636-45b3-a4b2-9fdd55176a18, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:03:07 localhost podman[327653]: 2025-11-23 10:03:07.369602847 +0000 UTC m=+0.185375191 container start 35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f0b7507-4636-45b3-a4b2-9fdd55176a18, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 05:03:07 localhost dnsmasq[327677]: started, version 2.85 cachesize 150 Nov 23 05:03:07 localhost dnsmasq[327677]: DNS service limited to local subnets Nov 23 05:03:07 localhost dnsmasq[327677]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:03:07 localhost dnsmasq[327677]: warning: no upstream servers configured Nov 23 05:03:07 localhost dnsmasq-dhcp[327677]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:03:07 localhost dnsmasq[327677]: read /var/lib/neutron/dhcp/8f0b7507-4636-45b3-a4b2-9fdd55176a18/addn_hosts - 0 addresses Nov 23 05:03:07 localhost dnsmasq-dhcp[327677]: read /var/lib/neutron/dhcp/8f0b7507-4636-45b3-a4b2-9fdd55176a18/host Nov 23 05:03:07 localhost dnsmasq-dhcp[327677]: read /var/lib/neutron/dhcp/8f0b7507-4636-45b3-a4b2-9fdd55176a18/opts Nov 23 05:03:07 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e141 e141: 6 total, 6 up, 6 in Nov 23 05:03:07 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:07.496 263258 INFO neutron.agent.dhcp.agent [None req-43c6540c-db3e-441f-8f3d-4ab6973ba6cc - - - - - -] DHCP configuration for ports {'96a5e037-97e8-4711-9d19-9b28623d3884'} is completed#033[00m Nov 23 05:03:07 localhost ovn_controller[154788]: 2025-11-23T10:03:07Z|00417|binding|INFO|Removing iface tapb877a41e-a1 ovn-installed in OVS Nov 23 05:03:07 localhost ovn_controller[154788]: 2025-11-23T10:03:07Z|00418|binding|INFO|Removing lport b877a41e-a15e-4b64-bd71-6cac8fa9c439 ovn-installed in OVS Nov 23 05:03:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:07.542 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type 
port 81f96baf-17cd-4ec7-94c4-dd9a0166c427 with type ""#033[00m Nov 23 05:03:07 localhost nova_compute[281952]: 2025-11-23 10:03:07.544 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:07.544 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-8f0b7507-4636-45b3-a4b2-9fdd55176a18', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f0b7507-4636-45b3-a4b2-9fdd55176a18', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e3ed04a-a453-47aa-85a6-639016eec1c2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b877a41e-a15e-4b64-bd71-6cac8fa9c439) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:07.546 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b877a41e-a15e-4b64-bd71-6cac8fa9c439 in datapath 8f0b7507-4636-45b3-a4b2-9fdd55176a18 unbound from our chassis#033[00m Nov 23 05:03:07 localhost 
nova_compute[281952]: 2025-11-23 10:03:07.547 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:07.548 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8f0b7507-4636-45b3-a4b2-9fdd55176a18 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:07.549 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[1d5218a3-a844-4f99-8b92-75f06e563fdf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:07 localhost podman[327714]: Nov 23 05:03:07 localhost podman[327714]: 2025-11-23 10:03:07.698388165 +0000 UTC m=+0.092389357 container create 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:03:07 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:07.724 2 INFO neutron.agent.securitygroups_rpc [None req-63da6581-da02-4188-8e89-eab01a812a16 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['736c2f34-1be4-42fe-9283-c00aaa4f421b', 'a9134bcb-5194-43f8-ac1f-875c59af23f5']#033[00m Nov 23 05:03:07 localhost systemd[1]: Started 
libpod-conmon-11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688.scope. Nov 23 05:03:07 localhost systemd[1]: Started libcrun container. Nov 23 05:03:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cf829e7df822dea19af929cb8462e294ad3327afd2b8aab63876759ece9d62b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:03:07 localhost podman[327714]: 2025-11-23 10:03:07.655285468 +0000 UTC m=+0.049286670 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:03:07 localhost dnsmasq[327677]: exiting on receipt of SIGTERM Nov 23 05:03:07 localhost podman[327725]: 2025-11-23 10:03:07.764509403 +0000 UTC m=+0.117895293 container kill 35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f0b7507-4636-45b3-a4b2-9fdd55176a18, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:03:07 localhost systemd[1]: libpod-35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0.scope: Deactivated successfully. 
Nov 23 05:03:07 localhost podman[327714]: 2025-11-23 10:03:07.808585191 +0000 UTC m=+0.202586383 container init 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:07 localhost podman[327714]: 2025-11-23 10:03:07.819705633 +0000 UTC m=+0.213706825 container start 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 23 05:03:07 localhost dnsmasq[327769]: started, version 2.85 cachesize 150 Nov 23 05:03:07 localhost dnsmasq[327769]: DNS service limited to local subnets Nov 23 05:03:07 localhost dnsmasq[327769]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:03:07 localhost dnsmasq[327769]: warning: no upstream servers configured Nov 23 05:03:07 localhost dnsmasq-dhcp[327769]: DHCPv6, static leases only on 2001:db8:0:2::, lease time 1d Nov 23 05:03:07 localhost dnsmasq-dhcp[327769]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:03:07 
localhost dnsmasq-dhcp[327769]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 23 05:03:07 localhost dnsmasq[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 1 addresses Nov 23 05:03:07 localhost dnsmasq-dhcp[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host Nov 23 05:03:07 localhost dnsmasq-dhcp[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts Nov 23 05:03:07 localhost podman[327748]: 2025-11-23 10:03:07.858118206 +0000 UTC m=+0.070623787 container died 35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f0b7507-4636-45b3-a4b2-9fdd55176a18, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 23 05:03:07 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:07.888 263258 INFO neutron.agent.dhcp.agent [None req-8688bc5c-3dea-4cc6-99fd-8cc9764024f1 - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:03:02Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2464e9e2-44d8-4196-918c-708e8d6859f0, ip_allocation=immediate, mac_address=fa:16:3e:c6:53:18, name=tempest-PortsIpV6TestJSON-194744152, network_id=0d2bb8b4-9b3e-41c7-b595-54664cfb433a, port_security_enabled=True, project_id=1e3b93ef61044aafb71b30163a32d7ac, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, 
security_groups=['736c2f34-1be4-42fe-9283-c00aaa4f421b', 'a9134bcb-5194-43f8-ac1f-875c59af23f5'], standard_attr_id=2145, status=DOWN, tags=[], tenant_id=1e3b93ef61044aafb71b30163a32d7ac, updated_at=2025-11-23T10:03:06Z on network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a#033[00m Nov 23 05:03:07 localhost podman[327748]: 2025-11-23 10:03:07.900617505 +0000 UTC m=+0.113123046 container remove 35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f0b7507-4636-45b3-a4b2-9fdd55176a18, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:03:07 localhost systemd[1]: libpod-conmon-35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0.scope: Deactivated successfully. 
Nov 23 05:03:07 localhost nova_compute[281952]: 2025-11-23 10:03:07.953 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:07 localhost kernel: device tapb877a41e-a1 left promiscuous mode Nov 23 05:03:07 localhost nova_compute[281952]: 2025-11-23 10:03:07.968 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:07 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:07.981 263258 INFO neutron.agent.dhcp.agent [None req-69d033d8-169e-45e6-ac25-65fe4ac66d6c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:07 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:07.982 263258 INFO neutron.agent.dhcp.agent [None req-69d033d8-169e-45e6-ac25-65fe4ac66d6c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.015 263258 INFO neutron.agent.dhcp.agent [None req-1b1b37ba-85ae-4a3b-87a2-55c9bd34e64d - - - - - -] DHCP configuration for ports {'2464e9e2-44d8-4196-918c-708e8d6859f0', '924d747d-3069-493d-890d-d22289f6cb63', 'c4a920d0-ea35-484e-b16d-855b3c409327', 'eae593c4-b892-466d-a341-fdc33951a395'} is completed#033[00m Nov 23 05:03:08 localhost ovn_controller[154788]: 2025-11-23T10:03:08Z|00419|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:03:08 localhost dnsmasq[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 1 addresses Nov 23 05:03:08 localhost dnsmasq-dhcp[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host Nov 23 05:03:08 localhost dnsmasq-dhcp[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts Nov 23 05:03:08 localhost podman[327790]: 2025-11-23 10:03:08.050009458 +0000 
UTC m=+0.059187455 container kill 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 23 05:03:08 localhost systemd[1]: var-lib-containers-storage-overlay-2fcce9fbacd628dae8de270527f13d43879fe58249a6413d8bf69abd83505308-merged.mount: Deactivated successfully. Nov 23 05:03:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-35ef7495408d240912d0128e1fdcf80bf6c2383c3d137b396219616e74c718a0-userdata-shm.mount: Deactivated successfully. Nov 23 05:03:08 localhost systemd[1]: run-netns-qdhcp\x2d8f0b7507\x2d4636\x2d45b3\x2da4b2\x2d9fdd55176a18.mount: Deactivated successfully. 
Nov 23 05:03:08 localhost nova_compute[281952]: 2025-11-23 10:03:08.083 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:08 localhost dnsmasq[327466]: read /var/lib/neutron/dhcp/6e10a0ab-0466-4df2-91a1-22e4b25912c9/addn_hosts - 0 addresses Nov 23 05:03:08 localhost dnsmasq-dhcp[327466]: read /var/lib/neutron/dhcp/6e10a0ab-0466-4df2-91a1-22e4b25912c9/host Nov 23 05:03:08 localhost dnsmasq-dhcp[327466]: read /var/lib/neutron/dhcp/6e10a0ab-0466-4df2-91a1-22e4b25912c9/opts Nov 23 05:03:08 localhost podman[327821]: 2025-11-23 10:03:08.212693379 +0000 UTC m=+0.067059837 container kill 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 05:03:08 localhost systemd[1]: tmp-crun.fS6y5m.mount: Deactivated successfully. Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent [None req-79cd50a9-85d4-4b74-9e50-a1a32448a522 - - - - - -] Unable to reload_allocations dhcp for 6e10a0ab-0466-4df2-91a1-22e4b25912c9.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap061fe319-39 not found in namespace qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9. 
Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR 
neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Nov 23 05:03:08 
localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent return fut.result() Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent return self.__get_result() Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent raise self._exception Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 
ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap061fe319-39 not found in namespace qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9. Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.242 263258 ERROR neutron.agent.dhcp.agent #033[00m Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.343 263258 INFO neutron.agent.dhcp.agent [None req-2e37a1d0-27d1-4e9b-91ba-b35a2c6cf695 - - - - - -] DHCP configuration for ports {'2464e9e2-44d8-4196-918c-708e8d6859f0'} is completed#033[00m Nov 23 05:03:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:03:08 localhost dnsmasq[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 0 addresses Nov 23 05:03:08 localhost dnsmasq-dhcp[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host Nov 23 05:03:08 localhost dnsmasq-dhcp[327769]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts Nov 23 05:03:08 localhost podman[327856]: 2025-11-23 10:03:08.418275922 +0000 UTC m=+0.059622348 container kill 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 
2025-11-23 10:03:08.579 263258 INFO neutron.agent.dhcp.agent [None req-54274fa7-429b-40de-9645-e5a1e1ac12d0 - - - - - -] Synchronizing state#033[00m Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.774 263258 INFO neutron.agent.dhcp.agent [None req-0bff647a-37c2-4421-8ce0-5d475f777c74 - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.775 263258 INFO neutron.agent.dhcp.agent [-] Starting network 6e10a0ab-0466-4df2-91a1-22e4b25912c9 dhcp configuration#033[00m Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.777 263258 INFO neutron.agent.dhcp.agent [-] Finished network 6e10a0ab-0466-4df2-91a1-22e4b25912c9 dhcp configuration#033[00m Nov 23 05:03:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:08.777 263258 INFO neutron.agent.dhcp.agent [None req-0bff647a-37c2-4421-8ce0-5d475f777c74 - - - - - -] Synchronizing state complete#033[00m Nov 23 05:03:09 localhost dnsmasq[327466]: exiting on receipt of SIGTERM Nov 23 05:03:09 localhost podman[327893]: 2025-11-23 10:03:09.013361414 +0000 UTC m=+0.058415401 container kill 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:09 localhost systemd[1]: libpod-4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543.scope: Deactivated successfully. 
Nov 23 05:03:09 localhost podman[327907]: 2025-11-23 10:03:09.083541506 +0000 UTC m=+0.053822160 container died 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:03:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543-userdata-shm.mount: Deactivated successfully. Nov 23 05:03:09 localhost podman[327907]: 2025-11-23 10:03:09.113838689 +0000 UTC m=+0.084119303 container cleanup 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:03:09 localhost systemd[1]: libpod-conmon-4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543.scope: Deactivated successfully. 
Nov 23 05:03:09 localhost podman[327908]: 2025-11-23 10:03:09.165062627 +0000 UTC m=+0.130557053 container remove 4debf23d8658de6ce41887a26ef9b5ca75e0d66e3bfb39121172f8558bbab543 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6e10a0ab-0466-4df2-91a1-22e4b25912c9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 23 05:03:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:09.301 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:03:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:09.302 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:03:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:09.303 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:03:09 localhost dnsmasq[327769]: exiting on receipt of SIGTERM Nov 23 05:03:09 localhost podman[327954]: 2025-11-23 10:03:09.462047946 +0000 UTC m=+0.060896527 container kill 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 23 05:03:09 localhost systemd[1]: libpod-11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688.scope: Deactivated successfully. Nov 23 05:03:09 localhost podman[327966]: 2025-11-23 10:03:09.525775879 +0000 UTC m=+0.053101287 container died 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 23 05:03:09 localhost podman[327966]: 2025-11-23 10:03:09.556302299 +0000 UTC m=+0.083627657 container cleanup 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:03:09 localhost systemd[1]: 
libpod-conmon-11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688.scope: Deactivated successfully. Nov 23 05:03:09 localhost podman[327973]: 2025-11-23 10:03:09.580095642 +0000 UTC m=+0.094583425 container remove 11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 23 05:03:09 localhost nova_compute[281952]: 2025-11-23 10:03:09.719 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:10 localhost systemd[1]: var-lib-containers-storage-overlay-3cf829e7df822dea19af929cb8462e294ad3327afd2b8aab63876759ece9d62b-merged.mount: Deactivated successfully. Nov 23 05:03:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11ac7e75b6b22e9ada814519331e72414cb1613225834f786c3ee57b199af688-userdata-shm.mount: Deactivated successfully. Nov 23 05:03:10 localhost systemd[1]: var-lib-containers-storage-overlay-ba971e870817c0c1c87f8a7ba62f53f6dd751eec21c91b2626ace24c9cd01064-merged.mount: Deactivated successfully. Nov 23 05:03:10 localhost systemd[1]: run-netns-qdhcp\x2d6e10a0ab\x2d0466\x2d4df2\x2d91a1\x2d22e4b25912c9.mount: Deactivated successfully. 
Nov 23 05:03:10 localhost podman[328046]:
Nov 23 05:03:10 localhost podman[328046]: 2025-11-23 10:03:10.47014584 +0000 UTC m=+0.084753532 container create 47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 05:03:10 localhost systemd[1]: Started libpod-conmon-47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9.scope.
Nov 23 05:03:10 localhost podman[328046]: 2025-11-23 10:03:10.430715446 +0000 UTC m=+0.045323188 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 05:03:10 localhost systemd[1]: tmp-crun.OYJzP2.mount: Deactivated successfully.
Nov 23 05:03:10 localhost systemd[1]: Started libcrun container.
Nov 23 05:03:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ba24ef5a8302e2b3c20cfab428fcbe8c485862935b4b78879171770c8fc9f10/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 05:03:10 localhost podman[328046]: 2025-11-23 10:03:10.562482705 +0000 UTC m=+0.177090417 container init 47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 05:03:10 localhost podman[328046]: 2025-11-23 10:03:10.573307987 +0000 UTC m=+0.187915679 container start 47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 23 05:03:10 localhost dnsmasq[328064]: started, version 2.85 cachesize 150
Nov 23 05:03:10 localhost dnsmasq[328064]: DNS service limited to local subnets
Nov 23 05:03:10 localhost dnsmasq[328064]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 05:03:10 localhost dnsmasq[328064]: warning: no upstream servers configured
Nov 23 05:03:10 localhost dnsmasq-dhcp[328064]: DHCPv6, static leases only on 2001:db8:0:2::, lease time 1d
Nov 23 05:03:10 localhost dnsmasq-dhcp[328064]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Nov 23 05:03:10 localhost dnsmasq[328064]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/addn_hosts - 0 addresses
Nov 23 05:03:10 localhost dnsmasq-dhcp[328064]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/host
Nov 23 05:03:10 localhost dnsmasq-dhcp[328064]: read /var/lib/neutron/dhcp/0d2bb8b4-9b3e-41c7-b595-54664cfb433a/opts
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.808 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.809 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.814 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0e3e868-e2f4-4395-addc-ef5cd04b7c7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.810037', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e0fac2e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': '0335b02f310bfcdf7fb13700cf0a116f65787ddc94335ef2946d73bc7307270e'}]}, 'timestamp': '2025-11-23 10:03:10.815664', '_unique_id': 'a73f099a69ff4389b375e8caaf135c21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.817 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.818 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.818 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fec2fbc-75c6-470a-bc9f-066308d0d288', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.818679', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e1039e6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': 'fe75d2e566ddf4e4919a2a32930ffceacbf3c4d040ab2291dcbc3d5b6aab0c7c'}]}, 'timestamp': '2025-11-23 10:03:10.819270', '_unique_id': 'af46d71b12f04db29b05d37a0edca048'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.820 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.821 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.821 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a98328c3-ddea-4a4f-9a62-82d99387bcb3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.821635', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e10b024-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': '6d1bd2f83efed2d6bc6a43d7fab8d24a5cb4283c8905ee2c5825223989be1418'}]}, 'timestamp': '2025-11-23 10:03:10.822339', '_unique_id': '4227c147df6942118c616ecd1941cfc4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.823 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.824 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.837 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.838 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '155ed4a9-b6fe-46d3-a592-7b700e7c46f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.824988', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e132c64-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.002661504, 'message_signature': '16ae143cd5222606699319d2a4b38d098598cf0a6262becbdd2b44c6fe9d758f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.824988', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e1340d2-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.002661504, 'message_signature': 'c3548e4d3449b56c6c5e370738e57d37cc528009c54af3765d2b7ff65842a0c3'}]}, 'timestamp': '2025-11-23 10:03:10.839077', '_unique_id': 'd336a5606db146b79912be48ac751e21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.840 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.841 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.855 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 17230000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8d1c971-59c4-4953-b494-93f271501b02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17230000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:03:10.841688', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 
'cpu_number': 1}, 'message_id': '9e15d82e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.03303766, 'message_signature': '31ec03992c52a90895bb912d367e9267d6adf5cc8539e6fb8ad50b68a7ba1d34'}]}, 'timestamp': '2025-11-23 10:03:10.856080', '_unique_id': '24730bf0199c4960b30d87fab01278ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.857 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.858 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.858 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '13a3d47e-50eb-4977-942d-05a7f131292c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.858527', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e1653f8-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': 'c2224f4c52ca879f47a5acf705ecfeafa93e7078b8c75a2797a837dfed075367'}]}, 'timestamp': '2025-11-23 10:03:10.859204', '_unique_id': 'bcd61683894845b5a9ca7014f084f314'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.860 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.861 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.861 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b99e09a5-ab29-40ca-b34d-dc1bfc27f4b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.861702', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e16cc5c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': '8f5fbb2a8a1c51ca60d7c7415844da67bc851d8fe5aded0eccdff06a888d90b3'}]}, 'timestamp': '2025-11-23 10:03:10.862281', '_unique_id': '74084e462d4c448aa08436e624c6b149'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:03:10.863 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.863 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.864 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.864 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8d32774b-4e80-4285-8d02-b4f32f5e94a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.864587', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e173e80-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': 'de3ce55ecc478b07fabdd93662127e3d2042338cc9fe17fe9dc9777c291b8a93'}]}, 'timestamp': '2025-11-23 10:03:10.865220', '_unique_id': 'd5682f46d5984aa9b6989c8ae62e1a80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.866 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.867 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.893 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.893 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38dc6279-905e-42b9-841d-5b0e3f78fbd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.867519', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e1b9c64-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': 'f79adf0318f1ae087e1dec08f32ef44f5e37e8ce08c53c6abeb9363ee36d7a45'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.867519', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e1bb00a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': '8f5015d2f818ae65a5024908fb86d81a0ee8471835cf5baaa8c08ed84cab9421'}]}, 'timestamp': '2025-11-23 10:03:10.894296', '_unique_id': '9809d682e77740eba93a34b8123cb8a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.895 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.897 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0bb7a02-9a1c-4bcc-93d7-f5295218a85e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.897201', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e1c3336-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': '68764f4421e94c394f727a6340be7e9109ea814b78bd5e3f662536d8a55778fb'}]}, 'timestamp': '2025-11-23 10:03:10.897676', '_unique_id': '0b2c9551c7064438951a4839e1963070'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.898 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.899 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.899 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e8588c11-7131-42c6-8d85-4eb2c4f3f649', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.899832', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e1c9b46-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.002661504, 'message_signature': '09f5735f5a39fbb5beb1818e509d9f7165cae33d94f8c2a69cb5ca892043865c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.899832', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e1cab9a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.002661504, 'message_signature': 'd7c56293bdf7a6ecc32de31ccfcb384b26443e7c47ccc5629c22ff7ce580a658'}]}, 'timestamp': '2025-11-23 10:03:10.900755', '_unique_id': 'c59342cadb7744e5893ac8a6827d3060'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:03:10.901 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:03:10.901 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.901 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.902 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f78c272e-a4e3-4c02-8680-69396dba8631', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.902986', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e1d14c2-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': '6c051522517119bbefc94e369f7e06e0ac1ed3d4eb9f4de15e9d735d7dabacaf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.902986', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e1d24da-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': 'c3ed51fdef33b26a958cf77bdf45b4c5ef669a1b3f2c9c6dad7aedfe2a802b20'}]}, 'timestamp': '2025-11-23 10:03:10.903829', '_unique_id': '7ab8107b682849978f9384aecb8adee0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.904 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.906 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 23 
05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.906 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae1e8ca7-b27a-4a86-8351-b4b72178b5f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.906419', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e1d9adc-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': 'ed2b899cad525eccc4fe90b9deaeb4a88acf0e25f24f2d95a5a89d7b7cb6997a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.906419', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e1dacac-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': '80c0fcff1345accb9b291fd07f97521047963e83175eba79ea06c7b99843882d'}]}, 'timestamp': '2025-11-23 10:03:10.907307', '_unique_id': '3017044fb6424ded9b841968858aab4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.908 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.909 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5160dda2-c08c-4b81-ae81-d48a3cb36302', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.909642', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e1e189a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': 'f76dd5340a3132a4cc840dc996b39adadcff92960c3c401939185450614a98a0'}]}, 'timestamp': '2025-11-23 10:03:10.910144', '_unique_id': '829541342d574ef7ae2fa1c819aa94f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.911 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.912 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b61c6cf-3d1f-4419-9dca-8f5618a68875', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.912305', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e1e8122-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.002661504, 'message_signature': 'eb636a3863bfff4c54df122350b58ee6e0bc917abffb7a9c36dff6d8571e76af'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.912305', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e1e9400-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.002661504, 'message_signature': '35870cfed0939f8a3c66c77e686e11c1fa68a5946c86853217bd4952866a4c54'}]}, 'timestamp': '2025-11-23 10:03:10.913231', '_unique_id': '901d85a0f1d74179a8d7594ac0323878'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.914 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd25ee790-4f32-4369-9d69-6db82930f72e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.915427', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e1efad0-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': '74d4ea274c458856377c887369cb55bd06cc27be79d793e84818a5971444a2c9'}]}, 'timestamp': '2025-11-23 10:03:10.915921', '_unique_id': '0befcbbfb24548ed9cc4ac30b74204d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.916 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.918 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85be9042-f65c-4e9f-8606-4958c7133d2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:03:10.918228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '9e1f6830-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.03303766, 'message_signature': 'b215422ac38a4b983734c6d3c48042d57a2137de35bfef95bc66bf76151af8c6'}]}, 'timestamp': '2025-11-23 10:03:10.918672', '_unique_id': '6279c0b69b9a4e7e9058406b40198e56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.919 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.920 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c9606e24-6251-4778-bc3e-f7edfa05ef9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.920869', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e1fd0d6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': '18a9f388942341ab7f653183b0896b749e7e27f7a4945881aae9c0c854e48d6d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.920869', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e1fe15c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': '80bf3b7d5c0dcfa1a0e315f756e3a17489fadf056db425d8a31a3173bdb1f0f5'}]}, 'timestamp': '2025-11-23 10:03:10.921763', '_unique_id': 'd6120e5bb07949bc880164dd562c4a22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.922 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '283940b8-1d53-47a0-8b7e-6a280abf2f3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.924548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e20606e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': 'e4f945f9cd9858d433ab0ad7e0285ac2103ed25e402598471ed612259b42757d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': 
'2025-11-23T10:03:10.924548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e207310-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': '890cd471dfdac864f537369fbd3eb6fb915b147f1a3011f6c4c310677f013ebc'}]}, 'timestamp': '2025-11-23 10:03:10.925497', '_unique_id': 'd8ec1eeceb68418d8732675db67ca790'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.927 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.928 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '096a7b19-8455-44b4-9c6e-95b76600a2c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:03:10.927714', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9e20dbfc-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': 'be6e5072d52e5523d02c00f1cd25352c711a2a3c38ed978f10bea3b7a32b514b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': 
'1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:03:10.927714', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9e20ecdc-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12134.045150974, 'message_signature': '6d30d8842ca5f596da206f2c42ef984c42a8affd2c2ff626f65e9334aa5eb0c2'}]}, 'timestamp': '2025-11-23 10:03:10.928614', '_unique_id': '32e2ad4ec1ac4cb5ab7126c6f3280b34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.929 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.930 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.930 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd2a71df-e17e-42fd-a627-af76004d1dca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:03:10.930506', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '9e2144b6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12133.987669333, 'message_signature': '32aafc37a153060eced10c2b3c0674d2d6e1300aab82d6cffcc90c04c81037a4'}]}, 'timestamp': '2025-11-23 10:03:10.930802', '_unique_id': '9f232413dc1c423eb554c9cc329b66e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR 
oslo_messaging.notify.messaging Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:03:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:03:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:03:10.931 12 ERROR oslo_messaging.notify.messaging Nov 23 05:03:11 localhost dnsmasq[328064]: exiting on receipt of SIGTERM Nov 23 05:03:11 localhost systemd[1]: tmp-crun.vbunDl.mount: Deactivated successfully. Nov 23 05:03:11 localhost podman[328082]: 2025-11-23 10:03:11.181583337 +0000 UTC m=+0.052220530 container kill 47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:11 localhost systemd[1]: libpod-47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9.scope: Deactivated successfully. 
Nov 23 05:03:11 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:11.230 263258 INFO neutron.agent.dhcp.agent [None req-020f4ddf-6b4e-44a2-b39b-a1ca5e7a7bec - - - - - -] DHCP configuration for ports {'924d747d-3069-493d-890d-d22289f6cb63', 'c4a920d0-ea35-484e-b16d-855b3c409327', 'eae593c4-b892-466d-a341-fdc33951a395'} is completed#033[00m Nov 23 05:03:11 localhost podman[328095]: 2025-11-23 10:03:11.241554004 +0000 UTC m=+0.047691280 container died 47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 05:03:11 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:11.252 2 INFO neutron.agent.securitygroups_rpc [None req-261eb8d2-9d50-4b0d-8acd-8f9cb046e671 80c914b51bd84d48bc00092fc8abcee7 1e3b93ef61044aafb71b30163a32d7ac - - default default] Security group member updated ['d7ead8f7-80d5-4103-ab91-19b87956485a']#033[00m Nov 23 05:03:11 localhost podman[328095]: 2025-11-23 10:03:11.267296367 +0000 UTC m=+0.073410703 container cleanup 47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:11 localhost systemd[1]: libpod-conmon-47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9.scope: Deactivated successfully. Nov 23 05:03:11 localhost podman[328097]: 2025-11-23 10:03:11.337166099 +0000 UTC m=+0.137042793 container remove 47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0d2bb8b4-9b3e-41c7-b595-54664cfb433a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:11 localhost nova_compute[281952]: 2025-11-23 10:03:11.348 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:11 localhost kernel: device tap924d747d-30 left promiscuous mode Nov 23 05:03:11 localhost ovn_controller[154788]: 2025-11-23T10:03:11Z|00420|binding|INFO|Releasing lport 924d747d-3069-493d-890d-d22289f6cb63 from this chassis (sb_readonly=0) Nov 23 05:03:11 localhost ovn_controller[154788]: 2025-11-23T10:03:11Z|00421|binding|INFO|Setting lport 924d747d-3069-493d-890d-d22289f6cb63 down in Southbound Nov 23 05:03:11 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:11.361 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8:0:2::2/64 2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-0d2bb8b4-9b3e-41c7-b595-54664cfb433a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d2bb8b4-9b3e-41c7-b595-54664cfb433a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e3b93ef61044aafb71b30163a32d7ac', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c46eb8d9-f4c4-4e34-a4e0-3a70a48cb8ea, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=924d747d-3069-493d-890d-d22289f6cb63) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:11 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:11.362 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 924d747d-3069-493d-890d-d22289f6cb63 in datapath 0d2bb8b4-9b3e-41c7-b595-54664cfb433a unbound from our chassis#033[00m Nov 23 05:03:11 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:11.364 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0d2bb8b4-9b3e-41c7-b595-54664cfb433a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:11 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:11.365 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[5567c206-9958-4255-a311-92243606c033]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:11 localhost nova_compute[281952]: 2025-11-23 10:03:11.371 
281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:11 localhost nova_compute[281952]: 2025-11-23 10:03:11.793 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:11 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:11.853 263258 INFO neutron.agent.dhcp.agent [None req-bbe7d4cd-ac9e-415c-9900-b81395b5eb50 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:11 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:11.854 263258 INFO neutron.agent.dhcp.agent [None req-bbe7d4cd-ac9e-415c-9900-b81395b5eb50 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:11 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:11.854 263258 INFO neutron.agent.dhcp.agent [None req-bbe7d4cd-ac9e-415c-9900-b81395b5eb50 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:11 localhost podman[240668]: time="2025-11-23T10:03:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:03:11 localhost podman[240668]: @ - - [23/Nov/2025:10:03:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159325 "" "Go-http-client/1.1" Nov 23 05:03:11 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:11.942 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:11 localhost podman[240668]: @ - - [23/Nov/2025:10:03:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20198 "" "Go-http-client/1.1" Nov 23 05:03:11 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e142 e142: 6 total, 6 up, 6 in Nov 23 05:03:12 localhost systemd[1]: 
var-lib-containers-storage-overlay-7ba24ef5a8302e2b3c20cfab428fcbe8c485862935b4b78879171770c8fc9f10-merged.mount: Deactivated successfully. Nov 23 05:03:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-47c9128184f5b90443e4b1c51408f28face4a6046b4fbf9233e4295d2b601fe9-userdata-shm.mount: Deactivated successfully. Nov 23 05:03:12 localhost systemd[1]: run-netns-qdhcp\x2d0d2bb8b4\x2d9b3e\x2d41c7\x2db595\x2d54664cfb433a.mount: Deactivated successfully. Nov 23 05:03:12 localhost nova_compute[281952]: 2025-11-23 10:03:12.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:03:12 localhost ovn_controller[154788]: 2025-11-23T10:03:12Z|00422|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:03:12 localhost nova_compute[281952]: 2025-11-23 10:03:12.248 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:12 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e143 e143: 6 total, 6 up, 6 in Nov 23 05:03:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:13.093 263258 INFO neutron.agent.linux.ip_lib [None req-6fb7b696-855a-49d5-9b13-4f354ffb2709 - - - - - -] Device tap7b7efa3a-46 cannot be used as it has no MAC address#033[00m Nov 23 05:03:13 localhost nova_compute[281952]: 2025-11-23 10:03:13.170 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:13 localhost kernel: device tap7b7efa3a-46 entered promiscuous mode Nov 23 05:03:13 localhost ovn_controller[154788]: 2025-11-23T10:03:13Z|00423|binding|INFO|Claiming lport 7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36 for this 
chassis. Nov 23 05:03:13 localhost ovn_controller[154788]: 2025-11-23T10:03:13Z|00424|binding|INFO|7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36: Claiming unknown Nov 23 05:03:13 localhost NetworkManager[5975]: [1763892193.1780] manager: (tap7b7efa3a-46): new Generic device (/org/freedesktop/NetworkManager/Devices/69) Nov 23 05:03:13 localhost nova_compute[281952]: 2025-11-23 10:03:13.180 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:13 localhost systemd-udevd[328133]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:03:13 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:13.193 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-61421cbd-202e-4179-bacd-7ac7c4ff8280', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61421cbd-202e-4179-bacd-7ac7c4ff8280', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed19edd7-37c8-4522-837f-b32faf388727, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:13 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:13.194 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36 in datapath 61421cbd-202e-4179-bacd-7ac7c4ff8280 bound to our chassis#033[00m Nov 23 05:03:13 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:13.197 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port a6c5f0c7-9090-48fd-88ac-c8698fa30062 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:03:13 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:13.197 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 61421cbd-202e-4179-bacd-7ac7c4ff8280, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:03:13 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:13.198 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[5582a4c5-0c42-4ab8-b34b-03d2a1075a31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:13 localhost ovn_controller[154788]: 2025-11-23T10:03:13Z|00425|binding|INFO|Setting lport 7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36 ovn-installed in OVS Nov 23 05:03:13 localhost ovn_controller[154788]: 2025-11-23T10:03:13Z|00426|binding|INFO|Setting lport 7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36 up in Southbound Nov 23 05:03:13 localhost nova_compute[281952]: 2025-11-23 10:03:13.215 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:13 localhost nova_compute[281952]: 2025-11-23 10:03:13.250 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:13 localhost nova_compute[281952]: 2025-11-23 10:03:13.288 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:13 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:03:14 localhost podman[328188]: Nov 23 05:03:14 localhost podman[328188]: 2025-11-23 10:03:14.12431696 +0000 UTC m=+0.088312701 container create c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61421cbd-202e-4179-bacd-7ac7c4ff8280, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Nov 23 05:03:14 localhost systemd[1]: Started libpod-conmon-c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08.scope. Nov 23 05:03:14 localhost podman[328188]: 2025-11-23 10:03:14.08311636 +0000 UTC m=+0.047112111 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:03:14 localhost systemd[1]: Started libcrun container. 
Nov 23 05:03:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aafddc39c2a04befbf58d4bf8d8ebb00098dbf032214687c1536c4fc26c16c3a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:03:14 localhost podman[328188]: 2025-11-23 10:03:14.204283554 +0000 UTC m=+0.168279295 container init c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61421cbd-202e-4179-bacd-7ac7c4ff8280, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:14 localhost podman[328188]: 2025-11-23 10:03:14.215134818 +0000 UTC m=+0.179130569 container start c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61421cbd-202e-4179-bacd-7ac7c4ff8280, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 23 05:03:14 localhost dnsmasq[328206]: started, version 2.85 cachesize 150 Nov 23 05:03:14 localhost dnsmasq[328206]: DNS service limited to local subnets Nov 23 05:03:14 localhost dnsmasq[328206]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:03:14 localhost dnsmasq[328206]: warning: no upstream servers 
configured Nov 23 05:03:14 localhost dnsmasq-dhcp[328206]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:03:14 localhost dnsmasq[328206]: read /var/lib/neutron/dhcp/61421cbd-202e-4179-bacd-7ac7c4ff8280/addn_hosts - 0 addresses Nov 23 05:03:14 localhost dnsmasq-dhcp[328206]: read /var/lib/neutron/dhcp/61421cbd-202e-4179-bacd-7ac7c4ff8280/host Nov 23 05:03:14 localhost dnsmasq-dhcp[328206]: read /var/lib/neutron/dhcp/61421cbd-202e-4179-bacd-7ac7c4ff8280/opts Nov 23 05:03:14 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:14.290 263258 INFO neutron.agent.dhcp.agent [None req-35536409-a725-405e-a2a6-15bf8d52ff9c - - - - - -] DHCP configuration for ports {'71598bfa-e4b2-4566-9a0f-19c8303297a9'} is completed#033[00m Nov 23 05:03:14 localhost dnsmasq[328206]: exiting on receipt of SIGTERM Nov 23 05:03:14 localhost systemd[1]: libpod-c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08.scope: Deactivated successfully. Nov 23 05:03:14 localhost podman[328225]: 2025-11-23 10:03:14.50119632 +0000 UTC m=+0.060776403 container kill c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61421cbd-202e-4179-bacd-7ac7c4ff8280, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 23 05:03:14 localhost podman[328237]: 2025-11-23 10:03:14.559255458 +0000 UTC m=+0.045541284 container died c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61421cbd-202e-4179-bacd-7ac7c4ff8280, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:14 localhost podman[328237]: 2025-11-23 10:03:14.58397622 +0000 UTC m=+0.070262046 container cleanup c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61421cbd-202e-4179-bacd-7ac7c4ff8280, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:03:14 localhost systemd[1]: libpod-conmon-c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08.scope: Deactivated successfully. 
Nov 23 05:03:14 localhost podman[328244]: 2025-11-23 10:03:14.63006122 +0000 UTC m=+0.105011876 container remove c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-61421cbd-202e-4179-bacd-7ac7c4ff8280, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:14 localhost ovn_controller[154788]: 2025-11-23T10:03:14Z|00427|binding|INFO|Releasing lport 7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36 from this chassis (sb_readonly=0) Nov 23 05:03:14 localhost ovn_controller[154788]: 2025-11-23T10:03:14Z|00428|binding|INFO|Setting lport 7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36 down in Southbound Nov 23 05:03:14 localhost kernel: device tap7b7efa3a-46 left promiscuous mode Nov 23 05:03:14 localhost nova_compute[281952]: 2025-11-23 10:03:14.642 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:14 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:14.650 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-61421cbd-202e-4179-bacd-7ac7c4ff8280', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-61421cbd-202e-4179-bacd-7ac7c4ff8280', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed19edd7-37c8-4522-837f-b32faf388727, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:14 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:14.652 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 7b7efa3a-46b8-40b3-a3b6-e4b3a1c76e36 in datapath 61421cbd-202e-4179-bacd-7ac7c4ff8280 unbound from our chassis#033[00m Nov 23 05:03:14 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:14.653 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 61421cbd-202e-4179-bacd-7ac7c4ff8280 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:14 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:14.654 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[1a415630-1979-4cf9-b221-f4d67b7fb6a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:14 localhost nova_compute[281952]: 2025-11-23 10:03:14.662 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:15 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:15.094 263258 INFO 
neutron.agent.dhcp.agent [None req-7c4593f8-5b68-45ff-88c2-d65b160f6202 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:15 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:15.095 263258 INFO neutron.agent.dhcp.agent [None req-7c4593f8-5b68-45ff-88c2-d65b160f6202 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:15 localhost systemd[1]: var-lib-containers-storage-overlay-aafddc39c2a04befbf58d4bf8d8ebb00098dbf032214687c1536c4fc26c16c3a-merged.mount: Deactivated successfully. Nov 23 05:03:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c123c68c7204d81ae69561f577bceab2fe540b0536ef96c87c7196e34da81d08-userdata-shm.mount: Deactivated successfully. Nov 23 05:03:15 localhost systemd[1]: run-netns-qdhcp\x2d61421cbd\x2d202e\x2d4179\x2dbacd\x2d7ac7c4ff8280.mount: Deactivated successfully. Nov 23 05:03:15 localhost nova_compute[281952]: 2025-11-23 10:03:15.219 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:03:15 localhost nova_compute[281952]: 2025-11-23 10:03:15.220 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:03:15 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:15.280 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:16 localhost ovn_controller[154788]: 2025-11-23T10:03:16Z|00429|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:03:16 localhost nova_compute[281952]: 2025-11-23 
10:03:16.052 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:16 localhost nova_compute[281952]: 2025-11-23 10:03:16.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:03:16 localhost nova_compute[281952]: 2025-11-23 10:03:16.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 05:03:16 localhost nova_compute[281952]: 2025-11-23 10:03:16.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 05:03:16 localhost nova_compute[281952]: 2025-11-23 10:03:16.295 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 05:03:16 localhost nova_compute[281952]: 2025-11-23 10:03:16.295 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 05:03:16 localhost nova_compute[281952]: 2025-11-23 10:03:16.296 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance 
_get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 05:03:16 localhost nova_compute[281952]: 2025-11-23 10:03:16.297 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 05:03:16 localhost nova_compute[281952]: 2025-11-23 10:03:16.835 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:03:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:03:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:03:17 localhost systemd[1]: tmp-crun.xXndTU.mount: Deactivated successfully. 
Nov 23 05:03:17 localhost podman[328269]: 2025-11-23 10:03:17.05822939 +0000 UTC m=+0.098014550 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:17 localhost podman[328270]: 2025-11-23 10:03:17.099289946 +0000 UTC 
m=+0.136737054 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red 
Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64) Nov 23 05:03:17 localhost podman[328269]: 2025-11-23 10:03:17.101459753 +0000 UTC m=+0.141244923 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS) Nov 23 05:03:17 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:03:17 localhost podman[328268]: 2025-11-23 10:03:17.129679072 +0000 UTC m=+0.173023201 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller) Nov 23 05:03:17 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e144 e144: 6 total, 6 up, 6 in Nov 23 05:03:17 localhost podman[328270]: 2025-11-23 10:03:17.182307593 +0000 UTC m=+0.219754711 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9) Nov 23 05:03:17 localhost podman[328268]: 2025-11-23 10:03:17.197737358 +0000 UTC m=+0.241081507 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, 
config_id=ovn_controller, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 23 05:03:17 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:03:17 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:03:17 localhost nova_compute[281952]: 2025-11-23 10:03:17.248 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 05:03:17 localhost nova_compute[281952]: 2025-11-23 10:03:17.267 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 05:03:17 localhost nova_compute[281952]: 2025-11-23 10:03:17.268 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] 
Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 05:03:17 localhost nova_compute[281952]: 2025-11-23 10:03:17.269 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:03:17 localhost nova_compute[281952]: 2025-11-23 10:03:17.269 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 05:03:17 localhost nova_compute[281952]: 2025-11-23 10:03:17.418 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:17 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e145 e145: 6 total, 6 up, 6 in Nov 23 05:03:18 localhost dnsmasq[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/addn_hosts - 0 addresses Nov 23 05:03:18 localhost dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/host Nov 23 05:03:18 localhost podman[328349]: 2025-11-23 10:03:18.016625285 +0000 UTC m=+0.066933543 container kill c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, 
tcib_managed=true) Nov 23 05:03:18 localhost dnsmasq-dhcp[324132]: read /var/lib/neutron/dhcp/31b977a7-a37c-42ba-bed9-7b22959f6310/opts Nov 23 05:03:18 localhost nova_compute[281952]: 2025-11-23 10:03:18.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:03:18 localhost ovn_controller[154788]: 2025-11-23T10:03:18Z|00430|binding|INFO|Releasing lport ed2c180f-1a61-4a88-a761-adcb953abd22 from this chassis (sb_readonly=0) Nov 23 05:03:18 localhost ovn_controller[154788]: 2025-11-23T10:03:18Z|00431|binding|INFO|Setting lport ed2c180f-1a61-4a88-a761-adcb953abd22 down in Southbound Nov 23 05:03:18 localhost kernel: device taped2c180f-1a left promiscuous mode Nov 23 05:03:18 localhost nova_compute[281952]: 2025-11-23 10:03:18.350 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:18 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:18.356 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-31b977a7-a37c-42ba-bed9-7b22959f6310', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31b977a7-a37c-42ba-bed9-7b22959f6310', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '10f8dd7c838246c58f1d2c4efc771237', 
'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6bd8c251-3bb7-4e15-9c8d-7fcd2e804fa5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ed2c180f-1a61-4a88-a761-adcb953abd22) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:18 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:18.358 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ed2c180f-1a61-4a88-a761-adcb953abd22 in datapath 31b977a7-a37c-42ba-bed9-7b22959f6310 unbound from our chassis#033[00m Nov 23 05:03:18 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:18.361 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 31b977a7-a37c-42ba-bed9-7b22959f6310, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:03:18 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:18.362 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c15fa6b3-093e-4f59-8dd8-eb701a9d9dca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:18 localhost nova_compute[281952]: 2025-11-23 10:03:18.371 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:03:19 localhost nova_compute[281952]: 2025-11-23 10:03:19.213 281956 DEBUG oslo_service.periodic_task [None 
req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:03:19 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:19.367 263258 INFO neutron.agent.linux.ip_lib [None req-8153408d-0707-4f00-908f-cc93975263e6 - - - - - -] Device tapa62eb638-20 cannot be used as it has no MAC address#033[00m Nov 23 05:03:19 localhost nova_compute[281952]: 2025-11-23 10:03:19.429 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:19 localhost kernel: device tapa62eb638-20 entered promiscuous mode Nov 23 05:03:19 localhost NetworkManager[5975]: [1763892199.4361] manager: (tapa62eb638-20): new Generic device (/org/freedesktop/NetworkManager/Devices/70) Nov 23 05:03:19 localhost systemd-udevd[328383]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:03:19 localhost ovn_controller[154788]: 2025-11-23T10:03:19Z|00432|binding|INFO|Claiming lport a62eb638-209a-4f5d-90e4-610cfcccbc39 for this chassis. 
Nov 23 05:03:19 localhost ovn_controller[154788]: 2025-11-23T10:03:19Z|00433|binding|INFO|a62eb638-209a-4f5d-90e4-610cfcccbc39: Claiming unknown Nov 23 05:03:19 localhost nova_compute[281952]: 2025-11-23 10:03:19.439 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:19.450 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-326109dc-7744-4dc1-8604-7d25ed028442', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-326109dc-7744-4dc1-8604-7d25ed028442', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9c2e626d-1a69-43ac-9d65-b501a9547aeb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a62eb638-209a-4f5d-90e4-610cfcccbc39) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:19.452 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a62eb638-209a-4f5d-90e4-610cfcccbc39 in datapath 
326109dc-7744-4dc1-8604-7d25ed028442 bound to our chassis#033[00m Nov 23 05:03:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:19.454 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 326109dc-7744-4dc1-8604-7d25ed028442 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:19.457 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cecd540f-8f8d-462b-81e2-1dc077c4be17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:19 localhost ovn_controller[154788]: 2025-11-23T10:03:19Z|00434|binding|INFO|Setting lport a62eb638-209a-4f5d-90e4-610cfcccbc39 ovn-installed in OVS Nov 23 05:03:19 localhost ovn_controller[154788]: 2025-11-23T10:03:19Z|00435|binding|INFO|Setting lport a62eb638-209a-4f5d-90e4-610cfcccbc39 up in Southbound Nov 23 05:03:19 localhost nova_compute[281952]: 2025-11-23 10:03:19.471 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:19 localhost nova_compute[281952]: 2025-11-23 10:03:19.698 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:19 localhost nova_compute[281952]: 2025-11-23 10:03:19.727 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:19 localhost nova_compute[281952]: 2025-11-23 10:03:19.764 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:19 localhost nova_compute[281952]: 2025-11-23 10:03:19.796 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:20 localhost nova_compute[281952]: 2025-11-23 10:03:20.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:03:20 localhost nova_compute[281952]: 2025-11-23 10:03:20.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:03:20 localhost nova_compute[281952]: 2025-11-23 10:03:20.241 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:03:20 localhost nova_compute[281952]: 2025-11-23 10:03:20.241 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:03:20 localhost nova_compute[281952]: 2025-11-23 10:03:20.241 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 05:03:20 localhost nova_compute[281952]: 2025-11-23 
10:03:20.242 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:03:20 localhost podman[328457]: Nov 23 05:03:20 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:03:20 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2320941116' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:03:20 localhost podman[328457]: 2025-11-23 10:03:20.645690244 +0000 UTC m=+0.086829316 container create 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:03:20 localhost nova_compute[281952]: 2025-11-23 10:03:20.657 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:03:20 localhost systemd[1]: Started libpod-conmon-0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f.scope. Nov 23 05:03:20 localhost systemd[1]: Started libcrun container. 
Nov 23 05:03:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/285ef1a63d76dfccf3c353674c61e591f2fc967ae03a06c927284b817a3c25d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:03:20 localhost podman[328457]: 2025-11-23 10:03:20.604193806 +0000 UTC m=+0.045332898 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:03:20 localhost podman[328457]: 2025-11-23 10:03:20.705523787 +0000 UTC m=+0.146662859 container init 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:03:20 localhost podman[328457]: 2025-11-23 10:03:20.711504841 +0000 UTC m=+0.152643903 container start 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 23 05:03:20 localhost nova_compute[281952]: 2025-11-23 10:03:20.714 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path 
_get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:03:20 localhost nova_compute[281952]: 2025-11-23 10:03:20.715 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:03:20 localhost dnsmasq[328477]: started, version 2.85 cachesize 150 Nov 23 05:03:20 localhost dnsmasq[328477]: DNS service limited to local subnets Nov 23 05:03:20 localhost dnsmasq[328477]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:03:20 localhost dnsmasq[328477]: warning: no upstream servers configured Nov 23 05:03:20 localhost dnsmasq-dhcp[328477]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d Nov 23 05:03:20 localhost dnsmasq[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/addn_hosts - 0 addresses Nov 23 05:03:20 localhost dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/host Nov 23 05:03:20 localhost dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/opts Nov 23 05:03:20 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:20.912 263258 INFO neutron.agent.dhcp.agent [None req-aa4c81ee-b19e-4620-ac08-099c55e0bd42 - - - - - -] DHCP configuration for ports {'a7428151-2964-46ed-a65c-aea518d3a1f3'} is completed#033[00m Nov 23 05:03:20 localhost nova_compute[281952]: 2025-11-23 10:03:20.920 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 05:03:20 localhost nova_compute[281952]: 2025-11-23 10:03:20.921 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11120MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 05:03:20 localhost nova_compute[281952]: 2025-11-23 10:03:20.922 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:03:20 localhost nova_compute[281952]: 2025-11-23 10:03:20.922 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:03:21 localhost nova_compute[281952]: 2025-11-23 10:03:21.085 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 05:03:21 localhost nova_compute[281952]: 2025-11-23 10:03:21.085 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 05:03:21 localhost nova_compute[281952]: 2025-11-23 10:03:21.085 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 05:03:21 localhost nova_compute[281952]: 2025-11-23 10:03:21.461 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:03:21 localhost nova_compute[281952]: 2025-11-23 10:03:21.877 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:21 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:03:21 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/585599417' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:03:21 localhost nova_compute[281952]: 2025-11-23 10:03:21.948 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:03:21 localhost nova_compute[281952]: 2025-11-23 10:03:21.954 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 05:03:21 localhost nova_compute[281952]: 2025-11-23 10:03:21.980 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 05:03:21 localhost nova_compute[281952]: 2025-11-23 10:03:21.983 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 05:03:21 localhost nova_compute[281952]: 2025-11-23 10:03:21.984 281956 DEBUG 
oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.061s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:03:22 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e146 e146: 6 total, 6 up, 6 in Nov 23 05:03:22 localhost nova_compute[281952]: 2025-11-23 10:03:22.985 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:03:22 localhost nova_compute[281952]: 2025-11-23 10:03:22.987 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:03:23 localhost ovn_controller[154788]: 2025-11-23T10:03:23Z|00436|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:03:23 localhost nova_compute[281952]: 2025-11-23 10:03:23.212 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:03:23 localhost dnsmasq[324132]: exiting on receipt of SIGTERM Nov 23 05:03:23 localhost podman[328515]: 2025-11-23 10:03:23.740025478 +0000 UTC m=+0.069785331 container kill c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:03:23 localhost systemd[1]: libpod-c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99.scope: Deactivated successfully. Nov 23 05:03:23 localhost podman[328529]: 2025-11-23 10:03:23.807392043 +0000 UTC m=+0.056244174 container died c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 05:03:23 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:03:23 localhost podman[328529]: 2025-11-23 10:03:23.893112393 +0000 UTC m=+0.141964494 container cleanup c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:23 localhost systemd[1]: libpod-conmon-c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99.scope: Deactivated successfully. Nov 23 05:03:23 localhost podman[328536]: 2025-11-23 10:03:23.918933308 +0000 UTC m=+0.155945624 container remove c71824f3789f403cfa43a3c562ecb3190277ee306f62db03ce06f264d963ed99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31b977a7-a37c-42ba-bed9-7b22959f6310, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 05:03:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:23.947 263258 INFO neutron.agent.dhcp.agent [None req-ceeaad65-fd22-40bb-856c-19912c25a24c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:23.948 263258 INFO neutron.agent.dhcp.agent [None req-ceeaad65-fd22-40bb-856c-19912c25a24c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:24 localhost 
neutron_dhcp_agent[263254]: 2025-11-23 10:03:24.339 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:03:23Z, description=, device_id=546a2558-70b9-4cc7-8d44-6c3be7bdf264, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5a6c76dd-6c35-476b-8f16-632b02edc10d, ip_allocation=immediate, mac_address=fa:16:3e:61:b9:6b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:03:15Z, description=, dns_domain=, id=326109dc-7744-4dc1-8604-7d25ed028442, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-3825805, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57315, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2208, status=ACTIVE, subnets=['73db731d-a2f7-4dcc-a56f-d6dcfd7e2ce1'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:03:18Z, vlan_transparent=None, network_id=326109dc-7744-4dc1-8604-7d25ed028442, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2245, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:03:24Z on network 326109dc-7744-4dc1-8604-7d25ed028442#033[00m Nov 23 05:03:24 localhost dnsmasq[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/addn_hosts - 1 addresses Nov 23 05:03:24 localhost dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/host 
Nov 23 05:03:24 localhost dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/opts Nov 23 05:03:24 localhost podman[328577]: 2025-11-23 10:03:24.555642723 +0000 UTC m=+0.060899537 container kill 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 05:03:24 localhost systemd[1]: var-lib-containers-storage-overlay-d30fb1fb9da3ec81fc09dcff1705485fc67933914b93a576884d8b0f75a2cf79-merged.mount: Deactivated successfully. Nov 23 05:03:24 localhost systemd[1]: run-netns-qdhcp\x2d31b977a7\x2da37c\x2d42ba\x2dbed9\x2d7b22959f6310.mount: Deactivated successfully. 
Nov 23 05:03:24 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:24.908 263258 INFO neutron.agent.dhcp.agent [None req-a2f5c79d-2039-450f-9c98-aabc386843ee - - - - - -] DHCP configuration for ports {'5a6c76dd-6c35-476b-8f16-632b02edc10d'} is completed#033[00m Nov 23 05:03:25 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:25.021 2 INFO neutron.agent.securitygroups_rpc [None req-82f52ebc-5307-4b75-9da0-dca3e27d739d da47bb8e9ce044b7a6c60aeaa303445e 1ed74022d4944d5c8276b163cae1a73a - - default default] Security group member updated ['0c7393ad-63e1-4b57-bb16-ccf2466506e9']#033[00m Nov 23 05:03:25 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:25.906 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:03:23Z, description=, device_id=546a2558-70b9-4cc7-8d44-6c3be7bdf264, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5a6c76dd-6c35-476b-8f16-632b02edc10d, ip_allocation=immediate, mac_address=fa:16:3e:61:b9:6b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:03:15Z, description=, dns_domain=, id=326109dc-7744-4dc1-8604-7d25ed028442, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-3825805, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57315, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2208, status=ACTIVE, subnets=['73db731d-a2f7-4dcc-a56f-d6dcfd7e2ce1'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:03:18Z, vlan_transparent=None, 
network_id=326109dc-7744-4dc1-8604-7d25ed028442, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2245, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:03:24Z on network 326109dc-7744-4dc1-8604-7d25ed028442#033[00m Nov 23 05:03:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:03:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 05:03:26 localhost podman[328599]: 2025-11-23 10:03:26.036352406 +0000 UTC m=+0.085806653 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true) Nov 23 05:03:26 localhost podman[328599]: 2025-11-23 10:03:26.050300666 +0000 UTC m=+0.099754873 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3) Nov 23 05:03:26 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 05:03:26 localhost systemd[1]: tmp-crun.GOqwZs.mount: Deactivated successfully. Nov 23 05:03:26 localhost nova_compute[281952]: 2025-11-23 10:03:26.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:03:26 localhost nova_compute[281952]: 2025-11-23 10:03:26.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 23 05:03:26 localhost dnsmasq[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/addn_hosts - 1 addresses Nov 23 05:03:26 localhost dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/host Nov 23 05:03:26 localhost podman[328643]: 2025-11-23 10:03:26.219227481 +0000 UTC m=+0.079030417 container kill 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:26 localhost dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/opts Nov 23 05:03:26 localhost podman[328601]: 2025-11-23 10:03:26.218647822 +0000 UTC m=+0.266028506 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 05:03:26 localhost podman[328601]: 2025-11-23 10:03:26.231279671 +0000 UTC m=+0.278660335 container exec_died 
a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 05:03:26 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 05:03:26 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 23 05:03:26 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/693879309' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 23 05:03:26 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:26.494 263258 INFO neutron.agent.dhcp.agent [None req-7f8d0217-ed26-446d-9f9a-63e3cbe5de4e - - - - - -] DHCP configuration for ports {'5a6c76dd-6c35-476b-8f16-632b02edc10d'} is completed#033[00m Nov 23 05:03:26 localhost nova_compute[281952]: 2025-11-23 10:03:26.930 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:27 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:27.023 263258 INFO neutron.agent.linux.ip_lib [None req-726027db-25ed-4cd0-bd23-442d668dd27b - - - - - -] Device tap8e54ef2c-e4 cannot be used as it has no MAC address#033[00m Nov 23 05:03:27 localhost systemd[1]: tmp-crun.47e1Ro.mount: Deactivated successfully. Nov 23 05:03:27 localhost nova_compute[281952]: 2025-11-23 10:03:27.046 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:27 localhost kernel: device tap8e54ef2c-e4 entered promiscuous mode Nov 23 05:03:27 localhost NetworkManager[5975]: [1763892207.0553] manager: (tap8e54ef2c-e4): new Generic device (/org/freedesktop/NetworkManager/Devices/71) Nov 23 05:03:27 localhost nova_compute[281952]: 2025-11-23 10:03:27.055 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:27 localhost ovn_controller[154788]: 2025-11-23T10:03:27Z|00437|binding|INFO|Claiming lport 8e54ef2c-e414-469a-ae06-6d0022eb93e2 for this chassis. Nov 23 05:03:27 localhost ovn_controller[154788]: 2025-11-23T10:03:27Z|00438|binding|INFO|8e54ef2c-e414-469a-ae06-6d0022eb93e2: Claiming unknown Nov 23 05:03:27 localhost systemd-udevd[328688]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:03:27 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:27.070 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-42acb812-12e3-44f8-b0aa-aebaa30b36a2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42acb812-12e3-44f8-b0aa-aebaa30b36a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a00f0c05-4895-49ad-a79a-15697f3729df, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8e54ef2c-e414-469a-ae06-6d0022eb93e2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:27 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:27.071 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8e54ef2c-e414-469a-ae06-6d0022eb93e2 in datapath 42acb812-12e3-44f8-b0aa-aebaa30b36a2 bound to our chassis#033[00m Nov 23 05:03:27 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:27.072 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 42acb812-12e3-44f8-b0aa-aebaa30b36a2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:27 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:27.072 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[7a8f3ce5-5491-4059-9645-c5c0f1226a42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:27 localhost journal[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device Nov 23 05:03:27 localhost nova_compute[281952]: 2025-11-23 10:03:27.091 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:27 localhost journal[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device Nov 23 05:03:27 localhost journal[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device Nov 23 05:03:27 localhost ovn_controller[154788]: 2025-11-23T10:03:27Z|00439|binding|INFO|Setting lport 8e54ef2c-e414-469a-ae06-6d0022eb93e2 ovn-installed in OVS Nov 23 05:03:27 localhost ovn_controller[154788]: 2025-11-23T10:03:27Z|00440|binding|INFO|Setting lport 8e54ef2c-e414-469a-ae06-6d0022eb93e2 up in Southbound Nov 23 05:03:27 localhost nova_compute[281952]: 2025-11-23 10:03:27.099 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:27 localhost nova_compute[281952]: 2025-11-23 10:03:27.100 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:27 localhost journal[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device Nov 23 05:03:27 localhost journal[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device Nov 23 05:03:27 localhost journal[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device Nov 23 05:03:27 localhost journal[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device Nov 23 05:03:27 localhost 
journal[230249]: ethtool ioctl error on tap8e54ef2c-e4: No such device Nov 23 05:03:27 localhost nova_compute[281952]: 2025-11-23 10:03:27.127 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:27 localhost nova_compute[281952]: 2025-11-23 10:03:27.150 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e147 e147: 6 total, 6 up, 6 in Nov 23 05:03:27 localhost dnsmasq[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/addn_hosts - 0 addresses Nov 23 05:03:27 localhost dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/host Nov 23 05:03:27 localhost dnsmasq-dhcp[328477]: read /var/lib/neutron/dhcp/326109dc-7744-4dc1-8604-7d25ed028442/opts Nov 23 05:03:27 localhost podman[328744]: 2025-11-23 10:03:27.572823909 +0000 UTC m=+0.070268536 container kill 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:27 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:27.694 2 INFO neutron.agent.securitygroups_rpc [None req-746b6928-7309-4035-bfd7-9d9f95de5728 da47bb8e9ce044b7a6c60aeaa303445e 1ed74022d4944d5c8276b163cae1a73a - - default default] Security group member updated ['0c7393ad-63e1-4b57-bb16-ccf2466506e9']#033[00m Nov 23 05:03:27 localhost 
nova_compute[281952]: 2025-11-23 10:03:27.790 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:27 localhost kernel: device tapa62eb638-20 left promiscuous mode Nov 23 05:03:27 localhost ovn_controller[154788]: 2025-11-23T10:03:27Z|00441|binding|INFO|Releasing lport a62eb638-209a-4f5d-90e4-610cfcccbc39 from this chassis (sb_readonly=0) Nov 23 05:03:27 localhost ovn_controller[154788]: 2025-11-23T10:03:27Z|00442|binding|INFO|Setting lport a62eb638-209a-4f5d-90e4-610cfcccbc39 down in Southbound Nov 23 05:03:27 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:27.806 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-326109dc-7744-4dc1-8604-7d25ed028442', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-326109dc-7744-4dc1-8604-7d25ed028442', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9c2e626d-1a69-43ac-9d65-b501a9547aeb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a62eb638-209a-4f5d-90e4-610cfcccbc39) old=Port_Binding(up=[True], 
chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:27 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:27.808 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a62eb638-209a-4f5d-90e4-610cfcccbc39 in datapath 326109dc-7744-4dc1-8604-7d25ed028442 unbound from our chassis#033[00m Nov 23 05:03:27 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:27.810 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 326109dc-7744-4dc1-8604-7d25ed028442 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:27 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:27.812 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2e70eb-6cd2-482d-a5b0-978271eb622a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:27 localhost nova_compute[281952]: 2025-11-23 10:03:27.818 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:28 localhost sshd[328789]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:03:28 localhost podman[328795]: Nov 23 05:03:28 localhost podman[328795]: 2025-11-23 10:03:28.116351252 +0000 UTC m=+0.090623463 container create 3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42acb812-12e3-44f8-b0aa-aebaa30b36a2, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 
23 05:03:28 localhost podman[328795]: 2025-11-23 10:03:28.067700333 +0000 UTC m=+0.041972544 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:03:28 localhost systemd[1]: Started libpod-conmon-3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575.scope. Nov 23 05:03:28 localhost systemd[1]: Started libcrun container. Nov 23 05:03:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c870a93f87fe203152ea7cfa7a09e2c00cb9ef905fa83f8a1c3bfd9d8722c30/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:03:28 localhost podman[328795]: 2025-11-23 10:03:28.206249552 +0000 UTC m=+0.180521743 container init 3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42acb812-12e3-44f8-b0aa-aebaa30b36a2, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:28 localhost podman[328795]: 2025-11-23 10:03:28.215349352 +0000 UTC m=+0.189621573 container start 3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42acb812-12e3-44f8-b0aa-aebaa30b36a2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:28 localhost dnsmasq[328815]: started, version 
2.85 cachesize 150 Nov 23 05:03:28 localhost dnsmasq[328815]: DNS service limited to local subnets Nov 23 05:03:28 localhost dnsmasq[328815]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:03:28 localhost dnsmasq[328815]: warning: no upstream servers configured Nov 23 05:03:28 localhost dnsmasq-dhcp[328815]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:03:28 localhost dnsmasq[328815]: read /var/lib/neutron/dhcp/42acb812-12e3-44f8-b0aa-aebaa30b36a2/addn_hosts - 0 addresses Nov 23 05:03:28 localhost dnsmasq-dhcp[328815]: read /var/lib/neutron/dhcp/42acb812-12e3-44f8-b0aa-aebaa30b36a2/host Nov 23 05:03:28 localhost dnsmasq-dhcp[328815]: read /var/lib/neutron/dhcp/42acb812-12e3-44f8-b0aa-aebaa30b36a2/opts Nov 23 05:03:28 localhost nova_compute[281952]: 2025-11-23 10:03:28.246 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:03:28 localhost ovn_controller[154788]: 2025-11-23T10:03:28Z|00443|binding|INFO|Removing iface tap8e54ef2c-e4 ovn-installed in OVS Nov 23 05:03:28 localhost ovn_controller[154788]: 2025-11-23T10:03:28Z|00444|binding|INFO|Removing lport 8e54ef2c-e414-469a-ae06-6d0022eb93e2 ovn-installed in OVS Nov 23 05:03:28 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:28.341 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 0b0ac5e7-076f-4df3-8266-dbc49db160ed with type ""#033[00m Nov 23 05:03:28 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:28.343 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to 
row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-42acb812-12e3-44f8-b0aa-aebaa30b36a2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42acb812-12e3-44f8-b0aa-aebaa30b36a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a00f0c05-4895-49ad-a79a-15697f3729df, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8e54ef2c-e414-469a-ae06-6d0022eb93e2) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:28 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:28.345 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8e54ef2c-e414-469a-ae06-6d0022eb93e2 in datapath 42acb812-12e3-44f8-b0aa-aebaa30b36a2 unbound from our chassis#033[00m Nov 23 05:03:28 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:28.347 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 42acb812-12e3-44f8-b0aa-aebaa30b36a2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:28 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:28.348 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[09156e62-825e-430c-8125-86f2a991fd25]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:28 localhost nova_compute[281952]: 2025-11-23 10:03:28.388 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:28 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:28.388 263258 INFO neutron.agent.dhcp.agent [None req-df6d5d6d-5c4f-4daf-9db9-e2ab766fdfbc - - - - - -] DHCP configuration for ports {'5fe96874-43e3-416b-848d-729ed5b3ad15'} is completed#033[00m Nov 23 05:03:28 localhost nova_compute[281952]: 2025-11-23 10:03:28.390 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:03:28 localhost dnsmasq[328815]: exiting on receipt of SIGTERM Nov 23 05:03:28 localhost podman[328834]: 2025-11-23 10:03:28.523502734 +0000 UTC m=+0.048786163 container kill 3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42acb812-12e3-44f8-b0aa-aebaa30b36a2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 23 05:03:28 localhost systemd[1]: libpod-3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575.scope: Deactivated successfully. 
Nov 23 05:03:28 localhost podman[328848]: 2025-11-23 10:03:28.586699702 +0000 UTC m=+0.052160819 container died 3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42acb812-12e3-44f8-b0aa-aebaa30b36a2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:03:28 localhost ovn_controller[154788]: 2025-11-23T10:03:28Z|00445|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:03:28 localhost podman[328848]: 2025-11-23 10:03:28.629521641 +0000 UTC m=+0.094982738 container cleanup 3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42acb812-12e3-44f8-b0aa-aebaa30b36a2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:28 localhost systemd[1]: libpod-conmon-3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575.scope: Deactivated successfully. 
Nov 23 05:03:28 localhost nova_compute[281952]: 2025-11-23 10:03:28.649 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:28 localhost podman[328850]: 2025-11-23 10:03:28.673473515 +0000 UTC m=+0.130788440 container remove 3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42acb812-12e3-44f8-b0aa-aebaa30b36a2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:28 localhost nova_compute[281952]: 2025-11-23 10:03:28.686 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:28 localhost kernel: device tap8e54ef2c-e4 left promiscuous mode Nov 23 05:03:28 localhost nova_compute[281952]: 2025-11-23 10:03:28.699 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:28 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:28.725 263258 INFO neutron.agent.dhcp.agent [None req-50134160-1f00-41cb-b8cd-7f78693adfc6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:28 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:28.726 263258 INFO neutron.agent.dhcp.agent [None req-50134160-1f00-41cb-b8cd-7f78693adfc6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e148 e148: 6 total, 6 up, 6 in Nov 23 05:03:29 localhost 
systemd[1]: var-lib-containers-storage-overlay-3c870a93f87fe203152ea7cfa7a09e2c00cb9ef905fa83f8a1c3bfd9d8722c30-merged.mount: Deactivated successfully. Nov 23 05:03:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3be48e45b6b7059bfc9abc12b4eb1f009e3cfd82f17f5df89c27aec9509e9575-userdata-shm.mount: Deactivated successfully. Nov 23 05:03:29 localhost systemd[1]: run-netns-qdhcp\x2d42acb812\x2d12e3\x2d44f8\x2db0aa\x2daebaa30b36a2.mount: Deactivated successfully. Nov 23 05:03:29 localhost openstack_network_exporter[242668]: ERROR 10:03:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:03:29 localhost openstack_network_exporter[242668]: ERROR 10:03:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:03:29 localhost openstack_network_exporter[242668]: ERROR 10:03:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:03:29 localhost openstack_network_exporter[242668]: ERROR 10:03:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:03:29 localhost openstack_network_exporter[242668]: Nov 23 05:03:29 localhost openstack_network_exporter[242668]: ERROR 10:03:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:03:29 localhost openstack_network_exporter[242668]: Nov 23 05:03:30 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e149 e149: 6 total, 6 up, 6 in Nov 23 05:03:30 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:30.184 2 INFO neutron.agent.securitygroups_rpc [None req-7bdd1f21-8588-4120-93d3-3c530b607701 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']#033[00m Nov 23 05:03:30 localhost dnsmasq[328477]: exiting on receipt of SIGTERM Nov 23 05:03:30 localhost 
podman[328892]: 2025-11-23 10:03:30.382747895 +0000 UTC m=+0.061001889 container kill 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 23 05:03:30 localhost systemd[1]: libpod-0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f.scope: Deactivated successfully. Nov 23 05:03:30 localhost podman[328906]: 2025-11-23 10:03:30.458031342 +0000 UTC m=+0.062473983 container died 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:03:30 localhost systemd[1]: tmp-crun.YZ8ytI.mount: Deactivated successfully. 
Nov 23 05:03:30 localhost podman[328906]: 2025-11-23 10:03:30.50213625 +0000 UTC m=+0.106578831 container cleanup 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 05:03:30 localhost systemd[1]: libpod-conmon-0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f.scope: Deactivated successfully. Nov 23 05:03:30 localhost podman[328908]: 2025-11-23 10:03:30.534826835 +0000 UTC m=+0.131001797 container remove 0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-326109dc-7744-4dc1-8604-7d25ed028442, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 05:03:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:30.756 263258 INFO neutron.agent.dhcp.agent [None req-d52a2512-637f-4eeb-a6ef-d124ca0bc07f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:30.888 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:31 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e150 e150: 6 
total, 6 up, 6 in Nov 23 05:03:31 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:31.149 2 INFO neutron.agent.securitygroups_rpc [None req-04f8e66d-7c6f-40a2-b025-3f4a41cc4961 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']#033[00m Nov 23 05:03:31 localhost nova_compute[281952]: 2025-11-23 10:03:31.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:03:31 localhost nova_compute[281952]: 2025-11-23 10:03:31.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 23 05:03:31 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:31.226 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:31 localhost nova_compute[281952]: 2025-11-23 10:03:31.228 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 23 05:03:31 localhost systemd[1]: var-lib-containers-storage-overlay-285ef1a63d76dfccf3c353674c61e591f2fc967ae03a06c927284b817a3c25d2-merged.mount: Deactivated successfully. Nov 23 05:03:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e88009074d231aa8a6398cc6262191b30be5301ea715e68c3c64c9f3f095c1f-userdata-shm.mount: Deactivated successfully. Nov 23 05:03:31 localhost systemd[1]: run-netns-qdhcp\x2d326109dc\x2d7744\x2d4dc1\x2d8604\x2d7d25ed028442.mount: Deactivated successfully. 
Nov 23 05:03:31 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:31.549 263258 INFO neutron.agent.linux.ip_lib [None req-77f79bd3-900d-4c34-97e6-e52b9371395a - - - - - -] Device tap1874bdbc-c6 cannot be used as it has no MAC address#033[00m Nov 23 05:03:31 localhost nova_compute[281952]: 2025-11-23 10:03:31.570 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:31 localhost kernel: device tap1874bdbc-c6 entered promiscuous mode Nov 23 05:03:31 localhost NetworkManager[5975]: [1763892211.5786] manager: (tap1874bdbc-c6): new Generic device (/org/freedesktop/NetworkManager/Devices/72) Nov 23 05:03:31 localhost nova_compute[281952]: 2025-11-23 10:03:31.578 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:31 localhost ovn_controller[154788]: 2025-11-23T10:03:31Z|00446|binding|INFO|Claiming lport 1874bdbc-c65e-41bc-8fc7-89c003c8f6e9 for this chassis. Nov 23 05:03:31 localhost ovn_controller[154788]: 2025-11-23T10:03:31Z|00447|binding|INFO|1874bdbc-c65e-41bc-8fc7-89c003c8f6e9: Claiming unknown Nov 23 05:03:31 localhost systemd-udevd[328943]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:03:31 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:31.594 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-42ad5b5e-db5f-4e5a-846c-652f436fa783', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42ad5b5e-db5f-4e5a-846c-652f436fa783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34ed355d-aacf-40b7-85ca-d489cba8c831, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1874bdbc-c65e-41bc-8fc7-89c003c8f6e9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:31 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:31.596 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1874bdbc-c65e-41bc-8fc7-89c003c8f6e9 in datapath 42ad5b5e-db5f-4e5a-846c-652f436fa783 bound to our chassis#033[00m Nov 23 05:03:31 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:31.597 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 42ad5b5e-db5f-4e5a-846c-652f436fa783 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:31 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:31.599 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[fe4d5be3-169a-4ccc-a86f-46d94ad8b1e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:31 localhost journal[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device Nov 23 05:03:31 localhost ovn_controller[154788]: 2025-11-23T10:03:31Z|00448|binding|INFO|Setting lport 1874bdbc-c65e-41bc-8fc7-89c003c8f6e9 ovn-installed in OVS Nov 23 05:03:31 localhost ovn_controller[154788]: 2025-11-23T10:03:31Z|00449|binding|INFO|Setting lport 1874bdbc-c65e-41bc-8fc7-89c003c8f6e9 up in Southbound Nov 23 05:03:31 localhost nova_compute[281952]: 2025-11-23 10:03:31.615 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:31 localhost journal[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device Nov 23 05:03:31 localhost journal[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device Nov 23 05:03:31 localhost journal[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device Nov 23 05:03:31 localhost journal[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device Nov 23 05:03:31 localhost journal[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device Nov 23 05:03:31 localhost journal[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device Nov 23 05:03:31 localhost journal[230249]: ethtool ioctl error on tap1874bdbc-c6: No such device Nov 23 05:03:31 localhost nova_compute[281952]: 2025-11-23 10:03:31.658 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:31 localhost nova_compute[281952]: 2025-11-23 10:03:31.720 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:31 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:31.859 2 INFO neutron.agent.securitygroups_rpc [None req-493a4cea-33ac-4b44-8c12-ef9399b27484 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']#033[00m Nov 23 05:03:31 localhost nova_compute[281952]: 2025-11-23 10:03:31.932 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:32 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:32.401 2 INFO neutron.agent.securitygroups_rpc [None req-88038149-5d7c-44ef-b630-13d9beb8c240 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']#033[00m Nov 23 05:03:32 localhost podman[329012]: Nov 23 05:03:32 localhost podman[329012]: 2025-11-23 10:03:32.559013451 +0000 UTC m=+0.060358140 container create c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:32 localhost systemd[1]: Started libpod-conmon-c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35.scope. Nov 23 05:03:32 localhost systemd[1]: Started libcrun container. 
Nov 23 05:03:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/669ca07686ba0518e574dcb34d9d6983765cc1732dd82e1f456d50492dd71e6f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:03:32 localhost podman[329012]: 2025-11-23 10:03:32.622702649 +0000 UTC m=+0.124047378 container init c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 23 05:03:32 localhost podman[329012]: 2025-11-23 10:03:32.523761309 +0000 UTC m=+0.025106048 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:03:32 localhost systemd[1]: tmp-crun.TAhm4w.mount: Deactivated successfully. 
Nov 23 05:03:32 localhost podman[329012]: 2025-11-23 10:03:32.635566677 +0000 UTC m=+0.136911376 container start c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 05:03:32 localhost dnsmasq[329030]: started, version 2.85 cachesize 150
Nov 23 05:03:32 localhost dnsmasq[329030]: DNS service limited to local subnets
Nov 23 05:03:32 localhost dnsmasq[329030]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 05:03:32 localhost dnsmasq[329030]: warning: no upstream servers configured
Nov 23 05:03:32 localhost dnsmasq-dhcp[329030]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 05:03:32 localhost dnsmasq[329030]: read /var/lib/neutron/dhcp/42ad5b5e-db5f-4e5a-846c-652f436fa783/addn_hosts - 0 addresses
Nov 23 05:03:32 localhost dnsmasq-dhcp[329030]: read /var/lib/neutron/dhcp/42ad5b5e-db5f-4e5a-846c-652f436fa783/host
Nov 23 05:03:32 localhost dnsmasq-dhcp[329030]: read /var/lib/neutron/dhcp/42ad5b5e-db5f-4e5a-846c-652f436fa783/opts
Nov 23 05:03:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:32.796 263258 INFO neutron.agent.dhcp.agent [None req-9344b75d-950f-4844-938c-892c83b2be3c - - - - - -] DHCP configuration for ports {'62366403-2af7-4425-bdc9-dc79db3ff768'} is completed#033[00m
Nov 23 05:03:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e151 e151: 6 total, 6 up, 6 in
Nov 23 05:03:33 localhost ceph-mon[300199]:
mon.np0005532585@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:03:33 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:33.429 2 INFO neutron.agent.securitygroups_rpc [None req-6155fed5-7bb5-4f1e-ab16-bb23a0937b77 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']#033[00m Nov 23 05:03:34 localhost ovn_controller[154788]: 2025-11-23T10:03:34Z|00450|binding|INFO|Removing iface tap1874bdbc-c6 ovn-installed in OVS Nov 23 05:03:34 localhost ovn_controller[154788]: 2025-11-23T10:03:34Z|00451|binding|INFO|Removing lport 1874bdbc-c65e-41bc-8fc7-89c003c8f6e9 ovn-installed in OVS Nov 23 05:03:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:34.046 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 1468a584-8a33-4043-9a71-a0ffcebeb0a0 with type ""#033[00m Nov 23 05:03:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:34.047 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-42ad5b5e-db5f-4e5a-846c-652f436fa783', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42ad5b5e-db5f-4e5a-846c-652f436fa783', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34ed355d-aacf-40b7-85ca-d489cba8c831, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1874bdbc-c65e-41bc-8fc7-89c003c8f6e9) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:34 localhost nova_compute[281952]: 2025-11-23 10:03:34.048 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:34.050 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1874bdbc-c65e-41bc-8fc7-89c003c8f6e9 in datapath 42ad5b5e-db5f-4e5a-846c-652f436fa783 unbound from our chassis#033[00m Nov 23 05:03:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:34.052 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42ad5b5e-db5f-4e5a-846c-652f436fa783, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:03:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:34.053 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[1fcb7785-8213-479e-b957-6762c5c5eb09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:34 localhost kernel: device tap1874bdbc-c6 left promiscuous mode Nov 23 05:03:34 localhost nova_compute[281952]: 2025-11-23 10:03:34.055 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:34 localhost nova_compute[281952]: 2025-11-23 10:03:34.076 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e152 e152: 6 total, 6 up, 6 in Nov 23 05:03:34 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:34.755 2 INFO neutron.agent.securitygroups_rpc [None req-0eb2e2be-3d24-4ace-a008-a7d7e29c4328 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']#033[00m Nov 23 05:03:35 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e153 e153: 6 total, 6 up, 6 in Nov 23 05:03:35 localhost podman[329048]: 2025-11-23 10:03:35.213503301 +0000 UTC m=+0.081633820 container kill c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:03:35 localhost dnsmasq[329030]: read /var/lib/neutron/dhcp/42ad5b5e-db5f-4e5a-846c-652f436fa783/addn_hosts - 0 addresses Nov 23 05:03:35 localhost dnsmasq-dhcp[329030]: read /var/lib/neutron/dhcp/42ad5b5e-db5f-4e5a-846c-652f436fa783/host Nov 23 05:03:35 localhost dnsmasq-dhcp[329030]: read /var/lib/neutron/dhcp/42ad5b5e-db5f-4e5a-846c-652f436fa783/opts Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent [None req-c69f5f2a-9d3e-46d9-ba87-e0af38de0c5b - - - - - -] Unable to reload_allocations dhcp for 42ad5b5e-db5f-4e5a-846c-652f436fa783.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap1874bdbc-c6 not found in 
namespace qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783.
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     return fut.result()
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     raise self._exception
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Nov 23
05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap1874bdbc-c6 not found in namespace qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783. Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.251 263258 ERROR neutron.agent.dhcp.agent #033[00m Nov 23 05:03:35 localhost sshd[329062]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:03:35 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:35.454 263258 INFO neutron.agent.linux.ip_lib [None req-db6700fe-b5a8-400d-8f63-8836ad2ec7a0 - - - - - -] Device tapf441590c-f0 cannot be used as it has no MAC address#033[00m Nov 23 05:03:35 localhost nova_compute[281952]: 2025-11-23 10:03:35.487 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:35 localhost kernel: device tapf441590c-f0 entered promiscuous mode Nov 23 05:03:35 localhost NetworkManager[5975]: [1763892215.4962] manager: (tapf441590c-f0): new Generic device (/org/freedesktop/NetworkManager/Devices/73) Nov 23 05:03:35 localhost nova_compute[281952]: 2025-11-23 10:03:35.498 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:35 localhost ovn_controller[154788]: 2025-11-23T10:03:35Z|00452|binding|INFO|Claiming lport f441590c-f097-4418-b5c6-fc684c4c990b for this chassis. Nov 23 05:03:35 localhost ovn_controller[154788]: 2025-11-23T10:03:35Z|00453|binding|INFO|f441590c-f097-4418-b5c6-fc684c4c990b: Claiming unknown Nov 23 05:03:35 localhost systemd-udevd[329073]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:03:35 localhost ovn_controller[154788]: 2025-11-23T10:03:35Z|00454|binding|INFO|Setting lport f441590c-f097-4418-b5c6-fc684c4c990b ovn-installed in OVS Nov 23 05:03:35 localhost journal[230249]: ethtool ioctl error on tapf441590c-f0: No such device Nov 23 05:03:35 localhost nova_compute[281952]: 2025-11-23 10:03:35.528 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:35 localhost journal[230249]: ethtool ioctl error on tapf441590c-f0: No such device Nov 23 05:03:35 localhost journal[230249]: ethtool ioctl error on tapf441590c-f0: No such device Nov 23 05:03:35 localhost journal[230249]: ethtool ioctl error on tapf441590c-f0: No such device Nov 23 05:03:35 localhost journal[230249]: ethtool ioctl error on tapf441590c-f0: No such device Nov 23 05:03:35 localhost ovn_controller[154788]: 2025-11-23T10:03:35Z|00455|binding|INFO|Setting lport f441590c-f097-4418-b5c6-fc684c4c990b up in Southbound Nov 23 05:03:35 localhost journal[230249]: ethtool ioctl error on tapf441590c-f0: No such device Nov 23 05:03:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:35.558 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-4a43af34-e77a-4d75-9a08-a9727e7ca345', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a43af34-e77a-4d75-9a08-a9727e7ca345', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 
'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe156e5f-7616-4580-b024-66548b9558a4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f441590c-f097-4418-b5c6-fc684c4c990b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:35.560 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f441590c-f097-4418-b5c6-fc684c4c990b in datapath 4a43af34-e77a-4d75-9a08-a9727e7ca345 bound to our chassis#033[00m Nov 23 05:03:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:35.561 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4a43af34-e77a-4d75-9a08-a9727e7ca345 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:35 localhost journal[230249]: ethtool ioctl error on tapf441590c-f0: No such device Nov 23 05:03:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:35.563 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a48634-c821-481e-8e85-f6ae0d93207a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:35 localhost journal[230249]: ethtool ioctl error on tapf441590c-f0: No such device Nov 23 05:03:35 localhost nova_compute[281952]: 2025-11-23 10:03:35.575 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:35 localhost nova_compute[281952]: 2025-11-23 10:03:35.604 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:35 localhost ovn_controller[154788]: 2025-11-23T10:03:35Z|00456|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:03:35 localhost nova_compute[281952]: 2025-11-23 10:03:35.937 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:36 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:03:36 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3526128852' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:03:36 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:03:36 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3526128852' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:03:36 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:36.195 2 INFO neutron.agent.securitygroups_rpc [None req-a71f4ada-2ee4-4e29-a05b-0c137a49cc85 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']#033[00m Nov 23 05:03:36 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:36.446 2 INFO neutron.agent.securitygroups_rpc [None req-606c84ff-d0b3-4da5-8d08-f4db462a1bb4 a8a12d646f734219a5736bd9a89106d3 cd27ceae55c44d478998092e7554fd8a - - default default] Security group member updated ['57d92d06-0a9a-469b-b69f-4fb9e6e560cf']#033[00m Nov 23 05:03:36 localhost podman[329145]: Nov 23 05:03:36 localhost podman[329145]: 2025-11-23 10:03:36.581632274 +0000 UTC m=+0.099296402 container create 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 05:03:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:03:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:03:36 localhost systemd[1]: Started libpod-conmon-2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac.scope. 
Nov 23 05:03:36 localhost podman[329145]: 2025-11-23 10:03:36.533350379 +0000 UTC m=+0.051014587 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:03:36 localhost systemd[1]: Started libcrun container. Nov 23 05:03:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efc00143707c6c7059c45fc469c5c19df6826f798c48ffab4fae84399025aa28/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:03:36 localhost podman[329145]: 2025-11-23 10:03:36.685320048 +0000 UTC m=+0.202984176 container init 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 23 05:03:36 localhost dnsmasq[329187]: started, version 2.85 cachesize 150 Nov 23 05:03:36 localhost dnsmasq[329187]: DNS service limited to local subnets Nov 23 05:03:36 localhost dnsmasq[329187]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:03:36 localhost dnsmasq[329187]: warning: no upstream servers configured Nov 23 05:03:36 localhost dnsmasq-dhcp[329187]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:03:36 localhost dnsmasq[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/addn_hosts - 0 addresses Nov 23 05:03:36 localhost dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/host Nov 23 05:03:36 localhost dnsmasq-dhcp[329187]: read 
/var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/opts Nov 23 05:03:36 localhost podman[329159]: 2025-11-23 10:03:36.735964483 +0000 UTC m=+0.105609862 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 05:03:36 localhost podman[329145]: 2025-11-23 10:03:36.745736277 +0000 UTC m=+0.263400405 container start 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:03:36 localhost podman[329159]: 2025-11-23 10:03:36.776343479 +0000 UTC m=+0.145988878 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 05:03:36 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. 
Nov 23 05:03:36 localhost podman[329160]: 2025-11-23 10:03:36.798186837 +0000 UTC m=+0.162954099 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm) Nov 23 05:03:36 localhost podman[329160]: 2025-11-23 10:03:36.813367984 +0000 UTC m=+0.178135256 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:03:36 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 05:03:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:36.832 263258 INFO neutron.agent.dhcp.agent [None req-0bff647a-37c2-4421-8ce0-5d475f777c74 - - - - - -] Synchronizing state#033[00m Nov 23 05:03:36 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:36.981 2 INFO neutron.agent.securitygroups_rpc [None req-f95b4467-0d9c-4e77-aa60-8f1596442e50 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']#033[00m Nov 23 05:03:36 localhost nova_compute[281952]: 2025-11-23 10:03:36.995 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:37 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:36.999 263258 INFO neutron.agent.dhcp.agent [None req-ae3e1a76-bd76-4077-9d87-65d863f72c3b - - - - - -] DHCP configuration for ports {'bfac63e5-6fee-4aa0-b816-de3604981a61'} is completed#033[00m Nov 23 05:03:37 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:37.162 263258 INFO neutron.agent.dhcp.agent [None req-7d16b2fb-0d37-486c-ac3a-228527858e8b - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 23 05:03:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e154 e154: 6 total, 6 up, 6 in Nov 23 05:03:37 localhost dnsmasq[329030]: exiting on receipt of SIGTERM Nov 23 05:03:37 localhost podman[329222]: 2025-11-23 10:03:37.364131645 +0000 UTC m=+0.064008899 container kill c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:03:37 localhost systemd[1]: libpod-c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35.scope: Deactivated successfully. Nov 23 05:03:37 localhost podman[329234]: 2025-11-23 10:03:37.42107809 +0000 UTC m=+0.044520992 container died c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:37 localhost podman[329234]: 2025-11-23 10:03:37.450503677 +0000 UTC m=+0.073946509 container cleanup c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 05:03:37 localhost systemd[1]: libpod-conmon-c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35.scope: Deactivated successfully. 
Nov 23 05:03:37 localhost sshd[329259]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:03:37 localhost podman[329241]: 2025-11-23 10:03:37.526040243 +0000 UTC m=+0.132974457 container remove c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42ad5b5e-db5f-4e5a-846c-652f436fa783, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2) Nov 23 05:03:37 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:37.568 263258 INFO neutron.agent.dhcp.agent [None req-116d79b7-3d46-461e-94b5-c6eecda7a59c - - - - - -] Synchronizing state complete#033[00m Nov 23 05:03:37 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:37.570 263258 INFO neutron.agent.dhcp.agent [None req-db6700fe-b5a8-400d-8f63-8836ad2ec7a0 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:03:35Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2221bad8-4bd8-4391-a72d-5c149805734c, ip_allocation=immediate, mac_address=fa:16:3e:ff:95:cf, name=tempest-RoutersIpV6Test-1138851014, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:03:32Z, description=, dns_domain=, id=4a43af34-e77a-4d75-9a08-a9727e7ca345, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-14611547, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, 
provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=13436, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2295, status=ACTIVE, subnets=['b13803ac-8d12-481f-abee-61d6c126b729'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:03:34Z, vlan_transparent=None, network_id=4a43af34-e77a-4d75-9a08-a9727e7ca345, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['57d92d06-0a9a-469b-b69f-4fb9e6e560cf'], standard_attr_id=2309, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:03:35Z on network 4a43af34-e77a-4d75-9a08-a9727e7ca345#033[00m Nov 23 05:03:37 localhost systemd[1]: var-lib-containers-storage-overlay-669ca07686ba0518e574dcb34d9d6983765cc1732dd82e1f456d50492dd71e6f-merged.mount: Deactivated successfully. Nov 23 05:03:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c7fcefb5c8da70383bb68d079f8bbfcc921709d7229e4e08dfd25f3c9a65ff35-userdata-shm.mount: Deactivated successfully. Nov 23 05:03:37 localhost systemd[1]: run-netns-qdhcp\x2d42ad5b5e\x2ddb5f\x2d4e5a\x2d846c\x2d652f436fa783.mount: Deactivated successfully. 
Nov 23 05:03:37 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:37.617 2 INFO neutron.agent.securitygroups_rpc [None req-dffb97f7-0929-42aa-ac92-4890704581f2 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']#033[00m Nov 23 05:03:37 localhost podman[329286]: 2025-11-23 10:03:37.792333004 +0000 UTC m=+0.065302649 container kill 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 23 05:03:37 localhost dnsmasq[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/addn_hosts - 1 addresses Nov 23 05:03:37 localhost dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/host Nov 23 05:03:37 localhost dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/opts Nov 23 05:03:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e155 e155: 6 total, 6 up, 6 in Nov 23 05:03:38 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:38.037 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:03:35Z, description=, device_id=31c6ba21-90e0-41d9-88c7-cee3c38bbce7, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], 
id=2221bad8-4bd8-4391-a72d-5c149805734c, ip_allocation=immediate, mac_address=fa:16:3e:ff:95:cf, name=tempest-RoutersIpV6Test-1138851014, network_id=4a43af34-e77a-4d75-9a08-a9727e7ca345, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['57d92d06-0a9a-469b-b69f-4fb9e6e560cf'], standard_attr_id=2309, status=ACTIVE, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:03:37Z on network 4a43af34-e77a-4d75-9a08-a9727e7ca345#033[00m Nov 23 05:03:38 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:38.053 263258 INFO neutron.agent.dhcp.agent [None req-7baaf0bf-e893-4af3-83fe-d22af8afb311 - - - - - -] DHCP configuration for ports {'2221bad8-4bd8-4391-a72d-5c149805734c'} is completed#033[00m Nov 23 05:03:38 localhost dnsmasq[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/addn_hosts - 1 addresses Nov 23 05:03:38 localhost dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/host Nov 23 05:03:38 localhost dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/opts Nov 23 05:03:38 localhost podman[329324]: 2025-11-23 10:03:38.233433011 +0000 UTC m=+0.062024239 container kill 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:03:38 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e155 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:03:38 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:38.867 263258 INFO neutron.agent.dhcp.agent [None req-79d7e5b3-beab-4433-9727-cba888515eb6 - - - - - -] DHCP configuration for ports {'2221bad8-4bd8-4391-a72d-5c149805734c'} is completed#033[00m Nov 23 05:03:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e156 e156: 6 total, 6 up, 6 in Nov 23 05:03:39 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:39.221 2 INFO neutron.agent.securitygroups_rpc [None req-9d87b2e2-86da-485b-bbea-cfc6692748e1 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']#033[00m Nov 23 05:03:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:03:39 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1869128489' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:03:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:03:39 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1869128489' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:03:39 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:39.949 2 INFO neutron.agent.securitygroups_rpc [None req-0cc7d2c8-7386-4953-9ec2-42e302a377e6 a8a12d646f734219a5736bd9a89106d3 cd27ceae55c44d478998092e7554fd8a - - default default] Security group member updated ['57d92d06-0a9a-469b-b69f-4fb9e6e560cf']#033[00m Nov 23 05:03:40 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e157 e157: 6 total, 6 up, 6 in Nov 23 05:03:40 localhost dnsmasq[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/addn_hosts - 0 addresses Nov 23 05:03:40 localhost dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/host Nov 23 05:03:40 localhost dnsmasq-dhcp[329187]: read /var/lib/neutron/dhcp/4a43af34-e77a-4d75-9a08-a9727e7ca345/opts Nov 23 05:03:40 localhost podman[329410]: 2025-11-23 10:03:40.274425933 +0000 UTC m=+0.051541564 container kill 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 05:03:40 localhost kernel: device tapf441590c-f0 left promiscuous mode Nov 23 05:03:40 localhost ovn_controller[154788]: 2025-11-23T10:03:40Z|00457|binding|INFO|Releasing lport f441590c-f097-4418-b5c6-fc684c4c990b from this chassis (sb_readonly=0) Nov 23 05:03:40 localhost nova_compute[281952]: 2025-11-23 10:03:40.467 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:40 localhost ovn_controller[154788]: 2025-11-23T10:03:40Z|00458|binding|INFO|Setting lport f441590c-f097-4418-b5c6-fc684c4c990b down in Southbound Nov 23 05:03:40 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:40.479 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-4a43af34-e77a-4d75-9a08-a9727e7ca345', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a43af34-e77a-4d75-9a08-a9727e7ca345', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe156e5f-7616-4580-b024-66548b9558a4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f441590c-f097-4418-b5c6-fc684c4c990b) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:40 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:40.481 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f441590c-f097-4418-b5c6-fc684c4c990b in datapath 4a43af34-e77a-4d75-9a08-a9727e7ca345 unbound from our chassis#033[00m Nov 23 05:03:40 
localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:40.483 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4a43af34-e77a-4d75-9a08-a9727e7ca345 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:40 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:40.483 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[9422df5d-2900-437f-9cd1-4ccf7d5b04fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:40 localhost nova_compute[281952]: 2025-11-23 10:03:40.485 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:40 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:40.590 2 INFO neutron.agent.securitygroups_rpc [None req-bd23e853-4daf-4921-98db-bb97f636a505 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']#033[00m Nov 23 05:03:41 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 05:03:41 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:03:41 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:41.217 2 INFO neutron.agent.securitygroups_rpc [None req-8d10cea8-bd4a-4441-b879-9913f6e3c03c 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:03:41 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e158 e158: 6 total, 6 up, 6 in Nov 23 05:03:41 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:41.712 2 INFO 
neutron.agent.securitygroups_rpc [None req-99b4e496-02de-4948-b003-eb8832f49bd1 92c363a19c824119a63f49b23b9d66e1 6fc3fa728d6f4403acd9944d81eaeb18 - - default default] Security group member updated ['33cf6d6a-dd4c-4287-83e4-f8bbab9f55b2']#033[00m Nov 23 05:03:41 localhost podman[240668]: time="2025-11-23T10:03:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:03:41 localhost podman[240668]: @ - - [23/Nov/2025:10:03:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159318 "" "Go-http-client/1.1" Nov 23 05:03:41 localhost podman[240668]: @ - - [23/Nov/2025:10:03:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20209 "" "Go-http-client/1.1" Nov 23 05:03:42 localhost nova_compute[281952]: 2025-11-23 10:03:42.036 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:42 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:42.185 2 INFO neutron.agent.securitygroups_rpc [None req-dbf580d8-52e8-4ecb-9364-932e14668854 5f7e9736cbc74ce4ac3de51c4ac84504 49ebd7a691dd4ea59ffbe9f5703e77e4 - - default default] Security group rule updated ['d77fc436-3ab1-42e0-a52b-861d18fcc237']#033[00m Nov 23 05:03:42 localhost dnsmasq[329187]: exiting on receipt of SIGTERM Nov 23 05:03:42 localhost systemd[1]: libpod-2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac.scope: Deactivated successfully. 
Nov 23 05:03:42 localhost podman[329487]: 2025-11-23 10:03:42.206290657 +0000 UTC m=+0.059707360 container kill 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 05:03:42 localhost podman[329500]: 2025-11-23 10:03:42.276875414 +0000 UTC m=+0.059370821 container died 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 05:03:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:03:42 localhost podman[329500]: 2025-11-23 10:03:42.316077855 +0000 UTC m=+0.098573222 container cleanup 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 05:03:42 localhost systemd[1]: libpod-conmon-2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac.scope: Deactivated successfully. Nov 23 05:03:42 localhost podman[329502]: 2025-11-23 10:03:42.364596086 +0000 UTC m=+0.137386280 container remove 2669f56771038fdde41a2dd478f133cea8a7fb5c6597337f9c692c07f2f7b6ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4a43af34-e77a-4d75-9a08-a9727e7ca345, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:42.432 263258 INFO neutron.agent.dhcp.agent [None req-a5e4dc26-30c2-4e8a-904b-c9192946c94b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:42.601 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:42 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:42.748 2 INFO 
neutron.agent.securitygroups_rpc [None req-d8c6f897-4254-46a6-9c9c-683b3a672b23 5f7e9736cbc74ce4ac3de51c4ac84504 49ebd7a691dd4ea59ffbe9f5703e77e4 - - default default] Security group rule updated ['d77fc436-3ab1-42e0-a52b-861d18fcc237']#033[00m Nov 23 05:03:42 localhost ovn_controller[154788]: 2025-11-23T10:03:42Z|00459|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:03:42 localhost nova_compute[281952]: 2025-11-23 10:03:42.788 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:42 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e159 e159: 6 total, 6 up, 6 in Nov 23 05:03:43 localhost systemd[1]: var-lib-containers-storage-overlay-efc00143707c6c7059c45fc469c5c19df6826f798c48ffab4fae84399025aa28-merged.mount: Deactivated successfully. Nov 23 05:03:43 localhost systemd[1]: run-netns-qdhcp\x2d4a43af34\x2de77a\x2d4d75\x2d9a08\x2da9727e7ca345.mount: Deactivated successfully. 
Nov 23 05:03:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:03:43 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:03:43 localhost dnsmasq[326522]: exiting on receipt of SIGTERM Nov 23 05:03:43 localhost podman[329547]: 2025-11-23 10:03:43.910326568 +0000 UTC m=+0.061224046 container kill 9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-725f5f75-c3ef-4a36-ba95-e1cd3131878c, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 05:03:43 localhost systemd[1]: libpod-9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf.scope: Deactivated successfully. 
Nov 23 05:03:43 localhost podman[329561]: 2025-11-23 10:03:43.974552382 +0000 UTC m=+0.049898024 container died 9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-725f5f75-c3ef-4a36-ba95-e1cd3131878c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 05:03:44 localhost podman[329561]: 2025-11-23 10:03:44.012586077 +0000 UTC m=+0.087931679 container cleanup 9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-725f5f75-c3ef-4a36-ba95-e1cd3131878c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:03:44 localhost systemd[1]: libpod-conmon-9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf.scope: Deactivated successfully. 
Nov 23 05:03:44 localhost podman[329563]: 2025-11-23 10:03:44.035507598 +0000 UTC m=+0.102156418 container remove 9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-725f5f75-c3ef-4a36-ba95-e1cd3131878c, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 05:03:44 localhost ovn_controller[154788]: 2025-11-23T10:03:44Z|00460|binding|INFO|Releasing lport 6f155903-a394-40bc-9c4e-04010e974788 from this chassis (sb_readonly=0) Nov 23 05:03:44 localhost kernel: device tap6f155903-a3 left promiscuous mode Nov 23 05:03:44 localhost nova_compute[281952]: 2025-11-23 10:03:44.084 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:44 localhost ovn_controller[154788]: 2025-11-23T10:03:44Z|00461|binding|INFO|Setting lport 6f155903-a394-40bc-9c4e-04010e974788 down in Southbound Nov 23 05:03:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:44.104 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-725f5f75-c3ef-4a36-ba95-e1cd3131878c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-725f5f75-c3ef-4a36-ba95-e1cd3131878c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d782672f-ba9a-4b1f-9286-2b53b24a21c0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6f155903-a394-40bc-9c4e-04010e974788) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:44.105 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 6f155903-a394-40bc-9c4e-04010e974788 in datapath 725f5f75-c3ef-4a36-ba95-e1cd3131878c unbound from our chassis#033[00m Nov 23 05:03:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:44.106 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 725f5f75-c3ef-4a36-ba95-e1cd3131878c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:44.107 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed131f6-8928-4b92-828d-53e93133af7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:44 localhost nova_compute[281952]: 2025-11-23 10:03:44.110 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:44 localhost systemd[1]: 
var-lib-containers-storage-overlay-6b5117719bced5fdac21fc3bb297cfd580c490322e10e374d90cdd191e13b6d1-merged.mount: Deactivated successfully. Nov 23 05:03:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e0465b27b8316a2beb89942c597d0526a714651c2649c50fdb01fb359d124bf-userdata-shm.mount: Deactivated successfully. Nov 23 05:03:44 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:44.289 2 INFO neutron.agent.securitygroups_rpc [None req-71dcb7fc-4cba-40db-b8c8-6bad4f6af9d0 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:03:44 localhost systemd[1]: run-netns-qdhcp\x2d725f5f75\x2dc3ef\x2d4a36\x2dba95\x2de1cd3131878c.mount: Deactivated successfully. Nov 23 05:03:44 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:44.401 263258 INFO neutron.agent.dhcp.agent [None req-1f63856d-d247-4b91-86d2-83fbcb612c6e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:44 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:44.402 263258 INFO neutron.agent.dhcp.agent [None req-1f63856d-d247-4b91-86d2-83fbcb612c6e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:44 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:44.510 2 INFO neutron.agent.securitygroups_rpc [None req-71dcb7fc-4cba-40db-b8c8-6bad4f6af9d0 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:03:44 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:44.633 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:44 localhost ovn_controller[154788]: 2025-11-23T10:03:44Z|00462|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) 
Nov 23 05:03:44 localhost nova_compute[281952]: 2025-11-23 10:03:44.935 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:45 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:45.128 2 INFO neutron.agent.securitygroups_rpc [None req-cef19edc-dbb5-4bcd-8945-0c2ead165d91 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:03:45 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:45.150 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:45 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:45.633 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:45 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:45.635 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 05:03:45 localhost nova_compute[281952]: 2025-11-23 10:03:45.635 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:46 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:46.279 2 INFO 
neutron.agent.securitygroups_rpc [None req-b1ac9adf-2928-4e8c-b93d-9a7afa468620 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:03:47 localhost nova_compute[281952]: 2025-11-23 10:03:47.077 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:47 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:47.439 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:03:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:03:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:03:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e160 e160: 6 total, 6 up, 6 in Nov 23 05:03:48 localhost systemd[1]: tmp-crun.bMhEoW.mount: Deactivated successfully. 
Nov 23 05:03:48 localhost podman[329593]: 2025-11-23 10:03:48.064938357 +0000 UTC m=+0.109134238 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:03:48 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:48.066 
263258 INFO neutron.agent.linux.ip_lib [None req-8a5fbce6-c093-4db1-ae3f-ba673c83a5a6 - - - - - -] Device tapf2565742-27 cannot be used as it has no MAC address#033[00m Nov 23 05:03:48 localhost nova_compute[281952]: 2025-11-23 10:03:48.094 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:48 localhost podman[329593]: 2025-11-23 10:03:48.102393575 +0000 UTC m=+0.146589416 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 23 05:03:48 localhost kernel: device tapf2565742-27 entered promiscuous mode Nov 23 05:03:48 localhost NetworkManager[5975]: [1763892228.1064] manager: (tapf2565742-27): new Generic device (/org/freedesktop/NetworkManager/Devices/74) Nov 23 05:03:48 localhost podman[329594]: 2025-11-23 10:03:48.110088237 +0000 UTC m=+0.149839525 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, config_id=edpm, vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 23 05:03:48 localhost systemd-udevd[329645]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:03:48 localhost podman[329594]: 2025-11-23 10:03:48.141610157 +0000 UTC m=+0.181361445 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc.) Nov 23 05:03:48 localhost ovn_controller[154788]: 2025-11-23T10:03:48Z|00463|binding|INFO|Claiming lport f2565742-2705-427d-b837-b10dc9f90604 for this chassis. Nov 23 05:03:48 localhost ovn_controller[154788]: 2025-11-23T10:03:48Z|00464|binding|INFO|f2565742-2705-427d-b837-b10dc9f90604: Claiming unknown Nov 23 05:03:48 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:03:48 localhost nova_compute[281952]: 2025-11-23 10:03:48.143 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:48 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 05:03:48 localhost ovn_controller[154788]: 2025-11-23T10:03:48Z|00465|binding|INFO|Setting lport f2565742-2705-427d-b837-b10dc9f90604 ovn-installed in OVS Nov 23 05:03:48 localhost nova_compute[281952]: 2025-11-23 10:03:48.180 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:48 localhost ovn_controller[154788]: 2025-11-23T10:03:48Z|00466|binding|INFO|Setting lport f2565742-2705-427d-b837-b10dc9f90604 up in Southbound Nov 23 05:03:48 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:48.199 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-44ad0888-d340-45c9-a658-e70067183c3d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44ad0888-d340-45c9-a658-e70067183c3d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d5704c4-8ab8-49b3-a1d9-b5f0c2d3d763, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f2565742-2705-427d-b837-b10dc9f90604) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:48 localhost 
ovn_metadata_agent[160434]: 2025-11-23 10:03:48.201 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f2565742-2705-427d-b837-b10dc9f90604 in datapath 44ad0888-d340-45c9-a658-e70067183c3d bound to our chassis#033[00m Nov 23 05:03:48 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:48.203 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 44ad0888-d340-45c9-a658-e70067183c3d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:48 localhost podman[329592]: 2025-11-23 10:03:48.153424722 +0000 UTC m=+0.202874862 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0) Nov 23 05:03:48 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:48.204 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[494c9323-26fd-4a4f-b900-127fe9a24f99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:48 localhost nova_compute[281952]: 2025-11-23 10:03:48.219 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:48 localhost podman[329592]: 2025-11-23 10:03:48.237465024 +0000 UTC m=+0.286915194 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
container_name=ovn_controller, org.label-schema.schema-version=1.0) Nov 23 05:03:48 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 05:03:48 localhost nova_compute[281952]: 2025-11-23 10:03:48.254 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:03:48 localhost dnsmasq[325708]: exiting on receipt of SIGTERM Nov 23 05:03:48 localhost podman[329710]: 2025-11-23 10:03:48.761962064 +0000 UTC m=+0.059677579 container kill 017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-319f7ca3-1c18-4436-8178-bfc17a98eb45, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 05:03:48 localhost systemd[1]: libpod-017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49.scope: Deactivated successfully. 
Nov 23 05:03:48 localhost podman[329722]: 2025-11-23 10:03:48.835243691 +0000 UTC m=+0.055551184 container died 017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-319f7ca3-1c18-4436-8178-bfc17a98eb45, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 23 05:03:48 localhost podman[329722]: 2025-11-23 10:03:48.866859384 +0000 UTC m=+0.087166827 container cleanup 017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-319f7ca3-1c18-4436-8178-bfc17a98eb45, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:48 localhost systemd[1]: libpod-conmon-017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49.scope: Deactivated successfully. 
Nov 23 05:03:48 localhost podman[329724]: 2025-11-23 10:03:48.915676584 +0000 UTC m=+0.129619876 container remove 017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-319f7ca3-1c18-4436-8178-bfc17a98eb45, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:48 localhost ovn_controller[154788]: 2025-11-23T10:03:48Z|00467|binding|INFO|Releasing lport e9b240b4-dda7-48fc-a63a-d3fd91217a97 from this chassis (sb_readonly=0) Nov 23 05:03:48 localhost ovn_controller[154788]: 2025-11-23T10:03:48Z|00468|binding|INFO|Setting lport e9b240b4-dda7-48fc-a63a-d3fd91217a97 down in Southbound Nov 23 05:03:48 localhost nova_compute[281952]: 2025-11-23 10:03:48.945 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:48 localhost kernel: device tape9b240b4-dd left promiscuous mode Nov 23 05:03:48 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:48.953 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-319f7ca3-1c18-4436-8178-bfc17a98eb45', 'neutron:device_owner': 'network:dhcp', 
'neutron:mtu': '', 'neutron:network_name': 'neutron-319f7ca3-1c18-4436-8178-bfc17a98eb45', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6cc558ab5ea444ca89055d39fcd5b762', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87fb399b-8c32-4da7-b979-46b50b5b7dd8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e9b240b4-dda7-48fc-a63a-d3fd91217a97) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:48 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:48.955 160439 INFO neutron.agent.ovn.metadata.agent [-] Port e9b240b4-dda7-48fc-a63a-d3fd91217a97 in datapath 319f7ca3-1c18-4436-8178-bfc17a98eb45 unbound from our chassis#033[00m Nov 23 05:03:48 localhost systemd[1]: tmp-crun.nabist.mount: Deactivated successfully. Nov 23 05:03:48 localhost systemd[1]: var-lib-containers-storage-overlay-0ed3cccae87a2cec8679f8b781e7106fb3edbfdb329ca120981a36091e9dbff3-merged.mount: Deactivated successfully. Nov 23 05:03:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-017142115db01e71c755d42a0a49b267b9bc4b2666aaf383f65c83bbe2952d49-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:03:48 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:48.964 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 319f7ca3-1c18-4436-8178-bfc17a98eb45 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:48 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:48.965 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[b26ed10d-1c7e-4666-bb39-5b0393b5c4e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:48 localhost nova_compute[281952]: 2025-11-23 10:03:48.973 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:49 localhost podman[329776]: Nov 23 05:03:49 localhost systemd[1]: run-netns-qdhcp\x2d319f7ca3\x2d1c18\x2d4436\x2d8178\x2dbfc17a98eb45.mount: Deactivated successfully. 
Nov 23 05:03:49 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:49.381 263258 INFO neutron.agent.dhcp.agent [None req-bfb1c85d-6963-4c65-8d70-8c0b1bd74c3c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:49 localhost podman[329776]: 2025-11-23 10:03:49.385830576 +0000 UTC m=+0.104837188 container create 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:49 localhost podman[329776]: 2025-11-23 10:03:49.331534951 +0000 UTC m=+0.050541603 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:03:49 localhost systemd[1]: Started libpod-conmon-2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac.scope. Nov 23 05:03:49 localhost systemd[1]: Started libcrun container. 
Nov 23 05:03:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc09d36762312268b4e2ea330956ffbfc2d1ae0c044e08cc5ab54ff63b2faae9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:03:49 localhost podman[329776]: 2025-11-23 10:03:49.478942231 +0000 UTC m=+0.197948833 container init 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 23 05:03:49 localhost podman[329776]: 2025-11-23 10:03:49.4878374 +0000 UTC m=+0.206844012 container start 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:49 localhost dnsmasq[329794]: started, version 2.85 cachesize 150 Nov 23 05:03:49 localhost dnsmasq[329794]: DNS service limited to local subnets Nov 23 05:03:49 localhost dnsmasq[329794]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:03:49 localhost dnsmasq[329794]: warning: no upstream servers 
configured Nov 23 05:03:49 localhost dnsmasq-dhcp[329794]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:03:49 localhost dnsmasq[329794]: read /var/lib/neutron/dhcp/44ad0888-d340-45c9-a658-e70067183c3d/addn_hosts - 0 addresses Nov 23 05:03:49 localhost dnsmasq-dhcp[329794]: read /var/lib/neutron/dhcp/44ad0888-d340-45c9-a658-e70067183c3d/host Nov 23 05:03:49 localhost dnsmasq-dhcp[329794]: read /var/lib/neutron/dhcp/44ad0888-d340-45c9-a658-e70067183c3d/opts Nov 23 05:03:49 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:49.674 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:49 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:49.722 263258 INFO neutron.agent.dhcp.agent [None req-5af237d5-3cb4-45aa-b223-d0feb72d77cd - - - - - -] DHCP configuration for ports {'0e48174f-0d37-4bdd-a9a9-fdda16cdb82b'} is completed#033[00m Nov 23 05:03:50 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:50.227 2 INFO neutron.agent.securitygroups_rpc [req-dc022d26-398e-4427-8b9e-d6e32e3174fc req-12998a72-36e8-4adc-96fc-04c6618198f0 5f7e9736cbc74ce4ac3de51c4ac84504 49ebd7a691dd4ea59ffbe9f5703e77e4 - - default default] Security group member updated ['d77fc436-3ab1-42e0-a52b-861d18fcc237']#033[00m Nov 23 05:03:50 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:50.319 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:03:50 localhost ovn_controller[154788]: 2025-11-23T10:03:50Z|00469|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:03:50 localhost nova_compute[281952]: 2025-11-23 10:03:50.909 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:51 localhost ovn_controller[154788]: 2025-11-23T10:03:51Z|00470|binding|INFO|Removing iface tapf2565742-27 
ovn-installed in OVS Nov 23 05:03:51 localhost ovn_controller[154788]: 2025-11-23T10:03:51Z|00471|binding|INFO|Removing lport f2565742-2705-427d-b837-b10dc9f90604 ovn-installed in OVS Nov 23 05:03:51 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:51.259 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 0727ed07-70eb-4362-adaa-42c5e2a55093 with type ""#033[00m Nov 23 05:03:51 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:51.261 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-44ad0888-d340-45c9-a658-e70067183c3d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-44ad0888-d340-45c9-a658-e70067183c3d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0aa997cf0e428b8c7e20c806754329', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d5704c4-8ab8-49b3-a1d9-b5f0c2d3d763, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f2565742-2705-427d-b837-b10dc9f90604) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:51 localhost nova_compute[281952]: 2025-11-23 10:03:51.261 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:51 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:51.264 160439 INFO neutron.agent.ovn.metadata.agent [-] Port f2565742-2705-427d-b837-b10dc9f90604 in datapath 44ad0888-d340-45c9-a658-e70067183c3d unbound from our chassis#033[00m Nov 23 05:03:51 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:51.267 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 44ad0888-d340-45c9-a658-e70067183c3d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:03:51 localhost nova_compute[281952]: 2025-11-23 10:03:51.268 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:51 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:51.268 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cedbeedb-43a3-4f90-96e3-04a292e701e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:51 localhost nova_compute[281952]: 2025-11-23 10:03:51.275 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:51 localhost kernel: device tapf2565742-27 left promiscuous mode Nov 23 05:03:51 localhost nova_compute[281952]: 2025-11-23 10:03:51.293 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:51 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:51.637 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 05:03:51 localhost dnsmasq[329794]: read /var/lib/neutron/dhcp/44ad0888-d340-45c9-a658-e70067183c3d/addn_hosts - 0 addresses Nov 23 05:03:51 localhost dnsmasq-dhcp[329794]: read /var/lib/neutron/dhcp/44ad0888-d340-45c9-a658-e70067183c3d/host Nov 23 05:03:51 localhost podman[329814]: 2025-11-23 10:03:51.651214418 +0000 UTC m=+0.061945387 container kill 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:51 localhost dnsmasq-dhcp[329794]: read /var/lib/neutron/dhcp/44ad0888-d340-45c9-a658-e70067183c3d/opts Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent [None req-e340167e-5dcb-4505-a2f4-712bb4bf6d40 - - - - - -] Unable to reload_allocations dhcp for 44ad0888-d340-45c9-a658-e70067183c3d.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapf2565742-27 not found in namespace qdhcp-44ad0888-d340-45c9-a658-e70067183c3d. 
Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR 
neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Nov 23 05:03:51 
localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent return fut.result() Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent return self.__get_result() Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent raise self._exception Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 
ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapf2565742-27 not found in namespace qdhcp-44ad0888-d340-45c9-a658-e70067183c3d. Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.677 263258 ERROR neutron.agent.dhcp.agent #033[00m Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.680 263258 INFO neutron.agent.dhcp.agent [None req-116d79b7-3d46-461e-94b5-c6eecda7a59c - - - - - -] Synchronizing state#033[00m Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.797 263258 INFO neutron.agent.dhcp.agent [None req-37a3c96b-351c-4de9-96d3-7234122ba6a5 - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.797 263258 INFO neutron.agent.dhcp.agent [-] Starting network 44ad0888-d340-45c9-a658-e70067183c3d dhcp configuration#033[00m Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.798 263258 INFO neutron.agent.dhcp.agent [-] Finished network 44ad0888-d340-45c9-a658-e70067183c3d dhcp configuration#033[00m Nov 23 05:03:51 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:51.799 263258 INFO neutron.agent.dhcp.agent [None req-37a3c96b-351c-4de9-96d3-7234122ba6a5 - - - - - -] Synchronizing state complete#033[00m Nov 23 05:03:51 localhost ovn_controller[154788]: 2025-11-23T10:03:51Z|00472|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:03:51 localhost nova_compute[281952]: 2025-11-23 10:03:51.951 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:52 localhost dnsmasq[329794]: exiting on receipt of SIGTERM Nov 23 05:03:52 localhost 
podman[329844]: 2025-11-23 10:03:52.055475475 +0000 UTC m=+0.067440763 container kill 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:52 localhost systemd[1]: tmp-crun.8NWlCd.mount: Deactivated successfully. Nov 23 05:03:52 localhost systemd[1]: libpod-2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac.scope: Deactivated successfully. Nov 23 05:03:52 localhost nova_compute[281952]: 2025-11-23 10:03:52.080 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:52 localhost podman[329857]: 2025-11-23 10:03:52.1286995 +0000 UTC m=+0.059905595 container died 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:52 localhost podman[329857]: 2025-11-23 10:03:52.159967682 +0000 UTC m=+0.091173737 container cleanup 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:03:52 localhost systemd[1]: libpod-conmon-2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac.scope: Deactivated successfully. Nov 23 05:03:52 localhost podman[329859]: 2025-11-23 10:03:52.202621448 +0000 UTC m=+0.127439551 container remove 2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-44ad0888-d340-45c9-a658-e70067183c3d, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:03:52 localhost systemd[1]: var-lib-containers-storage-overlay-bc09d36762312268b4e2ea330956ffbfc2d1ae0c044e08cc5ab54ff63b2faae9-merged.mount: Deactivated successfully. Nov 23 05:03:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2074f3aae575030a3fbdcaf964830667371abda3cd3182ac53b4c10fd38aeaac-userdata-shm.mount: Deactivated successfully. Nov 23 05:03:52 localhost systemd[1]: run-netns-qdhcp\x2d44ad0888\x2dd340\x2d45c9\x2da658\x2de70067183c3d.mount: Deactivated successfully. 
Nov 23 05:03:52 localhost nova_compute[281952]: 2025-11-23 10:03:52.938 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:03:52 localhost nova_compute[281952]: 2025-11-23 10:03:52.959 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Triggering sync for uuid 355032bc-9946-4f6d-817c-2bfc8694d41d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Nov 23 05:03:52 localhost nova_compute[281952]: 2025-11-23 10:03:52.960 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "355032bc-9946-4f6d-817c-2bfc8694d41d" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:03:52 localhost nova_compute[281952]: 2025-11-23 10:03:52.961 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:03:52 localhost nova_compute[281952]: 2025-11-23 10:03:52.989 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "355032bc-9946-4f6d-817c-2bfc8694d41d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.029s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:03:52 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:52.993 2 INFO 
neutron.agent.securitygroups_rpc [None req-e884606d-3955-464b-8443-536f305941fb 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:03:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:03:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e161 e161: 6 total, 6 up, 6 in Nov 23 05:03:54 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:54.192 2 INFO neutron.agent.securitygroups_rpc [None req-5a886eea-af54-4cb9-a980-5c3836eff3f1 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:03:54 localhost snmpd[67457]: empty variable list in _query Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0. 
Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.076310) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46 Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892235076338, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 1464, "num_deletes": 262, "total_data_size": 2778890, "memory_usage": 2889520, "flush_reason": "Manual Compaction"} Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892235087021, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1826635, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25519, "largest_seqno": 26977, "table_properties": {"data_size": 1820430, "index_size": 3419, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 14676, "raw_average_key_size": 21, "raw_value_size": 1807542, "raw_average_value_size": 2669, "num_data_blocks": 148, "num_entries": 677, "num_filter_entries": 677, "num_deletions": 262, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892168, "oldest_key_time": 1763892168, "file_creation_time": 1763892235, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}} Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 10761 microseconds, and 4983 cpu microseconds. Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.087065) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1826635 bytes OK Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.087088) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.089949) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.089971) EVENT_LOG_v1 {"time_micros": 1763892235089964, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.089992) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2771728, prev total WAL file 
size 2771728, number of live WAL files 2. Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.090761) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end) Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(1783KB)], [45(14MB)] Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892235090834, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 16754327, "oldest_snapshot_seqno": -1} Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 12589 keys, 15502277 bytes, temperature: kUnknown Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892235165441, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 15502277, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15433424, "index_size": 36304, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31493, "raw_key_size": 340950, "raw_average_key_size": 27, "raw_value_size": 
15221571, "raw_average_value_size": 1209, "num_data_blocks": 1345, "num_entries": 12589, "num_filter_entries": 12589, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892235, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}} Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.166171) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 15502277 bytes Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.167921) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 224.2 rd, 207.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 14.2 +0.0 blob) out(14.8 +0.0 blob), read-write-amplify(17.7) write-amplify(8.5) OK, records in: 13128, records dropped: 539 output_compression: NoCompression Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.167952) EVENT_LOG_v1 {"time_micros": 1763892235167937, "job": 26, "event": "compaction_finished", "compaction_time_micros": 74722, "compaction_time_cpu_micros": 45616, "output_level": 6, "num_output_files": 1, "total_output_size": 15502277, "num_input_records": 13128, "num_output_records": 12589, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892235168383, "job": 26, "event": "table_file_deletion", "file_number": 47} Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892235170568, 
"job": 26, "event": "table_file_deletion", "file_number": 45} Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.090636) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.170603) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.170609) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.170612) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.170615) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:03:55 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:03:55.170618) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:03:56 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e162 e162: 6 total, 6 up, 6 in Nov 23 05:03:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:03:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:03:57 localhost podman[329888]: 2025-11-23 10:03:57.040571903 +0000 UTC m=+0.099357964 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd) Nov 23 05:03:57 localhost podman[329888]: 2025-11-23 10:03:57.053511132 +0000 UTC m=+0.112297173 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3) Nov 23 05:03:57 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 05:03:57 localhost nova_compute[281952]: 2025-11-23 10:03:57.082 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:03:57 localhost nova_compute[281952]: 2025-11-23 10:03:57.084 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:03:57 localhost nova_compute[281952]: 2025-11-23 10:03:57.085 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:03:57 localhost nova_compute[281952]: 2025-11-23 10:03:57.085 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:03:57 localhost nova_compute[281952]: 2025-11-23 10:03:57.127 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:57 localhost nova_compute[281952]: 2025-11-23 10:03:57.128 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:03:57 localhost podman[329889]: 2025-11-23 10:03:57.164722233 +0000 UTC m=+0.215743980 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 05:03:57 localhost podman[329889]: 2025-11-23 10:03:57.175949381 +0000 UTC m=+0.226971138 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 
'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 05:03:57 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:57.180 263258 INFO neutron.agent.linux.ip_lib [None req-55eeec42-bcb3-4134-8456-e7be7be94fe9 - - - - - -] Device tap305095bc-16 cannot be used as it has no MAC address#033[00m Nov 23 05:03:57 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 05:03:57 localhost nova_compute[281952]: 2025-11-23 10:03:57.201 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:57 localhost kernel: device tap305095bc-16 entered promiscuous mode Nov 23 05:03:57 localhost NetworkManager[5975]: [1763892237.2095] manager: (tap305095bc-16): new Generic device (/org/freedesktop/NetworkManager/Devices/75) Nov 23 05:03:57 localhost ovn_controller[154788]: 2025-11-23T10:03:57Z|00473|binding|INFO|Claiming lport 305095bc-169f-4019-8019-b335745719a8 for this chassis. Nov 23 05:03:57 localhost ovn_controller[154788]: 2025-11-23T10:03:57Z|00474|binding|INFO|305095bc-169f-4019-8019-b335745719a8: Claiming unknown Nov 23 05:03:57 localhost systemd-udevd[329940]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:03:57 localhost nova_compute[281952]: 2025-11-23 10:03:57.211 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:57 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:57.226 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '807b835f4cc944269d2f71f8e519b08a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7004edea-7321-4a7b-bb91-bf959c0155ab, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=305095bc-169f-4019-8019-b335745719a8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:03:57 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:57.228 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 305095bc-169f-4019-8019-b335745719a8 in datapath fbb2f473-9d45-472b-acf8-1ed5f2c6e75a bound to our chassis#033[00m Nov 23 05:03:57 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:57.230 160439 DEBUG 
neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fbb2f473-9d45-472b-acf8-1ed5f2c6e75a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:03:57 localhost ovn_metadata_agent[160434]: 2025-11-23 10:03:57.230 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[2886fe62-bac5-4cf9-abf4-82db8087bc4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:03:57 localhost journal[230249]: ethtool ioctl error on tap305095bc-16: No such device Nov 23 05:03:57 localhost ovn_controller[154788]: 2025-11-23T10:03:57Z|00475|binding|INFO|Setting lport 305095bc-169f-4019-8019-b335745719a8 ovn-installed in OVS Nov 23 05:03:57 localhost ovn_controller[154788]: 2025-11-23T10:03:57Z|00476|binding|INFO|Setting lport 305095bc-169f-4019-8019-b335745719a8 up in Southbound Nov 23 05:03:57 localhost nova_compute[281952]: 2025-11-23 10:03:57.242 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:57 localhost journal[230249]: ethtool ioctl error on tap305095bc-16: No such device Nov 23 05:03:57 localhost journal[230249]: ethtool ioctl error on tap305095bc-16: No such device Nov 23 05:03:57 localhost journal[230249]: ethtool ioctl error on tap305095bc-16: No such device Nov 23 05:03:57 localhost journal[230249]: ethtool ioctl error on tap305095bc-16: No such device Nov 23 05:03:57 localhost journal[230249]: ethtool ioctl error on tap305095bc-16: No such device Nov 23 05:03:57 localhost journal[230249]: ethtool ioctl error on tap305095bc-16: No such device Nov 23 05:03:57 localhost journal[230249]: ethtool ioctl error on tap305095bc-16: No such device Nov 23 05:03:57 localhost nova_compute[281952]: 2025-11-23 10:03:57.274 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:57 localhost nova_compute[281952]: 2025-11-23 10:03:57.306 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:58 localhost podman[330011]: Nov 23 05:03:58 localhost podman[330011]: 2025-11-23 10:03:58.172708607 +0000 UTC m=+0.094082346 container create 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:58 localhost systemd[1]: Started libpod-conmon-9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e.scope. Nov 23 05:03:58 localhost podman[330011]: 2025-11-23 10:03:58.126539135 +0000 UTC m=+0.047912904 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:03:58 localhost systemd[1]: Started libcrun container. 
Nov 23 05:03:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7be32356762f3029209aca713f7a677b1c24e82a292b776689aed1ed5de000e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:03:58 localhost podman[330011]: 2025-11-23 10:03:58.253612954 +0000 UTC m=+0.174986733 container init 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:03:58 localhost podman[330011]: 2025-11-23 10:03:58.267675758 +0000 UTC m=+0.189049507 container start 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:03:58 localhost dnsmasq[330031]: started, version 2.85 cachesize 150 Nov 23 05:03:58 localhost dnsmasq[330031]: DNS service limited to local subnets Nov 23 05:03:58 localhost dnsmasq[330031]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:03:58 localhost dnsmasq[330031]: warning: no upstream servers 
configured Nov 23 05:03:58 localhost dnsmasq-dhcp[330031]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:03:58 localhost dnsmasq[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/addn_hosts - 0 addresses Nov 23 05:03:58 localhost dnsmasq-dhcp[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/host Nov 23 05:03:58 localhost dnsmasq-dhcp[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/opts Nov 23 05:03:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:03:58 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:58.431 263258 INFO neutron.agent.dhcp.agent [None req-66b4c661-3b71-417d-a81d-c2551ccf6024 - - - - - -] DHCP configuration for ports {'1b6f9917-dcd8-46f2-841b-9a5f608eb356'} is completed#033[00m Nov 23 05:03:58 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:58.437 263258 INFO neutron.agent.linux.ip_lib [None req-e53debe7-7303-47bd-8cb4-60fae38a0ec2 - - - - - -] Device tape9fe654f-69 cannot be used as it has no MAC address#033[00m Nov 23 05:03:58 localhost nova_compute[281952]: 2025-11-23 10:03:58.503 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:58 localhost kernel: device tape9fe654f-69 entered promiscuous mode Nov 23 05:03:58 localhost NetworkManager[5975]: [1763892238.5092] manager: (tape9fe654f-69): new Generic device (/org/freedesktop/NetworkManager/Devices/76) Nov 23 05:03:58 localhost nova_compute[281952]: 2025-11-23 10:03:58.511 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:58 localhost nova_compute[281952]: 2025-11-23 10:03:58.519 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:58 localhost nova_compute[281952]: 2025-11-23 10:03:58.555 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:58 localhost nova_compute[281952]: 2025-11-23 10:03:58.592 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:58 localhost nova_compute[281952]: 2025-11-23 10:03:58.620 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:03:59 localhost neutron_sriov_agent[256124]: 2025-11-23 10:03:59.106 2 INFO neutron.agent.securitygroups_rpc [None req-73f9f53a-edf4-45e5-a635-4120a726bffe f436a64c9a134831a0f528309f399f1d 807b835f4cc944269d2f71f8e519b08a - - default default] Security group member updated ['c2582f3e-b285-4f13-ba8c-38a0c5b47d8d']#033[00m Nov 23 05:03:59 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:59.161 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:03:58Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ef98dbbe-315b-4005-87bd-c9bb26809710, ip_allocation=immediate, mac_address=fa:16:3e:b3:2a:18, name=tempest-TagsExtTest-154991436, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:03:54Z, description=, dns_domain=, id=fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TagsExtTest-test-network-852574996, port_security_enabled=True, 
project_id=807b835f4cc944269d2f71f8e519b08a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50479, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2406, status=ACTIVE, subnets=['9d3d60ab-4d93-4d58-a0ca-53c9dcb8d46d'], tags=[], tenant_id=807b835f4cc944269d2f71f8e519b08a, updated_at=2025-11-23T10:03:55Z, vlan_transparent=None, network_id=fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, port_security_enabled=True, project_id=807b835f4cc944269d2f71f8e519b08a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['c2582f3e-b285-4f13-ba8c-38a0c5b47d8d'], standard_attr_id=2425, status=DOWN, tags=[], tenant_id=807b835f4cc944269d2f71f8e519b08a, updated_at=2025-11-23T10:03:58Z on network fbb2f473-9d45-472b-acf8-1ed5f2c6e75a#033[00m Nov 23 05:03:59 localhost systemd[1]: tmp-crun.yCmdzb.mount: Deactivated successfully. Nov 23 05:03:59 localhost dnsmasq[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/addn_hosts - 1 addresses Nov 23 05:03:59 localhost dnsmasq-dhcp[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/host Nov 23 05:03:59 localhost podman[330095]: 2025-11-23 10:03:59.436688162 +0000 UTC m=+0.106392326 container kill 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 23 05:03:59 localhost dnsmasq-dhcp[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/opts Nov 23 05:03:59 localhost 
podman[330123]: Nov 23 05:03:59 localhost podman[330123]: 2025-11-23 10:03:59.497722591 +0000 UTC m=+0.079219758 container create cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:03:59 localhost systemd[1]: Started libpod-conmon-cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14.scope. Nov 23 05:03:59 localhost podman[330123]: 2025-11-23 10:03:59.4615225 +0000 UTC m=+0.043019757 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:03:59 localhost systemd[1]: Started libcrun container. 
Nov 23 05:03:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db53714652613f07480fa88db9328a7f78acd740bb6a93da8aece7d20ced86cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:03:59 localhost podman[330123]: 2025-11-23 10:03:59.587641679 +0000 UTC m=+0.169138886 container init cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:03:59 localhost podman[330123]: 2025-11-23 10:03:59.596233728 +0000 UTC m=+0.177730925 container start cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 23 05:03:59 localhost dnsmasq[330150]: started, version 2.85 cachesize 150 Nov 23 05:03:59 localhost dnsmasq[330150]: DNS service limited to local subnets Nov 23 05:03:59 localhost dnsmasq[330150]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:03:59 localhost dnsmasq[330150]: warning: no upstream servers 
configured Nov 23 05:03:59 localhost dnsmasq-dhcp[330150]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:03:59 localhost dnsmasq[330150]: read /var/lib/neutron/dhcp/9b96e7fb-0af6-422a-9328-26ea617f94f5/addn_hosts - 0 addresses Nov 23 05:03:59 localhost dnsmasq-dhcp[330150]: read /var/lib/neutron/dhcp/9b96e7fb-0af6-422a-9328-26ea617f94f5/host Nov 23 05:03:59 localhost dnsmasq-dhcp[330150]: read /var/lib/neutron/dhcp/9b96e7fb-0af6-422a-9328-26ea617f94f5/opts Nov 23 05:03:59 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:59.720 263258 INFO neutron.agent.dhcp.agent [None req-9ea3a00e-253c-40d9-9521-ad2df9959247 - - - - - -] DHCP configuration for ports {'ef98dbbe-315b-4005-87bd-c9bb26809710'} is completed#033[00m Nov 23 05:03:59 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:03:59.888 263258 INFO neutron.agent.dhcp.agent [None req-a689ed5b-aeb8-4883-8959-019e8e9b5fea - - - - - -] DHCP configuration for ports {'29220730-8a13-4870-ad1d-d4b8339b0131'} is completed#033[00m Nov 23 05:03:59 localhost dnsmasq[330150]: read /var/lib/neutron/dhcp/9b96e7fb-0af6-422a-9328-26ea617f94f5/addn_hosts - 0 addresses Nov 23 05:03:59 localhost dnsmasq-dhcp[330150]: read /var/lib/neutron/dhcp/9b96e7fb-0af6-422a-9328-26ea617f94f5/host Nov 23 05:03:59 localhost dnsmasq-dhcp[330150]: read /var/lib/neutron/dhcp/9b96e7fb-0af6-422a-9328-26ea617f94f5/opts Nov 23 05:03:59 localhost podman[330167]: 2025-11-23 10:03:59.937763836 +0000 UTC m=+0.056105771 container kill cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 23 05:03:59 localhost openstack_network_exporter[242668]: ERROR 10:03:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:03:59 localhost openstack_network_exporter[242668]: ERROR 10:03:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:03:59 localhost openstack_network_exporter[242668]: ERROR 10:03:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:03:59 localhost openstack_network_exporter[242668]: ERROR 10:03:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:03:59 localhost openstack_network_exporter[242668]: Nov 23 05:03:59 localhost openstack_network_exporter[242668]: ERROR 10:03:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:03:59 localhost openstack_network_exporter[242668]: Nov 23 05:04:00 localhost nova_compute[281952]: 2025-11-23 10:04:00.131 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:00 localhost kernel: device tape9fe654f-69 left promiscuous mode Nov 23 05:04:00 localhost nova_compute[281952]: 2025-11-23 10:04:00.145 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:04:00 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2561672654' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:04:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:04:00 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2561672654' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:04:00 localhost podman[330209]: 2025-11-23 10:04:00.490809155 +0000 UTC m=+0.058356909 container kill cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 23 05:04:00 localhost dnsmasq[330150]: exiting on receipt of SIGTERM Nov 23 05:04:00 localhost systemd[1]: libpod-cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14.scope: Deactivated successfully. 
Nov 23 05:04:00 localhost podman[330221]: 2025-11-23 10:04:00.609715677 +0000 UTC m=+0.107360555 container died cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:04:00 localhost podman[330221]: 2025-11-23 10:04:00.642609968 +0000 UTC m=+0.140254826 container cleanup cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 23 05:04:00 localhost systemd[1]: libpod-conmon-cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14.scope: Deactivated successfully. 
Nov 23 05:04:00 localhost podman[330223]: 2025-11-23 10:04:00.665887219 +0000 UTC m=+0.153138614 container remove cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9b96e7fb-0af6-422a-9328-26ea617f94f5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 05:04:01 localhost systemd[1]: var-lib-containers-storage-overlay-db53714652613f07480fa88db9328a7f78acd740bb6a93da8aece7d20ced86cf-merged.mount: Deactivated successfully. Nov 23 05:04:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb1c1036f98c2cec654d512040b4b8365c9b9e91a8219e909c69ca3ab3722f14-userdata-shm.mount: Deactivated successfully. Nov 23 05:04:01 localhost systemd[1]: run-netns-qdhcp\x2d9b96e7fb\x2d0af6\x2d422a\x2d9328\x2d26ea617f94f5.mount: Deactivated successfully. 
Nov 23 05:04:02 localhost nova_compute[281952]: 2025-11-23 10:04:02.174 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e163 e163: 6 total, 6 up, 6 in Nov 23 05:04:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:04:04 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:04.015 263258 INFO neutron.agent.linux.ip_lib [None req-5980f272-a008-4562-9c6f-00c268543df4 - - - - - -] Device tap9d23bf51-48 cannot be used as it has no MAC address#033[00m Nov 23 05:04:04 localhost nova_compute[281952]: 2025-11-23 10:04:04.079 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:04 localhost kernel: device tap9d23bf51-48 entered promiscuous mode Nov 23 05:04:04 localhost NetworkManager[5975]: [1763892244.0848] manager: (tap9d23bf51-48): new Generic device (/org/freedesktop/NetworkManager/Devices/77) Nov 23 05:04:04 localhost nova_compute[281952]: 2025-11-23 10:04:04.086 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:04 localhost ovn_controller[154788]: 2025-11-23T10:04:04Z|00477|binding|INFO|Claiming lport 9d23bf51-4878-4815-a311-5305afd6c960 for this chassis. Nov 23 05:04:04 localhost ovn_controller[154788]: 2025-11-23T10:04:04Z|00478|binding|INFO|9d23bf51-4878-4815-a311-5305afd6c960: Claiming unknown Nov 23 05:04:04 localhost systemd-udevd[330260]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:04:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:04.097 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-fb46dcb8-1e03-4e90-b074-b34f166ad626', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb46dcb8-1e03-4e90-b074-b34f166ad626', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e25b49f-2b5c-47dc-9e51-af7d0f597458, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9d23bf51-4878-4815-a311-5305afd6c960) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:04:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:04.099 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 9d23bf51-4878-4815-a311-5305afd6c960 in datapath fb46dcb8-1e03-4e90-b074-b34f166ad626 bound to our chassis#033[00m Nov 23 05:04:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:04.104 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fb46dcb8-1e03-4e90-b074-b34f166ad626 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:04:04 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:04.107 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[6062012f-ffc1-4f1c-9dbb-c7c0fae5a847]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:04:04 localhost journal[230249]: ethtool ioctl error on tap9d23bf51-48: No such device Nov 23 05:04:04 localhost ovn_controller[154788]: 2025-11-23T10:04:04Z|00479|binding|INFO|Setting lport 9d23bf51-4878-4815-a311-5305afd6c960 ovn-installed in OVS Nov 23 05:04:04 localhost ovn_controller[154788]: 2025-11-23T10:04:04Z|00480|binding|INFO|Setting lport 9d23bf51-4878-4815-a311-5305afd6c960 up in Southbound Nov 23 05:04:04 localhost nova_compute[281952]: 2025-11-23 10:04:04.121 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:04 localhost nova_compute[281952]: 2025-11-23 10:04:04.123 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:04 localhost journal[230249]: ethtool ioctl error on tap9d23bf51-48: No such device Nov 23 05:04:04 localhost journal[230249]: ethtool ioctl error on tap9d23bf51-48: No such device Nov 23 05:04:04 localhost journal[230249]: ethtool ioctl error on tap9d23bf51-48: No such device Nov 23 05:04:04 localhost journal[230249]: ethtool ioctl error on tap9d23bf51-48: No such device Nov 23 05:04:04 localhost journal[230249]: ethtool ioctl error on tap9d23bf51-48: No such device Nov 23 05:04:04 localhost journal[230249]: ethtool ioctl error on tap9d23bf51-48: No such device Nov 23 05:04:04 localhost journal[230249]: ethtool ioctl error on tap9d23bf51-48: No such device Nov 23 05:04:04 localhost nova_compute[281952]: 2025-11-23 10:04:04.163 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:04 localhost nova_compute[281952]: 2025-11-23 10:04:04.196 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:05 localhost podman[330331]: Nov 23 05:04:05 localhost podman[330331]: 2025-11-23 10:04:05.089037629 +0000 UTC m=+0.101081236 container create 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:04:05 localhost systemd[1]: Started libpod-conmon-146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a.scope. Nov 23 05:04:05 localhost podman[330331]: 2025-11-23 10:04:05.041981781 +0000 UTC m=+0.054025408 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:04:05 localhost systemd[1]: tmp-crun.lEc4qv.mount: Deactivated successfully. Nov 23 05:04:05 localhost systemd[1]: Started libcrun container. 
Nov 23 05:04:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ee271f1a827858027c6783c7ec8838583352e6408914449aa6a883f1c3a25e4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:04:05 localhost podman[330331]: 2025-11-23 10:04:05.190200786 +0000 UTC m=+0.202244413 container init 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 23 05:04:05 localhost podman[330331]: 2025-11-23 10:04:05.199737893 +0000 UTC m=+0.211781500 container start 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 05:04:05 localhost dnsmasq[330349]: started, version 2.85 cachesize 150 Nov 23 05:04:05 localhost dnsmasq[330349]: DNS service limited to local subnets Nov 23 05:04:05 localhost dnsmasq[330349]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:04:05 localhost dnsmasq[330349]: warning: no upstream servers 
configured Nov 23 05:04:05 localhost dnsmasq-dhcp[330349]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:04:05 localhost dnsmasq[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/addn_hosts - 0 addresses Nov 23 05:04:05 localhost dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/host Nov 23 05:04:05 localhost dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/opts Nov 23 05:04:05 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:05.365 263258 INFO neutron.agent.dhcp.agent [None req-29820e85-86e1-4ebf-8ba9-57623ed7f6e5 - - - - - -] DHCP configuration for ports {'5fc5bd48-bf7c-4d0a-a432-46b8babd3b7a'} is completed#033[00m Nov 23 05:04:05 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:05.408 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:04Z, description=, device_id=58cf0f29-b2f0-4d2a-8c4c-099287c6e849, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e189f8c0-3b95-4fe4-bad8-556b71c8aa3c, ip_allocation=immediate, mac_address=fa:16:3e:f5:ab:4b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:01Z, description=, dns_domain=, id=fb46dcb8-1e03-4e90-b074-b34f166ad626, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-240847515, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=43865, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2449, status=ACTIVE, 
subnets=['6e136941-a1bc-4b07-ae85-8009aea12ffe'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:03Z, vlan_transparent=None, network_id=fb46dcb8-1e03-4e90-b074-b34f166ad626, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2455, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:04Z on network fb46dcb8-1e03-4e90-b074-b34f166ad626#033[00m Nov 23 05:04:05 localhost dnsmasq[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/addn_hosts - 1 addresses Nov 23 05:04:05 localhost podman[330369]: 2025-11-23 10:04:05.618493348 +0000 UTC m=+0.066016360 container kill 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:04:05 localhost dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/host Nov 23 05:04:05 localhost dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/opts Nov 23 05:04:05 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:05.907 263258 INFO neutron.agent.dhcp.agent [None req-d74b055e-3da3-4a47-81e8-3a6f4da0a475 - - - - - -] DHCP configuration for ports {'e189f8c0-3b95-4fe4-bad8-556b71c8aa3c'} is completed#033[00m Nov 23 05:04:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:04:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:04:07 localhost podman[330389]: 2025-11-23 10:04:07.031830551 +0000 UTC m=+0.084814195 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 05:04:07 localhost podman[330389]: 2025-11-23 10:04:07.044641478 +0000 UTC m=+0.097625172 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 
'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 05:04:07 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:04:07 localhost podman[330390]: 2025-11-23 10:04:07.142596498 +0000 UTC m=+0.193224871 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 23 05:04:07 localhost podman[330390]: 2025-11-23 10:04:07.211158403 +0000 UTC m=+0.261786776 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2) Nov 23 05:04:07 localhost nova_compute[281952]: 2025-11-23 10:04:07.214 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:07 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 05:04:07 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:07.415 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:04Z, description=, device_id=58cf0f29-b2f0-4d2a-8c4c-099287c6e849, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e189f8c0-3b95-4fe4-bad8-556b71c8aa3c, ip_allocation=immediate, mac_address=fa:16:3e:f5:ab:4b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:01Z, description=, dns_domain=, id=fb46dcb8-1e03-4e90-b074-b34f166ad626, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-240847515, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=43865, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2449, status=ACTIVE, subnets=['6e136941-a1bc-4b07-ae85-8009aea12ffe'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:03Z, vlan_transparent=None, 
network_id=fb46dcb8-1e03-4e90-b074-b34f166ad626, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2455, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:04Z on network fb46dcb8-1e03-4e90-b074-b34f166ad626#033[00m Nov 23 05:04:07 localhost dnsmasq[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/addn_hosts - 1 addresses Nov 23 05:04:07 localhost podman[330448]: 2025-11-23 10:04:07.620076871 +0000 UTC m=+0.061015189 container kill 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 23 05:04:07 localhost dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/host Nov 23 05:04:07 localhost dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/opts Nov 23 05:04:07 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:07.669 2 INFO neutron.agent.securitygroups_rpc [None req-cd03e682-7688-4e93-ac2d-e601f5fc3971 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:04:07 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:07.894 263258 INFO neutron.agent.dhcp.agent [None req-cef8ef95-1676-4fbb-a0b7-14fe3e59f797 - - - - - -] DHCP configuration for ports 
{'e189f8c0-3b95-4fe4-bad8-556b71c8aa3c'} is completed#033[00m Nov 23 05:04:08 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:08.169 2 INFO neutron.agent.securitygroups_rpc [None req-3d60f928-a89d-481c-a25d-e6417d1d55cf 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:04:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:04:08 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:08.744 2 INFO neutron.agent.securitygroups_rpc [None req-4c8f3bc2-c43f-4c06-bcd4-666015157129 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:04:08 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:08.959 2 INFO neutron.agent.securitygroups_rpc [None req-fdb6f567-90f0-41d5-acb8-83f08adab1b1 f436a64c9a134831a0f528309f399f1d 807b835f4cc944269d2f71f8e519b08a - - default default] Security group member updated ['c2582f3e-b285-4f13-ba8c-38a0c5b47d8d']#033[00m Nov 23 05:04:09 localhost dnsmasq[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/addn_hosts - 0 addresses Nov 23 05:04:09 localhost dnsmasq-dhcp[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/host Nov 23 05:04:09 localhost podman[330485]: 2025-11-23 10:04:09.201250971 +0000 UTC m=+0.059537475 container kill 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 23 05:04:09 localhost dnsmasq-dhcp[330031]: read /var/lib/neutron/dhcp/fbb2f473-9d45-472b-acf8-1ed5f2c6e75a/opts Nov 23 05:04:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:09.302 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:04:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:09.303 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:04:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:09.304 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:04:09 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:09.342 2 INFO neutron.agent.securitygroups_rpc [None req-b9ca6263-bc29-4379-89bf-449c3fc12e0d 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:04:09 localhost dnsmasq[330031]: exiting on receipt of SIGTERM Nov 23 05:04:09 localhost podman[330522]: 2025-11-23 10:04:09.827365081 +0000 UTC m=+0.057261265 container kill 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:04:09 localhost systemd[1]: libpod-9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e.scope: Deactivated successfully. Nov 23 05:04:09 localhost podman[330534]: 2025-11-23 10:04:09.895861895 +0000 UTC m=+0.053811932 container died 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 23 05:04:09 localhost podman[330534]: 2025-11-23 10:04:09.938062316 +0000 UTC m=+0.096012353 container cleanup 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 23 05:04:09 localhost systemd[1]: libpod-conmon-9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e.scope: Deactivated 
successfully. Nov 23 05:04:09 localhost podman[330536]: 2025-11-23 10:04:09.971867745 +0000 UTC m=+0.120478130 container remove 9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 05:04:10 localhost ovn_controller[154788]: 2025-11-23T10:04:10Z|00481|binding|INFO|Releasing lport 305095bc-169f-4019-8019-b335745719a8 from this chassis (sb_readonly=0) Nov 23 05:04:10 localhost kernel: device tap305095bc-16 left promiscuous mode Nov 23 05:04:10 localhost ovn_controller[154788]: 2025-11-23T10:04:10Z|00482|binding|INFO|Setting lport 305095bc-169f-4019-8019-b335745719a8 down in Southbound Nov 23 05:04:10 localhost nova_compute[281952]: 2025-11-23 10:04:10.013 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:10 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:10.026 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a', 'neutron:device_owner': 'network:dhcp', 
'neutron:mtu': '', 'neutron:network_name': 'neutron-fbb2f473-9d45-472b-acf8-1ed5f2c6e75a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '807b835f4cc944269d2f71f8e519b08a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7004edea-7321-4a7b-bb91-bf959c0155ab, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=305095bc-169f-4019-8019-b335745719a8) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:04:10 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:10.029 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 305095bc-169f-4019-8019-b335745719a8 in datapath fbb2f473-9d45-472b-acf8-1ed5f2c6e75a unbound from our chassis#033[00m Nov 23 05:04:10 localhost nova_compute[281952]: 2025-11-23 10:04:10.031 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:10 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:10.031 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbb2f473-9d45-472b-acf8-1ed5f2c6e75a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:04:10 localhost nova_compute[281952]: 2025-11-23 10:04:10.032 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:10 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:10.032 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[8163c8c4-0e0a-489a-8723-4098f9ed5c33]: (4, 
False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:04:10 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:10.059 263258 INFO neutron.agent.dhcp.agent [None req-89cd6c38-4ad8-49af-832c-b992818f16bf - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:04:10 localhost dnsmasq[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/addn_hosts - 0 addresses Nov 23 05:04:10 localhost dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/host Nov 23 05:04:10 localhost dnsmasq-dhcp[330349]: read /var/lib/neutron/dhcp/fb46dcb8-1e03-4e90-b074-b34f166ad626/opts Nov 23 05:04:10 localhost podman[330577]: 2025-11-23 10:04:10.079919189 +0000 UTC m=+0.042761649 container kill 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 23 05:04:10 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:10.173 2 INFO neutron.agent.securitygroups_rpc [None req-aba9c038-400a-4d01-8bf0-588461edf0a1 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:04:10 localhost systemd[1]: var-lib-containers-storage-overlay-b7be32356762f3029209aca713f7a677b1c24e82a292b776689aed1ed5de000e-merged.mount: Deactivated successfully. 
Nov 23 05:04:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9be057a6a9c84a19eb3f8c73363dea1eb98eb9c4c082607ca85c5130f557582e-userdata-shm.mount: Deactivated successfully. Nov 23 05:04:10 localhost systemd[1]: run-netns-qdhcp\x2dfbb2f473\x2d9d45\x2d472b\x2dacf8\x2d1ed5f2c6e75a.mount: Deactivated successfully. Nov 23 05:04:10 localhost nova_compute[281952]: 2025-11-23 10:04:10.237 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:10 localhost ovn_controller[154788]: 2025-11-23T10:04:10Z|00483|binding|INFO|Releasing lport 9d23bf51-4878-4815-a311-5305afd6c960 from this chassis (sb_readonly=0) Nov 23 05:04:10 localhost ovn_controller[154788]: 2025-11-23T10:04:10Z|00484|binding|INFO|Setting lport 9d23bf51-4878-4815-a311-5305afd6c960 down in Southbound Nov 23 05:04:10 localhost kernel: device tap9d23bf51-48 left promiscuous mode Nov 23 05:04:10 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:10.248 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-fb46dcb8-1e03-4e90-b074-b34f166ad626', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb46dcb8-1e03-4e90-b074-b34f166ad626', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': 
'', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e25b49f-2b5c-47dc-9e51-af7d0f597458, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9d23bf51-4878-4815-a311-5305afd6c960) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:04:10 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:10.250 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 9d23bf51-4878-4815-a311-5305afd6c960 in datapath fb46dcb8-1e03-4e90-b074-b34f166ad626 unbound from our chassis#033[00m Nov 23 05:04:10 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:10.251 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fb46dcb8-1e03-4e90-b074-b34f166ad626 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:04:10 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:10.252 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c273d104-cf0b-4b9c-a5c5-49d685cd7008]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:04:10 localhost nova_compute[281952]: 2025-11-23 10:04:10.266 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:10 localhost nova_compute[281952]: 2025-11-23 10:04:10.267 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:10 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:10.978 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:04:11 localhost 
ovn_controller[154788]: 2025-11-23T10:04:11Z|00485|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:04:11 localhost nova_compute[281952]: 2025-11-23 10:04:11.562 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:11 localhost podman[240668]: time="2025-11-23T10:04:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:04:11 localhost podman[240668]: @ - - [23/Nov/2025:10:04:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155678 "" "Go-http-client/1.1" Nov 23 05:04:11 localhost podman[240668]: @ - - [23/Nov/2025:10:04:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19247 "" "Go-http-client/1.1" Nov 23 05:04:12 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:12.007 2 INFO neutron.agent.securitygroups_rpc [None req-b1d88626-831d-4bca-895f-9342c26bbcc2 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:04:12 localhost dnsmasq[330349]: exiting on receipt of SIGTERM Nov 23 05:04:12 localhost podman[330615]: 2025-11-23 10:04:12.181984 +0000 UTC m=+0.063062491 container kill 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 
23 05:04:12 localhost systemd[1]: libpod-146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a.scope: Deactivated successfully. Nov 23 05:04:12 localhost nova_compute[281952]: 2025-11-23 10:04:12.217 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:12 localhost podman[330630]: 2025-11-23 10:04:12.228984965 +0000 UTC m=+0.032484029 container died 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:04:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a-userdata-shm.mount: Deactivated successfully. Nov 23 05:04:12 localhost systemd[1]: var-lib-containers-storage-overlay-9ee271f1a827858027c6783c7ec8838583352e6408914449aa6a883f1c3a25e4-merged.mount: Deactivated successfully. 
Nov 23 05:04:12 localhost podman[330630]: 2025-11-23 10:04:12.266215657 +0000 UTC m=+0.069714711 container remove 146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb46dcb8-1e03-4e90-b074-b34f166ad626, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:04:12 localhost systemd[1]: libpod-conmon-146d4939eafd4a440493513a62a3ea1915af8b0e1508d846dafe3e854069ab5a.scope: Deactivated successfully. Nov 23 05:04:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:12.462 263258 INFO neutron.agent.dhcp.agent [None req-d61d22d5-ba6f-424d-9828-c48bc8239139 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:04:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:12.463 263258 INFO neutron.agent.dhcp.agent [None req-d61d22d5-ba6f-424d-9828-c48bc8239139 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:04:13 localhost systemd[1]: run-netns-qdhcp\x2dfb46dcb8\x2d1e03\x2d4e90\x2db074\x2db34f166ad626.mount: Deactivated successfully. 
Nov 23 05:04:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:13.375 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:04:13 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:04:15 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:15.641 263258 INFO neutron.agent.linux.ip_lib [None req-76c51997-687f-4567-b5c6-457cc8e31ad8 - - - - - -] Device tapa41f1fd9-25 cannot be used as it has no MAC address#033[00m Nov 23 05:04:15 localhost nova_compute[281952]: 2025-11-23 10:04:15.667 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:15 localhost kernel: device tapa41f1fd9-25 entered promiscuous mode Nov 23 05:04:15 localhost nova_compute[281952]: 2025-11-23 10:04:15.675 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:15 localhost NetworkManager[5975]: [1763892255.6757] manager: (tapa41f1fd9-25): new Generic device (/org/freedesktop/NetworkManager/Devices/78) Nov 23 05:04:15 localhost ovn_controller[154788]: 2025-11-23T10:04:15Z|00486|binding|INFO|Claiming lport a41f1fd9-25d7-4c85-96e2-18d396386762 for this chassis. Nov 23 05:04:15 localhost ovn_controller[154788]: 2025-11-23T10:04:15Z|00487|binding|INFO|a41f1fd9-25d7-4c85-96e2-18d396386762: Claiming unknown Nov 23 05:04:15 localhost systemd-udevd[330663]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:04:15 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:15.689 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a81c63d1-c197-41eb-93f7-be983c9ed80d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a41f1fd9-25d7-4c85-96e2-18d396386762) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:04:15 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:15.693 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a41f1fd9-25d7-4c85-96e2-18d396386762 in datapath accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e bound to our chassis#033[00m Nov 23 05:04:15 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:15.694 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:04:15 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:15.696 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[3baab3fe-cacc-4604-a218-719161fccc6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:04:15 localhost journal[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device Nov 23 05:04:15 localhost nova_compute[281952]: 2025-11-23 10:04:15.705 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:15 localhost ovn_controller[154788]: 2025-11-23T10:04:15Z|00488|binding|INFO|Setting lport a41f1fd9-25d7-4c85-96e2-18d396386762 ovn-installed in OVS Nov 23 05:04:15 localhost ovn_controller[154788]: 2025-11-23T10:04:15Z|00489|binding|INFO|Setting lport a41f1fd9-25d7-4c85-96e2-18d396386762 up in Southbound Nov 23 05:04:15 localhost journal[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device Nov 23 05:04:15 localhost nova_compute[281952]: 2025-11-23 10:04:15.710 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:15 localhost journal[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device Nov 23 05:04:15 localhost journal[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device Nov 23 05:04:15 localhost journal[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device Nov 23 05:04:15 localhost journal[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device Nov 23 05:04:15 localhost journal[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device Nov 23 05:04:15 localhost journal[230249]: ethtool ioctl error on tapa41f1fd9-25: No such device Nov 23 05:04:15 localhost nova_compute[281952]: 2025-11-23 10:04:15.753 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:15 localhost nova_compute[281952]: 2025-11-23 10:04:15.784 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:16 localhost podman[330734]: Nov 23 05:04:16 localhost podman[330734]: 2025-11-23 10:04:16.8030727 +0000 UTC m=+0.088289030 container create d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:04:16 localhost systemd[1]: Started libpod-conmon-d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e.scope. Nov 23 05:04:16 localhost systemd[1]: Started libcrun container. 
Nov 23 05:04:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69ba135d8b099b630e853bde2c2d90fa92899019265ae0b9907bbcf5750d8edb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:04:16 localhost podman[330734]: 2025-11-23 10:04:16.762526709 +0000 UTC m=+0.047743089 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:04:16 localhost podman[330734]: 2025-11-23 10:04:16.869333376 +0000 UTC m=+0.154549736 container init d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:04:16 localhost podman[330734]: 2025-11-23 10:04:16.880434441 +0000 UTC m=+0.165650791 container start d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 23 05:04:16 localhost dnsmasq[330752]: started, version 2.85 cachesize 150 Nov 23 05:04:16 localhost dnsmasq[330752]: DNS service limited to local subnets Nov 23 05:04:16 localhost dnsmasq[330752]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:04:16 localhost dnsmasq[330752]: warning: no upstream servers configured Nov 23 05:04:16 localhost dnsmasq-dhcp[330752]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:04:16 localhost dnsmasq[330752]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/addn_hosts - 0 addresses Nov 23 05:04:16 localhost dnsmasq-dhcp[330752]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/host Nov 23 05:04:16 localhost dnsmasq-dhcp[330752]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/opts Nov 23 05:04:17 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:17.015 263258 INFO neutron.agent.dhcp.agent [None req-427d2825-c3cf-4df7-9137-6592cbc40c01 - - - - - -] DHCP configuration for ports {'b87e3e64-b6cc-4f08-95a6-de593e031494'} is completed#033[00m Nov 23 05:04:17 localhost nova_compute[281952]: 2025-11-23 10:04:17.220 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:17 localhost nova_compute[281952]: 2025-11-23 10:04:17.232 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:04:17 localhost nova_compute[281952]: 2025-11-23 10:04:17.232 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:04:17 localhost nova_compute[281952]: 2025-11-23 10:04:17.233 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 05:04:17 localhost nova_compute[281952]: 2025-11-23 10:04:17.233 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 05:04:17 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e164 e164: 6 total, 6 up, 6 in Nov 23 05:04:17 localhost nova_compute[281952]: 2025-11-23 10:04:17.605 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 05:04:17 localhost nova_compute[281952]: 2025-11-23 10:04:17.606 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 05:04:17 localhost nova_compute[281952]: 2025-11-23 10:04:17.606 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 05:04:17 localhost nova_compute[281952]: 2025-11-23 10:04:17.607 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 05:04:17 localhost systemd[1]: tmp-crun.Z89cRe.mount: Deactivated successfully. 
Nov 23 05:04:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:04:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 23 05:04:18 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4044330584' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 23 05:04:18 localhost systemd[1]: tmp-crun.xzNgxZ.mount: Deactivated successfully. Nov 23 05:04:18 localhost dnsmasq[330752]: exiting on receipt of SIGTERM Nov 23 05:04:18 localhost podman[330769]: 2025-11-23 10:04:18.693616178 +0000 UTC m=+0.060311777 container kill d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 05:04:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:04:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:04:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:04:18 localhost systemd[1]: libpod-d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e.scope: Deactivated successfully. 
Nov 23 05:04:18 localhost podman[330784]: 2025-11-23 10:04:18.771257697 +0000 UTC m=+0.053813062 container died d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 05:04:18 localhost nova_compute[281952]: 2025-11-23 10:04:18.788 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 05:04:18 localhost podman[330792]: 2025-11-23 10:04:18.810037555 +0000 UTC m=+0.083648331 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, managed_by=edpm_ansible, architecture=x86_64, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41) Nov 23 05:04:18 localhost podman[330792]: 2025-11-23 10:04:18.85933572 +0000 UTC m=+0.132946546 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Nov 23 05:04:18 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:04:18 localhost podman[330791]: 2025-11-23 10:04:18.919043479 +0000 UTC m=+0.196655866 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent) Nov 23 05:04:18 localhost podman[330786]: 2025-11-23 10:04:18.959615931 +0000 UTC m=+0.239622830 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 23 05:04:18 localhost podman[330791]: 2025-11-23 10:04:18.9788579 +0000 UTC m=+0.256470297 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:04:19 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 05:04:19 localhost nova_compute[281952]: 2025-11-23 10:04:19.021 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 05:04:19 localhost nova_compute[281952]: 2025-11-23 10:04:19.022 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 05:04:19 localhost nova_compute[281952]: 2025-11-23 10:04:19.022 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:04:19 localhost nova_compute[281952]: 2025-11-23 10:04:19.023 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:04:19 localhost nova_compute[281952]: 2025-11-23 10:04:19.023 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 05:04:19 localhost podman[330786]: 2025-11-23 10:04:19.035300341 +0000 UTC m=+0.315307250 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 23 05:04:19 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:04:19 localhost podman[330784]: 2025-11-23 10:04:19.115156377 +0000 UTC m=+0.397711742 container remove d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0) Nov 23 05:04:19 localhost systemd[1]: libpod-conmon-d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e.scope: Deactivated successfully. Nov 23 05:04:19 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e165 e165: 6 total, 6 up, 6 in Nov 23 05:04:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:19.680 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:9f:5c 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, 
additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a81c63d1-c197-41eb-93f7-be983c9ed80d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b87e3e64-b6cc-4f08-95a6-de593e031494) old=Port_Binding(mac=['fa:16:3e:af:9f:5c 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:04:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:19.682 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b87e3e64-b6cc-4f08-95a6-de593e031494 in datapath accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e updated#033[00m Nov 23 05:04:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:19.685 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port d51081f3-52bd-41f7-b871-b291aa0ee588 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:04:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:19.685 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:04:19 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:19.686 160542 DEBUG oslo.privsep.daemon [-] privsep: 
reply[9075d485-ddfa-4a72-aa09-e7d97ab604e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:04:19 localhost systemd[1]: var-lib-containers-storage-overlay-69ba135d8b099b630e853bde2c2d90fa92899019265ae0b9907bbcf5750d8edb-merged.mount: Deactivated successfully. Nov 23 05:04:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d568164dd7b24a064c0675fa91f10cc5cfa9912435768d6b04663a1404cd8c7e-userdata-shm.mount: Deactivated successfully. Nov 23 05:04:20 localhost nova_compute[281952]: 2025-11-23 10:04:20.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:04:20 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:20.373 263258 INFO neutron.agent.linux.ip_lib [None req-9ddd3a60-f3f4-410e-a986-5a8b0041fb18 - - - - - -] Device tap39f9520b-e6 cannot be used as it has no MAC address#033[00m Nov 23 05:04:20 localhost nova_compute[281952]: 2025-11-23 10:04:20.405 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:20 localhost kernel: device tap39f9520b-e6 entered promiscuous mode Nov 23 05:04:20 localhost NetworkManager[5975]: [1763892260.4133] manager: (tap39f9520b-e6): new Generic device (/org/freedesktop/NetworkManager/Devices/79) Nov 23 05:04:20 localhost nova_compute[281952]: 2025-11-23 10:04:20.413 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:20 localhost systemd-udevd[330903]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:04:20 localhost ovn_controller[154788]: 2025-11-23T10:04:20Z|00490|binding|INFO|Claiming lport 39f9520b-e67f-48de-9178-ad0c9c37f804 for this chassis. Nov 23 05:04:20 localhost ovn_controller[154788]: 2025-11-23T10:04:20Z|00491|binding|INFO|39f9520b-e67f-48de-9178-ad0c9c37f804: Claiming unknown Nov 23 05:04:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:20.430 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-556c5c2e-c414-4271-8e77-d61a599ccbad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-556c5c2e-c414-4271-8e77-d61a599ccbad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd8633d61c76748a7a900f3c8cea84ef3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de783514-b079-4988-843f-abee02f82863, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=39f9520b-e67f-48de-9178-ad0c9c37f804) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:04:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:20.432 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 39f9520b-e67f-48de-9178-ad0c9c37f804 in datapath 556c5c2e-c414-4271-8e77-d61a599ccbad bound to our chassis#033[00m Nov 23 05:04:20 
localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:20.435 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 556c5c2e-c414-4271-8e77-d61a599ccbad or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:04:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:20.436 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[ff08a49a-33a9-4f97-bd6d-45896ba50239]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:04:20 localhost ovn_controller[154788]: 2025-11-23T10:04:20Z|00492|binding|INFO|Setting lport 39f9520b-e67f-48de-9178-ad0c9c37f804 ovn-installed in OVS Nov 23 05:04:20 localhost ovn_controller[154788]: 2025-11-23T10:04:20Z|00493|binding|INFO|Setting lport 39f9520b-e67f-48de-9178-ad0c9c37f804 up in Southbound Nov 23 05:04:20 localhost nova_compute[281952]: 2025-11-23 10:04:20.453 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:20 localhost nova_compute[281952]: 2025-11-23 10:04:20.512 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:20 localhost nova_compute[281952]: 2025-11-23 10:04:20.542 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:21 localhost podman[330960]: Nov 23 05:04:21 localhost podman[330960]: 2025-11-23 10:04:21.077544459 +0000 UTC m=+0.092109036 container create 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 23 05:04:21 localhost systemd[1]: Started libpod-conmon-56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498.scope. Nov 23 05:04:21 localhost podman[330960]: 2025-11-23 10:04:21.032651387 +0000 UTC m=+0.047215994 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:04:21 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:21.142 2 INFO neutron.agent.securitygroups_rpc [None req-b505d753-a321-4285-8b8a-57c320b6a991 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:04:21 localhost systemd[1]: Started libcrun container. 
Nov 23 05:04:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e452c8f92d1256ad9f801b589d5525f70ce43f85a0d2fd79bc7391c3f9d88b0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:04:21 localhost podman[330960]: 2025-11-23 10:04:21.162448726 +0000 UTC m=+0.177013313 container init 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:04:21 localhost podman[330960]: 2025-11-23 10:04:21.171273423 +0000 UTC m=+0.185838010 container start 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:04:21 localhost dnsmasq[330981]: started, version 2.85 cachesize 150 Nov 23 05:04:21 localhost dnsmasq[330981]: DNS service limited to local subnets Nov 23 05:04:21 localhost dnsmasq[330981]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:04:21 localhost dnsmasq[330981]: warning: no upstream servers 
configured Nov 23 05:04:21 localhost dnsmasq-dhcp[330981]: DHCP, static leases only on 10.100.0.16, lease time 1d Nov 23 05:04:21 localhost dnsmasq-dhcp[330981]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:04:21 localhost dnsmasq[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/addn_hosts - 0 addresses Nov 23 05:04:21 localhost dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/host Nov 23 05:04:21 localhost dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/opts Nov 23 05:04:21 localhost nova_compute[281952]: 2025-11-23 10:04:21.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:04:21 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:21.236 263258 INFO neutron.agent.dhcp.agent [None req-69a79c0f-ed94-4296-ad2f-407a6886f388 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:20Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=27ad7b49-380b-4823-9acc-84d37fa1a881, ip_allocation=immediate, mac_address=fa:16:3e:3f:cd:97, name=tempest-PortsTestJSON-278306433, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:12Z, description=, dns_domain=, id=accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-571660484, port_security_enabled=True, project_id=6eb850a1541d4942b249428ef6092e5e, provider:network_type=geneve, provider:physical_network=None, 
provider:segmentation_id=15506, qos_policy_id=None, revision_number=3, router:external=False, shared=False, standard_attr_id=2492, status=ACTIVE, subnets=['10c87fae-337c-47db-8d78-19673fc519f2', 'd18e15c0-46c0-4162-ac14-6406dd01bc14'], tags=[], tenant_id=6eb850a1541d4942b249428ef6092e5e, updated_at=2025-11-23T10:04:18Z, vlan_transparent=None, network_id=accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, port_security_enabled=True, project_id=6eb850a1541d4942b249428ef6092e5e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['cadd5356-9a8d-419a-ac04-589c2522a695'], standard_attr_id=2545, status=DOWN, tags=[], tenant_id=6eb850a1541d4942b249428ef6092e5e, updated_at=2025-11-23T10:04:20Z on network accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e#033[00m Nov 23 05:04:21 localhost podman[331005]: Nov 23 05:04:21 localhost podman[331005]: 2025-11-23 10:04:21.479206418 +0000 UTC m=+0.105075116 container create 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:04:21 localhost systemd[1]: Started libpod-conmon-2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72.scope. Nov 23 05:04:21 localhost systemd[1]: Started libcrun container. 
Nov 23 05:04:21 localhost podman[331005]: 2025-11-23 10:04:21.4314754 +0000 UTC m=+0.057344128 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:04:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ddc73133ee97a18b5d676ae71aebdcc91b4415d41e32db3c006d1af969d02b28/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:04:21 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:21.540 263258 INFO neutron.agent.dhcp.agent [None req-536422c3-25b8-428a-aab5-e2ae20ae31e6 - - - - - -] DHCP configuration for ports {'a41f1fd9-25d7-4c85-96e2-18d396386762', 'b87e3e64-b6cc-4f08-95a6-de593e031494'} is completed#033[00m Nov 23 05:04:21 localhost dnsmasq[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/addn_hosts - 2 addresses Nov 23 05:04:21 localhost podman[331032]: 2025-11-23 10:04:21.551086904 +0000 UTC m=+0.063243996 container kill 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:04:21 localhost dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/host Nov 23 05:04:21 localhost dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/opts Nov 23 05:04:21 localhost podman[331005]: 2025-11-23 10:04:21.592861372 +0000 UTC m=+0.218730070 container init 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 05:04:21 localhost podman[331005]: 2025-11-23 10:04:21.601366478 +0000 UTC m=+0.227235176 container start 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 05:04:21 localhost dnsmasq[331054]: started, version 2.85 cachesize 150 Nov 23 05:04:21 localhost dnsmasq[331054]: DNS service limited to local subnets Nov 23 05:04:21 localhost dnsmasq[331054]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:04:21 localhost dnsmasq[331054]: warning: no upstream servers configured Nov 23 05:04:21 localhost dnsmasq-dhcp[331054]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:04:21 localhost dnsmasq[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/addn_hosts - 0 addresses Nov 23 05:04:21 localhost dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/host Nov 23 05:04:21 localhost dnsmasq-dhcp[331054]: read 
/var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/opts Nov 23 05:04:21 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e166 e166: 6 total, 6 up, 6 in Nov 23 05:04:21 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:21.912 263258 INFO neutron.agent.dhcp.agent [None req-a529ca7a-60ec-4769-8137-40df9acc23c6 - - - - - -] DHCP configuration for ports {'4e58acd3-0ca1-4d41-96d8-20dc84b68999', '27ad7b49-380b-4823-9acc-84d37fa1a881'} is completed#033[00m Nov 23 05:04:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:22.006 263258 INFO neutron.agent.linux.ip_lib [None req-7dd5ffbc-8c5b-42f2-812d-a91f1617a370 - - - - - -] Device tap1ae8d8fe-51 cannot be used as it has no MAC address#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.071 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:22 localhost kernel: device tap1ae8d8fe-51 entered promiscuous mode Nov 23 05:04:22 localhost systemd-udevd[330905]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:04:22 localhost NetworkManager[5975]: [1763892262.0800] manager: (tap1ae8d8fe-51): new Generic device (/org/freedesktop/NetworkManager/Devices/80) Nov 23 05:04:22 localhost ovn_controller[154788]: 2025-11-23T10:04:22Z|00494|binding|INFO|Claiming lport 1ae8d8fe-5156-4993-b482-c4b01b921e85 for this chassis. 
Nov 23 05:04:22 localhost ovn_controller[154788]: 2025-11-23T10:04:22Z|00495|binding|INFO|1ae8d8fe-5156-4993-b482-c4b01b921e85: Claiming unknown Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.085 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:22 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:22.094 2 INFO neutron.agent.securitygroups_rpc [None req-71e1d61a-9581-46c3-850d-9298f6399521 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:04:22 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:22.097 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-2fa353bd-02c5-4044-adba-b918030b206e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fa353bd-02c5-4044-adba-b918030b206e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14e80905-5b14-4ffd-9b3d-73c1c6bb812f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], 
logical_port=1ae8d8fe-5156-4993-b482-c4b01b921e85) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:04:22 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:22.099 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1ae8d8fe-5156-4993-b482-c4b01b921e85 in datapath 2fa353bd-02c5-4044-adba-b918030b206e bound to our chassis#033[00m Nov 23 05:04:22 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:22.100 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2fa353bd-02c5-4044-adba-b918030b206e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:04:22 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:22.101 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[55c22863-6c4d-4f0a-a928-78339ce4acc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:04:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:22.110 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:20Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=27ad7b49-380b-4823-9acc-84d37fa1a881, ip_allocation=immediate, mac_address=fa:16:3e:3f:cd:97, name=tempest-PortsTestJSON-278306433, network_id=accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, port_security_enabled=True, project_id=6eb850a1541d4942b249428ef6092e5e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['cadd5356-9a8d-419a-ac04-589c2522a695'], standard_attr_id=2545, status=DOWN, tags=[], 
tenant_id=6eb850a1541d4942b249428ef6092e5e, updated_at=2025-11-23T10:04:21Z on network accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e#033[00m Nov 23 05:04:22 localhost ovn_controller[154788]: 2025-11-23T10:04:22Z|00496|binding|INFO|Setting lport 1ae8d8fe-5156-4993-b482-c4b01b921e85 ovn-installed in OVS Nov 23 05:04:22 localhost ovn_controller[154788]: 2025-11-23T10:04:22Z|00497|binding|INFO|Setting lport 1ae8d8fe-5156-4993-b482-c4b01b921e85 up in Southbound Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.132 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.168 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.205 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.221 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.239 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.241 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.242 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): 
ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:04:22 localhost dnsmasq[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/addn_hosts - 1 addresses Nov 23 05:04:22 localhost podman[331097]: 2025-11-23 10:04:22.347000519 +0000 UTC m=+0.051671017 container kill 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 23 05:04:22 localhost dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/host Nov 23 05:04:22 localhost dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/opts Nov 23 05:04:22 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:04:22 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/299102806' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.691 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:04:22 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e167 e167: 6 total, 6 up, 6 in Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.757 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.757 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.992 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.994 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11083MB free_disk=41.70033645629883GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": 
"7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.995 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:04:22 localhost nova_compute[281952]: 2025-11-23 10:04:22.995 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:04:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:23.053 263258 INFO neutron.agent.dhcp.agent [None req-a93a8cc3-d826-4dd1-86d8-7c1d7a7ec395 - - - - - -] DHCP configuration for ports {'27ad7b49-380b-4823-9acc-84d37fa1a881'} is completed#033[00m Nov 23 05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.090 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.091 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.091 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.117 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 23 05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.144 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 23 
05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.144 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 23 05:04:23 localhost podman[331180]: Nov 23 05:04:23 localhost podman[331180]: 2025-11-23 10:04:23.166687951 +0000 UTC m=+0.071549237 container create 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 23 05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.178 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 23 05:04:23 localhost systemd[1]: Started libpod-conmon-85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614.scope. 
Nov 23 05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.205 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_SSSE3,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 23 05:04:23 localhost systemd[1]: tmp-crun.eL97H9.mount: Deactivated successfully. Nov 23 05:04:23 localhost systemd[1]: Started libcrun container. 
Nov 23 05:04:23 localhost podman[331180]: 2025-11-23 10:04:23.124955324 +0000 UTC m=+0.029816640 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:04:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf0b02e359ac1721f19ba2c6fc86978494592aa5ddd27403653b646396ae9cc9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:04:23 localhost podman[331180]: 2025-11-23 10:04:23.235779502 +0000 UTC m=+0.140640788 container init 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3) Nov 23 05:04:23 localhost podman[331180]: 2025-11-23 10:04:23.245043961 +0000 UTC m=+0.149905257 container start 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:04:23 localhost dnsmasq[331198]: started, version 2.85 cachesize 150 Nov 23 05:04:23 localhost dnsmasq[331198]: DNS service limited to local subnets Nov 23 05:04:23 localhost dnsmasq[331198]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:04:23 localhost dnsmasq[331198]: warning: no upstream servers configured Nov 23 05:04:23 localhost dnsmasq-dhcp[331198]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d Nov 23 05:04:23 localhost dnsmasq[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/addn_hosts - 0 addresses Nov 23 05:04:23 localhost dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/host Nov 23 05:04:23 localhost dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/opts Nov 23 05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.281 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:04:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:23.287 263258 INFO neutron.agent.dhcp.agent [None req-7dd5ffbc-8c5b-42f2-812d-a91f1617a370 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:21Z, description=, device_id=6f4f97fd-43ff-46f9-88a9-6d80baeae99e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0fe884b5-722e-4d25-b56d-fbb9db7a0a8a, ip_allocation=immediate, mac_address=fa:16:3e:10:68:e0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:18Z, description=, dns_domain=, id=2fa353bd-02c5-4044-adba-b918030b206e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-781167405, 
port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46422, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2539, status=ACTIVE, subnets=['4b70b84e-2158-4880-972f-dc112d8544d2'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:20Z, vlan_transparent=None, network_id=2fa353bd-02c5-4044-adba-b918030b206e, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2548, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:21Z on network 2fa353bd-02c5-4044-adba-b918030b206e#033[00m Nov 23 05:04:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:04:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:23.414 263258 INFO neutron.agent.dhcp.agent [None req-58d8e269-1697-4c21-82a2-1372a0442ded - - - - - -] DHCP configuration for ports {'2145a468-64d1-418f-ba24-1e61ff91c9e8'} is completed#033[00m Nov 23 05:04:23 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:23.449 2 INFO neutron.agent.securitygroups_rpc [None req-360628bd-6ea7-46e4-a35e-1747acc2d18c 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:04:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:23.467 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:20Z, description=, 
device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=27ad7b49-380b-4823-9acc-84d37fa1a881, ip_allocation=immediate, mac_address=fa:16:3e:3f:cd:97, name=tempest-PortsTestJSON-278306433, network_id=accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, port_security_enabled=True, project_id=6eb850a1541d4942b249428ef6092e5e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['cadd5356-9a8d-419a-ac04-589c2522a695'], standard_attr_id=2545, status=DOWN, tags=[], tenant_id=6eb850a1541d4942b249428ef6092e5e, updated_at=2025-11-23T10:04:23Z on network accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e#033[00m Nov 23 05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.513 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:23 localhost dnsmasq[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/addn_hosts - 1 addresses Nov 23 05:04:23 localhost dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/host Nov 23 05:04:23 localhost podman[331233]: 2025-11-23 10:04:23.547040549 +0000 UTC m=+0.112213352 container kill 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 23 05:04:23 localhost dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/opts Nov 23 05:04:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 
10:04:23.723 263258 INFO neutron.agent.dhcp.agent [None req-7dd5ffbc-8c5b-42f2-812d-a91f1617a370 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:21Z, description=, device_id=6f4f97fd-43ff-46f9-88a9-6d80baeae99e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0fe884b5-722e-4d25-b56d-fbb9db7a0a8a, ip_allocation=immediate, mac_address=fa:16:3e:10:68:e0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:18Z, description=, dns_domain=, id=2fa353bd-02c5-4044-adba-b918030b206e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-781167405, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46422, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2539, status=ACTIVE, subnets=['4b70b84e-2158-4880-972f-dc112d8544d2'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:20Z, vlan_transparent=None, network_id=2fa353bd-02c5-4044-adba-b918030b206e, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2548, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:21Z on network 2fa353bd-02c5-4044-adba-b918030b206e#033[00m Nov 23 05:04:23 localhost dnsmasq[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/addn_hosts - 2 addresses Nov 23 05:04:23 localhost dnsmasq-dhcp[330981]: read 
/var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/host Nov 23 05:04:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e168 e168: 6 total, 6 up, 6 in Nov 23 05:04:23 localhost dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/opts Nov 23 05:04:23 localhost podman[331269]: 2025-11-23 10:04:23.730382421 +0000 UTC m=+0.060427722 container kill 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 23 05:04:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:04:23 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/313014762' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.776 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.785 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.817 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.821 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 05:04:23 localhost nova_compute[281952]: 2025-11-23 10:04:23.822 281956 DEBUG 
oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.826s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:04:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:23.873 263258 INFO neutron.agent.dhcp.agent [None req-66d0d4ff-854f-48ce-9bee-0c9d10a9a941 - - - - - -] DHCP configuration for ports {'0fe884b5-722e-4d25-b56d-fbb9db7a0a8a'} is completed#033[00m Nov 23 05:04:23 localhost dnsmasq[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/addn_hosts - 1 addresses Nov 23 05:04:23 localhost dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/host Nov 23 05:04:23 localhost podman[331307]: 2025-11-23 10:04:23.911312062 +0000 UTC m=+0.062976219 container kill 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:04:23 localhost dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/opts Nov 23 05:04:23 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:23.984 2 INFO neutron.agent.securitygroups_rpc [None req-53980d87-428d-4527-8575-9963b178026f 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:04:23 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:23.988 
263258 INFO neutron.agent.dhcp.agent [None req-be57a20f-d712-45c5-97f5-2a326ff96e22 - - - - - -] DHCP configuration for ports {'27ad7b49-380b-4823-9acc-84d37fa1a881'} is completed#033[00m Nov 23 05:04:24 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:24.161 263258 INFO neutron.agent.dhcp.agent [None req-0e7a1307-7a69-45b0-a8d7-fd2a09ad26d5 - - - - - -] DHCP configuration for ports {'0fe884b5-722e-4d25-b56d-fbb9db7a0a8a'} is completed#033[00m Nov 23 05:04:24 localhost dnsmasq[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/addn_hosts - 0 addresses Nov 23 05:04:24 localhost podman[331348]: 2025-11-23 10:04:24.259537081 +0000 UTC m=+0.056036589 container kill 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:04:24 localhost dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/host Nov 23 05:04:24 localhost dnsmasq-dhcp[330981]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/opts Nov 23 05:04:25 localhost dnsmasq[330981]: exiting on receipt of SIGTERM Nov 23 05:04:25 localhost podman[331386]: 2025-11-23 10:04:25.129042833 +0000 UTC m=+0.076772123 container kill 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:04:25 localhost systemd[1]: libpod-56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498.scope: Deactivated successfully. Nov 23 05:04:25 localhost systemd[1]: tmp-crun.Rj1YSU.mount: Deactivated successfully. Nov 23 05:04:25 localhost podman[331399]: 2025-11-23 10:04:25.203844517 +0000 UTC m=+0.057723081 container died 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:04:25 localhost podman[331399]: 2025-11-23 10:04:25.24314479 +0000 UTC m=+0.097023304 container cleanup 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:04:25 localhost systemd[1]: libpod-conmon-56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498.scope: Deactivated successfully. 
Nov 23 05:04:25 localhost podman[331400]: 2025-11-23 10:04:25.281556458 +0000 UTC m=+0.130829062 container remove 56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 05:04:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e169 e169: 6 total, 6 up, 6 in Nov 23 05:04:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:04:25 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3854060828' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:04:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:04:25 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3854060828' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:04:26 localhost podman[331475]: Nov 23 05:04:26 localhost systemd[1]: var-lib-containers-storage-overlay-9e452c8f92d1256ad9f801b589d5525f70ce43f85a0d2fd79bc7391c3f9d88b0-merged.mount: Deactivated successfully. Nov 23 05:04:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-56264562a55c9338b18378548c9a80c692b9d47225ae5f13ba1df351a679e498-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:04:26 localhost podman[331475]: 2025-11-23 10:04:26.172922628 +0000 UTC m=+0.075158465 container create 8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 05:04:26 localhost systemd[1]: Started libpod-conmon-8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91.scope. Nov 23 05:04:26 localhost systemd[1]: Started libcrun container. Nov 23 05:04:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b39ba390b209ce21a98a569539df45ea77948795ea2108691055c1abc7e9596/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:04:26 localhost podman[331475]: 2025-11-23 10:04:26.233980148 +0000 UTC m=+0.136215985 container init 8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:04:26 localhost podman[331475]: 2025-11-23 10:04:26.141130671 +0000 UTC m=+0.043366568 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:04:26 localhost podman[331475]: 2025-11-23 
10:04:26.241117332 +0000 UTC m=+0.143353169 container start 8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 23 05:04:26 localhost dnsmasq[331493]: started, version 2.85 cachesize 150 Nov 23 05:04:26 localhost dnsmasq[331493]: DNS service limited to local subnets Nov 23 05:04:26 localhost dnsmasq[331493]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:04:26 localhost dnsmasq[331493]: warning: no upstream servers configured Nov 23 05:04:26 localhost dnsmasq-dhcp[331493]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:04:26 localhost dnsmasq[331493]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/addn_hosts - 0 addresses Nov 23 05:04:26 localhost dnsmasq-dhcp[331493]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/host Nov 23 05:04:26 localhost dnsmasq-dhcp[331493]: read /var/lib/neutron/dhcp/accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e/opts Nov 23 05:04:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:26.431 160439 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port d51081f3-52bd-41f7-b871-b291aa0ee588 with type ""#033[00m Nov 23 05:04:26 localhost ovn_controller[154788]: 2025-11-23T10:04:26Z|00498|binding|INFO|Removing iface tapa41f1fd9-25 ovn-installed in OVS Nov 23 05:04:26 localhost ovn_controller[154788]: 2025-11-23T10:04:26Z|00499|binding|INFO|Removing lport 
a41f1fd9-25d7-4c85-96e2-18d396386762 ovn-installed in OVS Nov 23 05:04:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:26.433 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a81c63d1-c197-41eb-93f7-be983c9ed80d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a41f1fd9-25d7-4c85-96e2-18d396386762) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:04:26 localhost nova_compute[281952]: 2025-11-23 10:04:26.434 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:26.436 160439 INFO neutron.agent.ovn.metadata.agent [-] Port a41f1fd9-25d7-4c85-96e2-18d396386762 in datapath accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e unbound from our chassis#033[00m Nov 23 
05:04:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:26.439 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:04:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:26.440 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[24903e6b-9ff5-4992-8265-fa3e6164b2b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:04:26 localhost nova_compute[281952]: 2025-11-23 10:04:26.442 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:26 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:26.514 263258 INFO neutron.agent.dhcp.agent [None req-9bb08951-2c68-4617-9549-c8440aab8b5e - - - - - -] DHCP configuration for ports {'a41f1fd9-25d7-4c85-96e2-18d396386762', 'b87e3e64-b6cc-4f08-95a6-de593e031494'} is completed#033[00m Nov 23 05:04:26 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:26.613 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:26Z, description=, device_id=bace1285-0dd7-4599-87e6-ae783a5add31, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1f8399a1-cba9-45be-90c1-08ef6a05cb9b, ip_allocation=immediate, mac_address=fa:16:3e:56:06:f8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:14Z, description=, dns_domain=, id=556c5c2e-c414-4271-8e77-d61a599ccbad, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, 
mtu=1442, name=tempest-router-network01--1079361768, port_security_enabled=True, project_id=d8633d61c76748a7a900f3c8cea84ef3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=5645, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2519, status=ACTIVE, subnets=['4feaee24-e671-42a0-b0a6-e401ae11bc88'], tags=[], tenant_id=d8633d61c76748a7a900f3c8cea84ef3, updated_at=2025-11-23T10:04:18Z, vlan_transparent=None, network_id=556c5c2e-c414-4271-8e77-d61a599ccbad, port_security_enabled=False, project_id=d8633d61c76748a7a900f3c8cea84ef3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2556, status=DOWN, tags=[], tenant_id=d8633d61c76748a7a900f3c8cea84ef3, updated_at=2025-11-23T10:04:26Z on network 556c5c2e-c414-4271-8e77-d61a599ccbad#033[00m Nov 23 05:04:26 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:26.658 263258 INFO neutron.agent.linux.ip_lib [None req-807830b7-35e3-4070-8731-d3370c53b145 - - - - - -] Device tap8437b436-67 cannot be used as it has no MAC address#033[00m Nov 23 05:04:26 localhost nova_compute[281952]: 2025-11-23 10:04:26.686 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:26 localhost kernel: device tap8437b436-67 entered promiscuous mode Nov 23 05:04:26 localhost NetworkManager[5975]: [1763892266.6931] manager: (tap8437b436-67): new Generic device (/org/freedesktop/NetworkManager/Devices/81) Nov 23 05:04:26 localhost nova_compute[281952]: 2025-11-23 10:04:26.694 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:26 localhost ovn_controller[154788]: 2025-11-23T10:04:26Z|00500|binding|INFO|Claiming lport 8437b436-673f-4dcd-8ddd-08df6599f4ab for this chassis. 
Nov 23 05:04:26 localhost ovn_controller[154788]: 2025-11-23T10:04:26Z|00501|binding|INFO|8437b436-673f-4dcd-8ddd-08df6599f4ab: Claiming unknown Nov 23 05:04:26 localhost systemd-udevd[331541]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:04:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:26.705 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-1afb3f8c-e188-49e0-b864-15c6a95b0fb6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1afb3f8c-e188-49e0-b864-15c6a95b0fb6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e6f3ebd-dc7a-479e-ba09-dcee6cc4d506, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8437b436-673f-4dcd-8ddd-08df6599f4ab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:04:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:26.707 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8437b436-673f-4dcd-8ddd-08df6599f4ab in datapath 1afb3f8c-e188-49e0-b864-15c6a95b0fb6 bound to our chassis#033[00m Nov 23 05:04:26 localhost ovn_metadata_agent[160434]: 
2025-11-23 10:04:26.708 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1afb3f8c-e188-49e0-b864-15c6a95b0fb6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:04:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:26.708 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c53f9f25-b45a-4384-86e0-2a283e88071e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:04:26 localhost ovn_controller[154788]: 2025-11-23T10:04:26Z|00502|binding|INFO|Setting lport 8437b436-673f-4dcd-8ddd-08df6599f4ab ovn-installed in OVS Nov 23 05:04:26 localhost ovn_controller[154788]: 2025-11-23T10:04:26Z|00503|binding|INFO|Setting lport 8437b436-673f-4dcd-8ddd-08df6599f4ab up in Southbound Nov 23 05:04:26 localhost nova_compute[281952]: 2025-11-23 10:04:26.744 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:26 localhost dnsmasq[331493]: exiting on receipt of SIGTERM Nov 23 05:04:26 localhost podman[331517]: 2025-11-23 10:04:26.753038763 +0000 UTC m=+0.106032425 container kill 8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:04:26 localhost systemd[1]: libpod-8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91.scope: Deactivated 
successfully. Nov 23 05:04:26 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e170 e170: 6 total, 6 up, 6 in Nov 23 05:04:26 localhost nova_compute[281952]: 2025-11-23 10:04:26.799 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:26 localhost nova_compute[281952]: 2025-11-23 10:04:26.822 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:26 localhost podman[331555]: 2025-11-23 10:04:26.829030472 +0000 UTC m=+0.065570206 container died 8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 05:04:26 localhost podman[331555]: 2025-11-23 10:04:26.855380016 +0000 UTC m=+0.091919760 container cleanup 8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:04:26 localhost systemd[1]: libpod-conmon-8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91.scope: 
Deactivated successfully. Nov 23 05:04:26 localhost podman[331557]: 2025-11-23 10:04:26.917530438 +0000 UTC m=+0.144204645 container remove 8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-accb7a47-b6e0-4b0c-81bc-2cbf0d4d0f4e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118) Nov 23 05:04:26 localhost dnsmasq[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/addn_hosts - 1 addresses Nov 23 05:04:26 localhost dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/host Nov 23 05:04:26 localhost podman[331574]: 2025-11-23 10:04:26.971860375 +0000 UTC m=+0.168859688 container kill 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 05:04:26 localhost dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/opts Nov 23 05:04:26 localhost kernel: device tapa41f1fd9-25 left promiscuous mode Nov 23 05:04:26 localhost nova_compute[281952]: 2025-11-23 10:04:26.973 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:26 localhost nova_compute[281952]: 2025-11-23 10:04:26.986 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:27 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:27.004 263258 INFO neutron.agent.dhcp.agent [None req-7a00c611-80b1-46ca-938b-33e21d680f4d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:04:27 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:27.005 263258 INFO neutron.agent.dhcp.agent [None req-7a00c611-80b1-46ca-938b-33e21d680f4d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:04:27 localhost systemd[1]: var-lib-containers-storage-overlay-0b39ba390b209ce21a98a569539df45ea77948795ea2108691055c1abc7e9596-merged.mount: Deactivated successfully. Nov 23 05:04:27 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e60e9084357b4f755d629ced20c89189e08d79f2807ef37347af09626036a91-userdata-shm.mount: Deactivated successfully. Nov 23 05:04:27 localhost systemd[1]: run-netns-qdhcp\x2daccb7a47\x2db6e0\x2d4b0c\x2d81bc\x2d2cbf0d4d0f4e.mount: Deactivated successfully. Nov 23 05:04:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:04:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:04:27 localhost nova_compute[281952]: 2025-11-23 10:04:27.226 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:27 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:27.271 263258 INFO neutron.agent.dhcp.agent [None req-7157e83e-185d-404d-8c1e-3c98a58b564b - - - - - -] DHCP configuration for ports {'1f8399a1-cba9-45be-90c1-08ef6a05cb9b'} is completed#033[00m Nov 23 05:04:27 localhost systemd[1]: tmp-crun.r3wCs0.mount: Deactivated successfully. Nov 23 05:04:27 localhost ovn_controller[154788]: 2025-11-23T10:04:27Z|00504|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:04:27 localhost podman[331623]: 2025-11-23 10:04:27.312187057 +0000 UTC m=+0.107555872 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:04:27 localhost podman[331623]: 2025-11-23 10:04:27.328279681 +0000 UTC m=+0.123648486 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:04:27 localhost nova_compute[281952]: 2025-11-23 10:04:27.333 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:27 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 05:04:27 localhost podman[331625]: 2025-11-23 10:04:27.415786328 +0000 UTC m=+0.208277656 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 
'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 05:04:27 localhost podman[331625]: 2025-11-23 10:04:27.423429367 +0000 UTC m=+0.215920665 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 05:04:27 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 05:04:27 localhost podman[331696]: Nov 23 05:04:27 localhost podman[331696]: 2025-11-23 10:04:27.743294222 +0000 UTC m=+0.079949578 container create b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 23 05:04:27 localhost systemd[1]: Started libpod-conmon-b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457.scope. Nov 23 05:04:27 localhost podman[331696]: 2025-11-23 10:04:27.701017759 +0000 UTC m=+0.037673095 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:04:27 localhost systemd[1]: Started libcrun container. 
Nov 23 05:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b86b4f9fba21239c4ef1c266e92b520c83208906cd248f3a235bb0ae25f61f0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:04:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e171 e171: 6 total, 6 up, 6 in Nov 23 05:04:27 localhost podman[331696]: 2025-11-23 10:04:27.81989319 +0000 UTC m=+0.156548506 container init b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 05:04:27 localhost podman[331696]: 2025-11-23 10:04:27.829119348 +0000 UTC m=+0.165774674 container start b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 05:04:27 localhost dnsmasq[331714]: started, version 2.85 cachesize 150 Nov 23 05:04:27 localhost dnsmasq[331714]: DNS service limited to local subnets Nov 23 05:04:27 localhost dnsmasq[331714]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC 
loop-detect inotify dumpfile Nov 23 05:04:27 localhost dnsmasq[331714]: warning: no upstream servers configured Nov 23 05:04:27 localhost dnsmasq-dhcp[331714]: DHCPv6, static leases only on 2001:db8:2::, lease time 1d Nov 23 05:04:27 localhost dnsmasq[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/addn_hosts - 0 addresses Nov 23 05:04:27 localhost dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/host Nov 23 05:04:27 localhost dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/opts Nov 23 05:04:27 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:27.904 263258 INFO neutron.agent.dhcp.agent [None req-807830b7-35e3-4070-8731-d3370c53b145 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:26Z, description=, device_id=6f4f97fd-43ff-46f9-88a9-6d80baeae99e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4b22a564-08c0-4921-9bac-e5cb75632958, ip_allocation=immediate, mac_address=fa:16:3e:27:c4:53, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:23Z, description=, dns_domain=, id=1afb3f8c-e188-49e0-b864-15c6a95b0fb6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-543864074, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10320, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2551, status=ACTIVE, subnets=['a5d8fd54-c4bc-4361-bb44-fb672867e441'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:24Z, 
vlan_transparent=None, network_id=1afb3f8c-e188-49e0-b864-15c6a95b0fb6, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2557, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:26Z on network 1afb3f8c-e188-49e0-b864-15c6a95b0fb6#033[00m Nov 23 05:04:28 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:28.034 263258 INFO neutron.agent.dhcp.agent [None req-ff340251-4043-4f77-b2d9-5fef3b0067bb - - - - - -] DHCP configuration for ports {'bc8ae273-d15b-44d3-8796-f23802f38110'} is completed#033[00m Nov 23 05:04:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e172 e172: 6 total, 6 up, 6 in Nov 23 05:04:28 localhost dnsmasq[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/addn_hosts - 1 addresses Nov 23 05:04:28 localhost dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/host Nov 23 05:04:28 localhost dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/opts Nov 23 05:04:28 localhost podman[331733]: 2025-11-23 10:04:28.082573243 +0000 UTC m=+0.054654198 container kill b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:04:28 localhost systemd[1]: tmp-crun.T6F1Lh.mount: Deactivated successfully. 
Nov 23 05:04:28 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:28.284 263258 INFO neutron.agent.dhcp.agent [None req-36073ca9-2b12-435f-810c-3fca4b7b4fe8 - - - - - -] DHCP configuration for ports {'4b22a564-08c0-4921-9bac-e5cb75632958'} is completed#033[00m Nov 23 05:04:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e172 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 23 05:04:28 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:28.722 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:26Z, description=, device_id=bace1285-0dd7-4599-87e6-ae783a5add31, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1f8399a1-cba9-45be-90c1-08ef6a05cb9b, ip_allocation=immediate, mac_address=fa:16:3e:56:06:f8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:14Z, description=, dns_domain=, id=556c5c2e-c414-4271-8e77-d61a599ccbad, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1079361768, port_security_enabled=True, project_id=d8633d61c76748a7a900f3c8cea84ef3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=5645, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2519, status=ACTIVE, subnets=['4feaee24-e671-42a0-b0a6-e401ae11bc88'], tags=[], tenant_id=d8633d61c76748a7a900f3c8cea84ef3, updated_at=2025-11-23T10:04:18Z, vlan_transparent=None, network_id=556c5c2e-c414-4271-8e77-d61a599ccbad, port_security_enabled=False, project_id=d8633d61c76748a7a900f3c8cea84ef3, qos_network_policy_id=None, qos_policy_id=None, 
resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2556, status=DOWN, tags=[], tenant_id=d8633d61c76748a7a900f3c8cea84ef3, updated_at=2025-11-23T10:04:26Z on network 556c5c2e-c414-4271-8e77-d61a599ccbad#033[00m Nov 23 05:04:28 localhost systemd[1]: tmp-crun.Q7r31W.mount: Deactivated successfully. Nov 23 05:04:28 localhost podman[331772]: 2025-11-23 10:04:28.96018903 +0000 UTC m=+0.078354742 container kill 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:04:28 localhost dnsmasq[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/addn_hosts - 1 addresses Nov 23 05:04:28 localhost dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/host Nov 23 05:04:28 localhost dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/opts Nov 23 05:04:29 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:29.000 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:26Z, description=, device_id=6f4f97fd-43ff-46f9-88a9-6d80baeae99e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4b22a564-08c0-4921-9bac-e5cb75632958, ip_allocation=immediate, mac_address=fa:16:3e:27:c4:53, name=, 
network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:23Z, description=, dns_domain=, id=1afb3f8c-e188-49e0-b864-15c6a95b0fb6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-543864074, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10320, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2551, status=ACTIVE, subnets=['a5d8fd54-c4bc-4361-bb44-fb672867e441'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:24Z, vlan_transparent=None, network_id=1afb3f8c-e188-49e0-b864-15c6a95b0fb6, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2557, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:26Z on network 1afb3f8c-e188-49e0-b864-15c6a95b0fb6#033[00m Nov 23 05:04:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e173 e173: 6 total, 6 up, 6 in Nov 23 05:04:29 localhost dnsmasq[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/addn_hosts - 1 addresses Nov 23 05:04:29 localhost dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/host Nov 23 05:04:29 localhost systemd[1]: tmp-crun.NG8Xfl.mount: Deactivated successfully. 
Nov 23 05:04:29 localhost dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/opts Nov 23 05:04:29 localhost podman[331811]: 2025-11-23 10:04:29.208386586 +0000 UTC m=+0.055402660 container kill b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:04:29 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:29.237 263258 INFO neutron.agent.dhcp.agent [None req-4d2e13da-829e-4417-998e-40b0ddf3acf8 - - - - - -] DHCP configuration for ports {'1f8399a1-cba9-45be-90c1-08ef6a05cb9b'} is completed#033[00m Nov 23 05:04:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:04:29 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1866303281' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:04:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:04:29 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1866303281' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:04:29 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:29.487 263258 INFO neutron.agent.dhcp.agent [None req-5f7c91b5-6561-4ae1-966b-5fd577b64a7a - - - - - -] DHCP configuration for ports {'4b22a564-08c0-4921-9bac-e5cb75632958'} is completed#033[00m Nov 23 05:04:29 localhost openstack_network_exporter[242668]: ERROR 10:04:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:04:29 localhost openstack_network_exporter[242668]: ERROR 10:04:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:04:29 localhost openstack_network_exporter[242668]: ERROR 10:04:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:04:29 localhost openstack_network_exporter[242668]: ERROR 10:04:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:04:29 localhost openstack_network_exporter[242668]: Nov 23 05:04:29 localhost openstack_network_exporter[242668]: ERROR 10:04:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:04:29 localhost openstack_network_exporter[242668]: Nov 23 05:04:31 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 23 05:04:31 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3039705748' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 23 05:04:31 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e174 e174: 6 total, 6 up, 6 in Nov 23 05:04:31 localhost podman[331849]: 2025-11-23 10:04:31.502546223 +0000 UTC m=+0.066008889 container kill 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:04:31 localhost dnsmasq[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/addn_hosts - 0 addresses Nov 23 05:04:31 localhost dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/host Nov 23 05:04:31 localhost dnsmasq-dhcp[331054]: read /var/lib/neutron/dhcp/556c5c2e-c414-4271-8e77-d61a599ccbad/opts Nov 23 05:04:31 localhost ovn_controller[154788]: 2025-11-23T10:04:31Z|00505|binding|INFO|Releasing lport 39f9520b-e67f-48de-9178-ad0c9c37f804 from this chassis (sb_readonly=0) Nov 23 05:04:31 localhost ovn_controller[154788]: 2025-11-23T10:04:31Z|00506|binding|INFO|Setting lport 39f9520b-e67f-48de-9178-ad0c9c37f804 down in Southbound Nov 23 05:04:31 localhost nova_compute[281952]: 2025-11-23 10:04:31.701 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:31 localhost kernel: device tap39f9520b-e6 left promiscuous mode Nov 23 05:04:31 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:31.709 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched 
UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-556c5c2e-c414-4271-8e77-d61a599ccbad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-556c5c2e-c414-4271-8e77-d61a599ccbad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd8633d61c76748a7a900f3c8cea84ef3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de783514-b079-4988-843f-abee02f82863, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=39f9520b-e67f-48de-9178-ad0c9c37f804) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:04:31 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:31.711 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 39f9520b-e67f-48de-9178-ad0c9c37f804 in datapath 556c5c2e-c414-4271-8e77-d61a599ccbad unbound from our chassis#033[00m Nov 23 05:04:31 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:31.714 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 556c5c2e-c414-4271-8e77-d61a599ccbad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:04:31 localhost 
ovn_metadata_agent[160434]: 2025-11-23 10:04:31.715 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[4b06c757-f6e2-4585-a9e9-8dc38d9ae988]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:04:31 localhost nova_compute[281952]: 2025-11-23 10:04:31.725 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:32 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e175 e175: 6 total, 6 up, 6 in Nov 23 05:04:32 localhost nova_compute[281952]: 2025-11-23 10:04:32.257 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:32 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:32.615 2 INFO neutron.agent.securitygroups_rpc [None req-fa166c8d-7d85-4b6f-949c-c3bef6490854 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:04:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:32.633 263258 INFO neutron.agent.linux.ip_lib [None req-c1584d16-a8a6-41a5-babf-2c48d2c69e5c - - - - - -] Device tap2aa3f527-d9 cannot be used as it has no MAC address#033[00m Nov 23 05:04:32 localhost nova_compute[281952]: 2025-11-23 10:04:32.658 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:32 localhost kernel: device tap2aa3f527-d9 entered promiscuous mode Nov 23 05:04:32 localhost NetworkManager[5975]: [1763892272.6652] manager: (tap2aa3f527-d9): new Generic device (/org/freedesktop/NetworkManager/Devices/82) Nov 23 05:04:32 localhost ovn_controller[154788]: 2025-11-23T10:04:32Z|00507|binding|INFO|Claiming lport 2aa3f527-d9c7-4028-9d39-988134131b8f for this chassis. 
Nov 23 05:04:32 localhost ovn_controller[154788]: 2025-11-23T10:04:32Z|00508|binding|INFO|2aa3f527-d9c7-4028-9d39-988134131b8f: Claiming unknown
Nov 23 05:04:32 localhost nova_compute[281952]: 2025-11-23 10:04:32.666 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:04:32 localhost systemd-udevd[331882]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 05:04:32 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:32.688 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:3::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-43058c00-cde7-48e1-8e0e-eba5f793c5ac', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43058c00-cde7-48e1-8e0e-eba5f793c5ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db97bbb5-462a-4c5e-a728-28756b766887, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2aa3f527-d9c7-4028-9d39-988134131b8f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 05:04:32 localhost ovn_controller[154788]: 2025-11-23T10:04:32Z|00509|binding|INFO|Setting lport 2aa3f527-d9c7-4028-9d39-988134131b8f ovn-installed in OVS
Nov 23 05:04:32 localhost ovn_controller[154788]: 2025-11-23T10:04:32Z|00510|binding|INFO|Setting lport 2aa3f527-d9c7-4028-9d39-988134131b8f up in Southbound
Nov 23 05:04:32 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:32.690 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 2aa3f527-d9c7-4028-9d39-988134131b8f in datapath 43058c00-cde7-48e1-8e0e-eba5f793c5ac bound to our chassis
Nov 23 05:04:32 localhost nova_compute[281952]: 2025-11-23 10:04:32.690 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:04:32 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:32.692 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43058c00-cde7-48e1-8e0e-eba5f793c5ac or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 05:04:32 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:32.693 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[a0bc61d4-4a57-4f22-892d-c3c4e251cc47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 05:04:32 localhost nova_compute[281952]: 2025-11-23 10:04:32.706 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:04:32 localhost nova_compute[281952]: 2025-11-23 10:04:32.713 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:04:32 localhost nova_compute[281952]: 2025-11-23 10:04:32.757 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:04:32 localhost nova_compute[281952]: 2025-11-23 10:04:32.785 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:04:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e176 e176: 6 total, 6 up, 6 in
Nov 23 05:04:33 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:33.350 2 INFO neutron.agent.securitygroups_rpc [None req-4ffb6c00-2d92-41b8-843d-89b3bf39eddb 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 05:04:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 05:04:33 localhost podman[331937]:
Nov 23 05:04:33 localhost podman[331937]: 2025-11-23 10:04:33.628528974 +0000 UTC m=+0.093011802 container create 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 23 05:04:33 localhost systemd[1]: Started libpod-conmon-99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1.scope.
Nov 23 05:04:33 localhost podman[331937]: 2025-11-23 10:04:33.581345423 +0000 UTC m=+0.045828291 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 05:04:33 localhost systemd[1]: Started libcrun container.
Nov 23 05:04:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6027bb6fd654c4a054428fb77e5d5ffcb12dbfa91fa97836ded173ca06b20034/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 05:04:33 localhost podman[331937]: 2025-11-23 10:04:33.708595586 +0000 UTC m=+0.173078414 container init 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 05:04:33 localhost podman[331937]: 2025-11-23 10:04:33.720452153 +0000 UTC m=+0.184934981 container start 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 23 05:04:33 localhost dnsmasq[331955]: started, version 2.85 cachesize 150
Nov 23 05:04:33 localhost dnsmasq[331955]: DNS service limited to local subnets
Nov 23 05:04:33 localhost dnsmasq[331955]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 05:04:33 localhost dnsmasq[331955]: warning: no upstream servers configured
Nov 23 05:04:33 localhost dnsmasq-dhcp[331955]: DHCPv6, static leases only on 2001:db8:3::, lease time 1d
Nov 23 05:04:33 localhost dnsmasq[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/addn_hosts - 0 addresses
Nov 23 05:04:33 localhost dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/host
Nov 23 05:04:33 localhost dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/opts
Nov 23 05:04:33 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:33.782 263258 INFO neutron.agent.dhcp.agent [None req-c1584d16-a8a6-41a5-babf-2c48d2c69e5c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:32Z, description=, device_id=6f4f97fd-43ff-46f9-88a9-6d80baeae99e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b15dac7f-40d8-4298-9412-2390f2fb7bcd, ip_allocation=immediate, mac_address=fa:16:3e:d4:02:3c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:30Z, description=, dns_domain=, id=43058c00-cde7-48e1-8e0e-eba5f793c5ac, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-123647688, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46466, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2568, status=ACTIVE, subnets=['6842a301-3e2f-49bd-9e15-3d3ecfcc3da6'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:31Z, vlan_transparent=None, network_id=43058c00-cde7-48e1-8e0e-eba5f793c5ac, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2580, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:32Z on network 43058c00-cde7-48e1-8e0e-eba5f793c5ac
Nov 23 05:04:33 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:33.922 263258 INFO neutron.agent.dhcp.agent [None req-18278e1a-c29a-4847-acbc-f48cf80aab68 - - - - - -] DHCP configuration for ports {'8ddabb54-5eeb-4483-ac10-999923ac4961'} is completed
Nov 23 05:04:33 localhost dnsmasq[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/addn_hosts - 1 addresses
Nov 23 05:04:33 localhost dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/host
Nov 23 05:04:33 localhost podman[331972]: 2025-11-23 10:04:33.993974613 +0000 UTC m=+0.063753642 container kill 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team)
Nov 23 05:04:33 localhost dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/opts
Nov 23 05:04:34 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:34.148 263258 INFO neutron.agent.dhcp.agent [None req-c1584d16-a8a6-41a5-babf-2c48d2c69e5c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:32Z, description=, device_id=6f4f97fd-43ff-46f9-88a9-6d80baeae99e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b15dac7f-40d8-4298-9412-2390f2fb7bcd, ip_allocation=immediate, mac_address=fa:16:3e:d4:02:3c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:30Z, description=, dns_domain=, id=43058c00-cde7-48e1-8e0e-eba5f793c5ac, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-123647688, port_security_enabled=True, project_id=cd27ceae55c44d478998092e7554fd8a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46466, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2568, status=ACTIVE, subnets=['6842a301-3e2f-49bd-9e15-3d3ecfcc3da6'], tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:31Z, vlan_transparent=None, network_id=43058c00-cde7-48e1-8e0e-eba5f793c5ac, port_security_enabled=False, project_id=cd27ceae55c44d478998092e7554fd8a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2580, status=DOWN, tags=[], tenant_id=cd27ceae55c44d478998092e7554fd8a, updated_at=2025-11-23T10:04:32Z on network 43058c00-cde7-48e1-8e0e-eba5f793c5ac
Nov 23 05:04:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e177 e177: 6 total, 6 up, 6 in
Nov 23 05:04:34 localhost sshd[332016]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 05:04:34 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:34.275 263258 INFO neutron.agent.dhcp.agent [None req-fe76be24-48c5-4d74-addb-987c5c113fdf - - - - - -] DHCP configuration for ports {'b15dac7f-40d8-4298-9412-2390f2fb7bcd'} is completed
Nov 23 05:04:34 localhost dnsmasq[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/addn_hosts - 1 addresses
Nov 23 05:04:34 localhost dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/host
Nov 23 05:04:34 localhost dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/opts
Nov 23 05:04:34 localhost podman[332010]: 2025-11-23 10:04:34.318499919 +0000 UTC m=+0.057839864 container kill 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 23 05:04:34 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:34.394 2 INFO neutron.agent.securitygroups_rpc [None req-e699b8a4-5f06-487a-a300-fd9ee1a788a2 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 05:04:34 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:34.533 263258 INFO neutron.agent.dhcp.agent [None req-44400d95-ade1-4ca4-ae1f-a5a5c0e2be84 - - - - - -] DHCP configuration for ports {'b15dac7f-40d8-4298-9412-2390f2fb7bcd'} is completed
Nov 23 05:04:34 localhost systemd[1]: tmp-crun.a905NT.mount: Deactivated successfully.
Nov 23 05:04:35 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:35.170 2 INFO neutron.agent.securitygroups_rpc [None req-519a4a04-72f9-40b4-96af-e4987b6dbb80 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 05:04:35 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e178 e178: 6 total, 6 up, 6 in
Nov 23 05:04:36 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e179 e179: 6 total, 6 up, 6 in
Nov 23 05:04:37 localhost nova_compute[281952]: 2025-11-23 10:04:37.282 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:04:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 05:04:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 05:04:38 localhost podman[332034]: 2025-11-23 10:04:38.024072141 +0000 UTC m=+0.078010670 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 05:04:38 localhost podman[332034]: 2025-11-23 10:04:38.037063003 +0000 UTC m=+0.091001512 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 23 05:04:38 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 05:04:38 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e180 e180: 6 total, 6 up, 6 in
Nov 23 05:04:38 localhost podman[332035]: 2025-11-23 10:04:38.091967957 +0000 UTC m=+0.143194965 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 05:04:38 localhost podman[332035]: 2025-11-23 10:04:38.105308359 +0000 UTC m=+0.156535407 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 23 05:04:38 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 05:04:38 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 05:04:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e181 e181: 6 total, 6 up, 6 in
Nov 23 05:04:39 localhost dnsmasq[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/addn_hosts - 0 addresses
Nov 23 05:04:39 localhost dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/host
Nov 23 05:04:39 localhost podman[332094]: 2025-11-23 10:04:39.618621824 +0000 UTC m=+0.059579296 container kill 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:04:39 localhost dnsmasq-dhcp[331955]: read /var/lib/neutron/dhcp/43058c00-cde7-48e1-8e0e-eba5f793c5ac/opts
Nov 23 05:04:39 localhost nova_compute[281952]: 2025-11-23 10:04:39.846 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:04:39 localhost kernel: device tap2aa3f527-d9 left promiscuous mode
Nov 23 05:04:39 localhost ovn_controller[154788]: 2025-11-23T10:04:39Z|00511|binding|INFO|Releasing lport 2aa3f527-d9c7-4028-9d39-988134131b8f from this chassis (sb_readonly=0)
Nov 23 05:04:39 localhost ovn_controller[154788]: 2025-11-23T10:04:39Z|00512|binding|INFO|Setting lport 2aa3f527-d9c7-4028-9d39-988134131b8f down in Southbound
Nov 23 05:04:39 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:39.858 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:3::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-43058c00-cde7-48e1-8e0e-eba5f793c5ac', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43058c00-cde7-48e1-8e0e-eba5f793c5ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db97bbb5-462a-4c5e-a728-28756b766887, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2aa3f527-d9c7-4028-9d39-988134131b8f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 05:04:39 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:39.860 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 2aa3f527-d9c7-4028-9d39-988134131b8f in datapath 43058c00-cde7-48e1-8e0e-eba5f793c5ac unbound from our chassis
Nov 23 05:04:39 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:39.862 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43058c00-cde7-48e1-8e0e-eba5f793c5ac or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 05:04:39 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:39.863 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[f33b1b69-7141-4c3c-9697-22f55d1f1685]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 05:04:39 localhost nova_compute[281952]: 2025-11-23 10:04:39.878 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:04:39 localhost nova_compute[281952]: 2025-11-23 10:04:39.879 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:04:40 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:40.216 2 INFO neutron.agent.securitygroups_rpc [req-dd0696f0-72c9-4ee8-8c44-5e6e5f5d0a7a req-8f09e20d-9c7b-422f-bb6f-02a4d95509a5 5f7e9736cbc74ce4ac3de51c4ac84504 49ebd7a691dd4ea59ffbe9f5703e77e4 - - default default] Security group member updated ['d77fc436-3ab1-42e0-a52b-861d18fcc237']
Nov 23 05:04:40 localhost sshd[332117]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 05:04:40 localhost dnsmasq[331955]: exiting on receipt of SIGTERM
Nov 23 05:04:40 localhost podman[332135]: 2025-11-23 10:04:40.673915553 +0000 UTC m=+0.062188675 container kill 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 05:04:40 localhost systemd[1]: libpod-99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1.scope: Deactivated successfully.
Nov 23 05:04:40 localhost podman[332147]: 2025-11-23 10:04:40.745839469 +0000 UTC m=+0.059011188 container died 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 05:04:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1-userdata-shm.mount: Deactivated successfully.
Nov 23 05:04:40 localhost podman[332147]: 2025-11-23 10:04:40.785773832 +0000 UTC m=+0.098945501 container cleanup 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 05:04:40 localhost systemd[1]: libpod-conmon-99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1.scope: Deactivated successfully.
Nov 23 05:04:40 localhost podman[332151]: 2025-11-23 10:04:40.827687825 +0000 UTC m=+0.131684968 container remove 99af8ea5a14e7942e27467c3d937b100a68a6966f49b442a8a2ae70af2ad26d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43058c00-cde7-48e1-8e0e-eba5f793c5ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 05:04:41 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:41.267 263258 INFO neutron.agent.dhcp.agent [None req-109eb951-e910-417c-a956-ee0f2e889ebf - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 05:04:41 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e182 e182: 6 total, 6 up, 6 in
Nov 23 05:04:41 localhost systemd[1]: var-lib-containers-storage-overlay-6027bb6fd654c4a054428fb77e5d5ffcb12dbfa91fa97836ded173ca06b20034-merged.mount: Deactivated successfully.
Nov 23 05:04:41 localhost systemd[1]: run-netns-qdhcp\x2d43058c00\x2dcde7\x2d48e1\x2d8e0e\x2deba5f793c5ac.mount: Deactivated successfully.
Nov 23 05:04:41 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:41.730 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 05:04:41 localhost podman[240668]: time="2025-11-23T10:04:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 05:04:41 localhost podman[240668]: @ - - [23/Nov/2025:10:04:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159324 "" "Go-http-client/1.1"
Nov 23 05:04:41 localhost podman[240668]: @ - - [23/Nov/2025:10:04:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20212 "" "Go-http-client/1.1"
Nov 23 05:04:42 localhost nova_compute[281952]: 2025-11-23 10:04:42.327 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:04:42 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 05:04:42 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 05:04:42 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:42.655 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 23 05:04:43 localhost ovn_controller[154788]: 2025-11-23T10:04:43Z|00513|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 05:04:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e183 e183: 6 total, 6 up, 6 in
Nov 23 05:04:43 localhost nova_compute[281952]: 2025-11-23 10:04:43.079 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:04:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e183 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 05:04:44 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 05:04:44 localhost podman[332280]: 2025-11-23 10:04:44.629623721 +0000 UTC m=+0.058419451 container kill b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 23 05:04:44 localhost dnsmasq[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/addn_hosts - 0 addresses
Nov 23 05:04:44 localhost dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/host
Nov 23 05:04:44 localhost dnsmasq-dhcp[331714]: read /var/lib/neutron/dhcp/1afb3f8c-e188-49e0-b864-15c6a95b0fb6/opts
Nov 23 05:04:44 localhost ovn_controller[154788]: 2025-11-23T10:04:44Z|00514|binding|INFO|Releasing lport 8437b436-673f-4dcd-8ddd-08df6599f4ab from this chassis (sb_readonly=0)
Nov 23 05:04:44 localhost nova_compute[281952]: 2025-11-23 10:04:44.808 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:04:44 localhost kernel: device tap8437b436-67 left promiscuous mode
Nov 23 05:04:44 localhost ovn_controller[154788]: 2025-11-23T10:04:44Z|00515|binding|INFO|Setting lport 8437b436-673f-4dcd-8ddd-08df6599f4ab down in Southbound
Nov 23 05:04:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:44.828 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-1afb3f8c-e188-49e0-b864-15c6a95b0fb6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1afb3f8c-e188-49e0-b864-15c6a95b0fb6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e6f3ebd-dc7a-479e-ba09-dcee6cc4d506, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8437b436-673f-4dcd-8ddd-08df6599f4ab) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 05:04:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:44.831 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8437b436-673f-4dcd-8ddd-08df6599f4ab in datapath 1afb3f8c-e188-49e0-b864-15c6a95b0fb6 unbound from our chassis
Nov 23 05:04:44 localhost nova_compute[281952]: 2025-11-23 10:04:44.831 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 05:04:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:44.833 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1afb3f8c-e188-49e0-b864-15c6a95b0fb6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 23 05:04:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:44.834 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[619c4494-baa6-47c7-be0b-420429eed29f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 05:04:44 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:44.989 2 INFO neutron.agent.securitygroups_rpc [None req-f2af9951-cbc0-4ee3-8964-70390f4dbee5 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']
Nov 23 05:04:45 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e184 e184: 6 total, 6 up, 6 in
Nov 23 05:04:45 localhost podman[332321]: 2025-11-23 10:04:45.746147994 +0000 UTC m=+0.056703819 container kill b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 05:04:45 localhost dnsmasq[331714]: exiting on receipt of SIGTERM
Nov 23 05:04:45 localhost systemd[1]:
libpod-b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457.scope: Deactivated successfully. Nov 23 05:04:45 localhost podman[332335]: 2025-11-23 10:04:45.813629567 +0000 UTC m=+0.051940566 container died b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:04:45 localhost podman[332335]: 2025-11-23 10:04:45.847569959 +0000 UTC m=+0.085880958 container cleanup b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:04:45 localhost systemd[1]: libpod-conmon-b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457.scope: Deactivated successfully. 
Nov 23 05:04:45 localhost podman[332337]: 2025-11-23 10:04:45.898930076 +0000 UTC m=+0.128425919 container remove b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1afb3f8c-e188-49e0-b864-15c6a95b0fb6, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 05:04:45 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:45.926 263258 INFO neutron.agent.dhcp.agent [None req-293b986b-4bca-460f-b9c6-aacb82472f33 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 23 05:04:45 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:45.927 263258 INFO neutron.agent.dhcp.agent [None req-293b986b-4bca-460f-b9c6-aacb82472f33 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 23 05:04:45 localhost nova_compute[281952]: 2025-11-23 10:04:45.988 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:04:45 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:45.989 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 05:04:45 localhost ovn_controller[154788]: 2025-11-23T10:04:45Z|00516|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 05:04:45 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:45.991 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 23 05:04:46 localhost nova_compute[281952]: 2025-11-23 10:04:46.024 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:04:46 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e185 e185: 6 total, 6 up, 6 in
Nov 23 05:04:46 localhost dnsmasq[331054]: exiting on receipt of SIGTERM
Nov 23 05:04:46 localhost systemd[1]: libpod-2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72.scope: Deactivated successfully.
Nov 23 05:04:46 localhost podman[332382]: 2025-11-23 10:04:46.241523276 +0000 UTC m=+0.060298918 container kill 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 23 05:04:46 localhost podman[332395]: 2025-11-23 10:04:46.309844554 +0000 UTC m=+0.055837633 container died 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:04:46 localhost podman[332395]: 2025-11-23 10:04:46.33861463 +0000 UTC m=+0.084607679 container cleanup 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 05:04:46 localhost systemd[1]: libpod-conmon-2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72.scope: Deactivated successfully.
Nov 23 05:04:46 localhost podman[332397]: 2025-11-23 10:04:46.390735301 +0000 UTC m=+0.128272275 container remove 2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-556c5c2e-c414-4271-8e77-d61a599ccbad, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:04:46 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:46.678 263258 INFO neutron.agent.dhcp.agent [None req-2312486c-d399-4809-92a6-da17acbf6088 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 23 05:04:46 localhost systemd[1]: var-lib-containers-storage-overlay-4b86b4f9fba21239c4ef1c266e92b520c83208906cd248f3a235bb0ae25f61f0-merged.mount: Deactivated successfully.
Nov 23 05:04:46 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b6fbf29f47e75aa593b8addb7ac5f5c6f0c6738c4f83c962e1e01d91f7a44457-userdata-shm.mount: Deactivated successfully.
Nov 23 05:04:46 localhost systemd[1]: run-netns-qdhcp\x2d1afb3f8c\x2de188\x2d49e0\x2db864\x2d15c6a95b0fb6.mount: Deactivated successfully.
Nov 23 05:04:46 localhost systemd[1]: var-lib-containers-storage-overlay-ddc73133ee97a18b5d676ae71aebdcc91b4415d41e32db3c006d1af969d02b28-merged.mount: Deactivated successfully.
Nov 23 05:04:46 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2289ae0a758e0f7ac9fa4eabc1927dcc03742d973c1a58e931e5d1f39c8e1d72-userdata-shm.mount: Deactivated successfully.
Nov 23 05:04:46 localhost systemd[1]: run-netns-qdhcp\x2d556c5c2e\x2dc414\x2d4271\x2d8e77\x2dd61a599ccbad.mount: Deactivated successfully.
Nov 23 05:04:47 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e186 e186: 6 total, 6 up, 6 in
Nov 23 05:04:47 localhost podman[332441]: 2025-11-23 10:04:47.161113407 +0000 UTC m=+0.056097201 container kill 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:04:47 localhost dnsmasq[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/addn_hosts - 0 addresses
Nov 23 05:04:47 localhost systemd[1]: tmp-crun.vhYg3V.mount: Deactivated successfully.
Nov 23 05:04:47 localhost dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/host
Nov 23 05:04:47 localhost dnsmasq-dhcp[331198]: read /var/lib/neutron/dhcp/2fa353bd-02c5-4044-adba-b918030b206e/opts
Nov 23 05:04:47 localhost ovn_controller[154788]: 2025-11-23T10:04:47Z|00517|binding|INFO|Releasing lport 1ae8d8fe-5156-4993-b482-c4b01b921e85 from this chassis (sb_readonly=0)
Nov 23 05:04:47 localhost kernel: device tap1ae8d8fe-51 left promiscuous mode
Nov 23 05:04:47 localhost nova_compute[281952]: 2025-11-23 10:04:47.323 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:04:47 localhost ovn_controller[154788]: 2025-11-23T10:04:47Z|00518|binding|INFO|Setting lport 1ae8d8fe-5156-4993-b482-c4b01b921e85 down in Southbound
Nov 23 05:04:47 localhost nova_compute[281952]: 2025-11-23 10:04:47.331 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:04:47 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:47.334 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-2fa353bd-02c5-4044-adba-b918030b206e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fa353bd-02c5-4044-adba-b918030b206e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cd27ceae55c44d478998092e7554fd8a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14e80905-5b14-4ffd-9b3d-73c1c6bb812f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1ae8d8fe-5156-4993-b482-c4b01b921e85) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 05:04:47 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:47.336 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 1ae8d8fe-5156-4993-b482-c4b01b921e85 in datapath 2fa353bd-02c5-4044-adba-b918030b206e unbound from our chassis#033[00m
Nov 23 05:04:47 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:47.337 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2fa353bd-02c5-4044-adba-b918030b206e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 23 05:04:47 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:47.338 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[08e29a15-c222-4202-b15c-d7acbc4ff542]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 05:04:47 localhost nova_compute[281952]: 2025-11-23 10:04:47.345 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:04:47 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 05:04:47 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3161485816' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 23 05:04:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e187 e187: 6 total, 6 up, 6 in
Nov 23 05:04:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 05:04:48 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3161485816' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 23 05:04:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 05:04:48 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:48.561 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 23 05:04:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 05:04:49 localhost systemd[1]: tmp-crun.9iONME.mount: Deactivated successfully.
Nov 23 05:04:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 05:04:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 05:04:49 localhost podman[332465]: 2025-11-23 10:04:49.039350005 +0000 UTC m=+0.093812726 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, architecture=x86_64, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, release=1755695350)
Nov 23 05:04:49 localhost podman[332499]: 2025-11-23 10:04:49.126261923 +0000 UTC m=+0.067928117 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 05:04:49 localhost dnsmasq[331198]: exiting on receipt of SIGTERM
Nov 23 05:04:49 localhost podman[332502]: 2025-11-23 10:04:49.170007771 +0000 UTC m=+0.105594302 container kill 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 05:04:49 localhost systemd[1]: libpod-85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614.scope: Deactivated successfully.
Nov 23 05:04:49 localhost podman[332465]: 2025-11-23 10:04:49.177729514 +0000 UTC m=+0.232192195 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, vcs-type=git, architecture=x86_64, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 23 05:04:49 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully.
Nov 23 05:04:49 localhost podman[332499]: 2025-11-23 10:04:49.21246537 +0000 UTC m=+0.154131554 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 05:04:49 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully.
Nov 23 05:04:49 localhost podman[332543]: 2025-11-23 10:04:49.246117444 +0000 UTC m=+0.055330407 container died 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 05:04:49 localhost podman[332543]: 2025-11-23 10:04:49.281590663 +0000 UTC m=+0.090803606 container remove 85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fa353bd-02c5-4044-adba-b918030b206e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 05:04:49 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:49.312 263258 INFO neutron.agent.dhcp.agent [None req-e41a1c8b-d447-4a66-8a16-b3359df5dcf5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 23 05:04:49 localhost podman[332501]: 2025-11-23 10:04:49.346065145 +0000 UTC m=+0.281137059 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 05:04:49 localhost systemd[1]: libpod-conmon-85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614.scope: Deactivated successfully.
Nov 23 05:04:49 localhost podman[332501]: 2025-11-23 10:04:49.417260519 +0000 UTC m=+0.352332433 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 05:04:49 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:49.418 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:04:49 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:04:49 localhost ovn_controller[154788]: 2025-11-23T10:04:49Z|00519|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:04:49 localhost nova_compute[281952]: 2025-11-23 10:04:49.763 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:50 localhost systemd[1]: var-lib-containers-storage-overlay-cf0b02e359ac1721f19ba2c6fc86978494592aa5ddd27403653b646396ae9cc9-merged.mount: Deactivated successfully. Nov 23 05:04:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-85a5c48c1c4d2f3a2e05e3bc035a662ccee1e7369b5cf65e2c1f73f7f4c43614-userdata-shm.mount: Deactivated successfully. Nov 23 05:04:50 localhost systemd[1]: run-netns-qdhcp\x2d2fa353bd\x2d02c5\x2d4044\x2dadba\x2db918030b206e.mount: Deactivated successfully. Nov 23 05:04:51 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:51.249 2 INFO neutron.agent.securitygroups_rpc [None req-9f10832a-94d7-4e63-bc11-33234c92ec82 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:04:51 localhost sshd[332585]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:04:52 localhost nova_compute[281952]: 2025-11-23 10:04:52.374 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:52 localhost nova_compute[281952]: 2025-11-23 10:04:52.375 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e188 e188: 6 total, 6 up, 6 in Nov 23 05:04:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 
348127232 kv_alloc: 318767104 Nov 23 05:04:53 localhost ovn_controller[154788]: 2025-11-23T10:04:53Z|00520|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:04:53 localhost nova_compute[281952]: 2025-11-23 10:04:53.976 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:55 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:55.993 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 05:04:56 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:56.159 263258 INFO neutron.agent.linux.ip_lib [None req-722cfbd4-d097-45d7-9db4-019a7a1127ef - - - - - -] Device tapb2609925-51 cannot be used as it has no MAC address#033[00m Nov 23 05:04:56 localhost nova_compute[281952]: 2025-11-23 10:04:56.183 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:56 localhost kernel: device tapb2609925-51 entered promiscuous mode Nov 23 05:04:56 localhost NetworkManager[5975]: [1763892296.1956] manager: (tapb2609925-51): new Generic device (/org/freedesktop/NetworkManager/Devices/83) Nov 23 05:04:56 localhost systemd-udevd[332597]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:04:56 localhost ovn_controller[154788]: 2025-11-23T10:04:56Z|00521|binding|INFO|Claiming lport b2609925-5134-4a13-ba79-45e02839b8f7 for this chassis. 
Nov 23 05:04:56 localhost ovn_controller[154788]: 2025-11-23T10:04:56Z|00522|binding|INFO|b2609925-5134-4a13-ba79-45e02839b8f7: Claiming unknown Nov 23 05:04:56 localhost nova_compute[281952]: 2025-11-23 10:04:56.235 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:56 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:56.251 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba426e81cfe149da986575955289d04b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e76005b-d8d7-445f-b11a-34d1e82ffc8b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b2609925-5134-4a13-ba79-45e02839b8f7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:04:56 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:56.252 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b2609925-5134-4a13-ba79-45e02839b8f7 in datapath 
7b3a7ba3-e63c-4e55-a6fc-444dc25aaece bound to our chassis#033[00m Nov 23 05:04:56 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:56.253 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7b3a7ba3-e63c-4e55-a6fc-444dc25aaece or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:04:56 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:56.254 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[7695fb0d-b711-46f1-ae6b-018c6507e391]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:04:56 localhost ovn_controller[154788]: 2025-11-23T10:04:56Z|00523|binding|INFO|Setting lport b2609925-5134-4a13-ba79-45e02839b8f7 ovn-installed in OVS Nov 23 05:04:56 localhost ovn_controller[154788]: 2025-11-23T10:04:56Z|00524|binding|INFO|Setting lport b2609925-5134-4a13-ba79-45e02839b8f7 up in Southbound Nov 23 05:04:56 localhost nova_compute[281952]: 2025-11-23 10:04:56.282 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:56 localhost nova_compute[281952]: 2025-11-23 10:04:56.315 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:56 localhost nova_compute[281952]: 2025-11-23 10:04:56.342 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:56 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:56.565 2 INFO neutron.agent.securitygroups_rpc [None req-b2e8c328-efaf-49d2-9816-397d8d6e979c 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['213f9d65-3629-4053-acee-7e99a128b417']#033[00m 
Nov 23 05:04:57 localhost podman[332652]: Nov 23 05:04:57 localhost podman[332652]: 2025-11-23 10:04:57.206565867 +0000 UTC m=+0.086242129 container create 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:04:57 localhost systemd[1]: Started libpod-conmon-2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf.scope. Nov 23 05:04:57 localhost systemd[1]: Started libcrun container. Nov 23 05:04:57 localhost podman[332652]: 2025-11-23 10:04:57.163808539 +0000 UTC m=+0.043484801 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:04:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/deefc43655ec5efae4438c1bba8af5217f7f3c949c030f1c1d0f69134d512516/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:04:57 localhost podman[332652]: 2025-11-23 10:04:57.273651528 +0000 UTC m=+0.153327780 container init 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 
05:04:57 localhost podman[332652]: 2025-11-23 10:04:57.284149414 +0000 UTC m=+0.163825666 container start 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 23 05:04:57 localhost dnsmasq[332670]: started, version 2.85 cachesize 150 Nov 23 05:04:57 localhost dnsmasq[332670]: DNS service limited to local subnets Nov 23 05:04:57 localhost dnsmasq[332670]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:04:57 localhost dnsmasq[332670]: warning: no upstream servers configured Nov 23 05:04:57 localhost dnsmasq-dhcp[332670]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:04:57 localhost dnsmasq[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/addn_hosts - 0 addresses Nov 23 05:04:57 localhost dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/host Nov 23 05:04:57 localhost dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/opts Nov 23 05:04:57 localhost nova_compute[281952]: 2025-11-23 10:04:57.426 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:57 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:04:57.483 263258 INFO neutron.agent.dhcp.agent [None req-3f05d6d1-f56a-4bd9-9f0f-33675e6c048e - - - - - -] DHCP configuration for ports 
{'d0dc5460-4915-4418-9406-8337e0482cf3'} is completed#033[00m Nov 23 05:04:57 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:57.775 2 INFO neutron.agent.securitygroups_rpc [None req-d7ea7df2-10a0-4360-bfff-447d012be880 6f11688a49fb4deba83327b1cf6539b4 02d402d01a514bbd8ec5543d8bb9b97c - - default default] Security group rule updated ['76c5df30-fcbd-4316-84a0-0d549c3af78d']#033[00m Nov 23 05:04:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:04:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 05:04:58 localhost podman[332671]: 2025-11-23 10:04:58.034243089 +0000 UTC m=+0.090432845 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 23 05:04:58 localhost podman[332671]: 2025-11-23 10:04:58.050372325 +0000 UTC m=+0.106562071 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, container_name=multipathd) Nov 23 05:04:58 localhost podman[332672]: 2025-11-23 10:04:58.007980678 +0000 UTC m=+0.062771471 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 05:04:58 localhost systemd[1]: 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 05:04:58 localhost podman[332672]: 2025-11-23 10:04:58.09234853 +0000 UTC m=+0.147139303 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 05:04:58 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 05:04:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:04:58 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:58.586 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:d7:84 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5143a1d3-f63b-452c-a57f-85c07d0974c0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d5a5bf10-b94f-4270-9b8b-f5b33fff78ea) old=Port_Binding(mac=['fa:16:3e:bb:d7:84 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:04:58 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:58.588 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d5a5bf10-b94f-4270-9b8b-f5b33fff78ea in datapath 76e6f4ab-630a-4c73-a560-1e6a5fffbdbe updated#033[00m Nov 23 05:04:58 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:58.591 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76e6f4ab-630a-4c73-a560-1e6a5fffbdbe, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:04:58 localhost ovn_metadata_agent[160434]: 2025-11-23 10:04:58.592 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[07c62593-333d-4f21-bb26-28acd215318a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:04:58 localhost nova_compute[281952]: 2025-11-23 10:04:58.888 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:04:59 localhost neutron_sriov_agent[256124]: 2025-11-23 10:04:59.310 2 INFO neutron.agent.securitygroups_rpc [None req-2900c02d-4bae-4668-a3d0-a31f6942bf81 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['213f9d65-3629-4053-acee-7e99a128b417', '05c9de82-0c74-49cb-8524-43dd3dd47f37']#033[00m Nov 23 05:04:59 localhost openstack_network_exporter[242668]: ERROR 10:04:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:04:59 localhost openstack_network_exporter[242668]: ERROR 10:04:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:04:59 
localhost openstack_network_exporter[242668]: ERROR 10:04:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:04:59 localhost openstack_network_exporter[242668]: ERROR 10:04:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:04:59 localhost openstack_network_exporter[242668]: Nov 23 05:04:59 localhost openstack_network_exporter[242668]: ERROR 10:04:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:04:59 localhost openstack_network_exporter[242668]: Nov 23 05:05:00 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:00.246 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:59Z, description=, device_id=e37909af-961f-4dfd-8d68-199ed54a6cf8, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=29773ea2-bb98-402d-9f2a-492d309a53db, ip_allocation=immediate, mac_address=fa:16:3e:71:7a:d7, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:54Z, description=, dns_domain=, id=7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-2072411749-network, port_security_enabled=True, project_id=ba426e81cfe149da986575955289d04b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42798, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2635, status=ACTIVE, subnets=['82fc8536-8876-439c-9d2b-ecbdd502ad8a'], tags=[], tenant_id=ba426e81cfe149da986575955289d04b, updated_at=2025-11-23T10:04:55Z, vlan_transparent=None, 
network_id=7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, port_security_enabled=False, project_id=ba426e81cfe149da986575955289d04b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2673, status=DOWN, tags=[], tenant_id=ba426e81cfe149da986575955289d04b, updated_at=2025-11-23T10:04:59Z on network 7b3a7ba3-e63c-4e55-a6fc-444dc25aaece#033[00m Nov 23 05:05:00 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:00.283 2 INFO neutron.agent.securitygroups_rpc [None req-31700e8d-00a6-42b0-834e-1388eab5f28c 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['05c9de82-0c74-49cb-8524-43dd3dd47f37']#033[00m Nov 23 05:05:00 localhost systemd[1]: tmp-crun.MN209f.mount: Deactivated successfully. Nov 23 05:05:00 localhost dnsmasq[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/addn_hosts - 1 addresses Nov 23 05:05:00 localhost dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/host Nov 23 05:05:00 localhost dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/opts Nov 23 05:05:00 localhost podman[332731]: 2025-11-23 10:05:00.449721361 +0000 UTC m=+0.047567243 container kill 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2) Nov 23 05:05:00 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:00.648 263258 INFO neutron.agent.dhcp.agent [None 
req-edf68da7-5c23-4761-8233-c8fbc7f02102 - - - - - -] DHCP configuration for ports {'29773ea2-bb98-402d-9f2a-492d309a53db'} is completed#033[00m Nov 23 05:05:01 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:01.821 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:04:59Z, description=, device_id=e37909af-961f-4dfd-8d68-199ed54a6cf8, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=29773ea2-bb98-402d-9f2a-492d309a53db, ip_allocation=immediate, mac_address=fa:16:3e:71:7a:d7, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:04:54Z, description=, dns_domain=, id=7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-2072411749-network, port_security_enabled=True, project_id=ba426e81cfe149da986575955289d04b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42798, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2635, status=ACTIVE, subnets=['82fc8536-8876-439c-9d2b-ecbdd502ad8a'], tags=[], tenant_id=ba426e81cfe149da986575955289d04b, updated_at=2025-11-23T10:04:55Z, vlan_transparent=None, network_id=7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, port_security_enabled=False, project_id=ba426e81cfe149da986575955289d04b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2673, status=DOWN, tags=[], tenant_id=ba426e81cfe149da986575955289d04b, updated_at=2025-11-23T10:04:59Z on network 7b3a7ba3-e63c-4e55-a6fc-444dc25aaece#033[00m Nov 23 05:05:02 localhost dnsmasq[332670]: read 
/var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/addn_hosts - 1 addresses Nov 23 05:05:02 localhost dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/host Nov 23 05:05:02 localhost dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/opts Nov 23 05:05:02 localhost podman[332769]: 2025-11-23 10:05:02.041001446 +0000 UTC m=+0.056225605 container kill 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 23 05:05:02 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:02.310 263258 INFO neutron.agent.dhcp.agent [None req-55b5b2f1-1ff1-4f38-8fd8-1388baa78d74 - - - - - -] DHCP configuration for ports {'29773ea2-bb98-402d-9f2a-492d309a53db'} is completed#033[00m Nov 23 05:05:02 localhost nova_compute[281952]: 2025-11-23 10:05:02.472 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:02 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:02.807 2 INFO neutron.agent.securitygroups_rpc [None req-2a25adbb-406f-4488-9290-d86f8fa25b90 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['83b9eb37-6d54-417f-b8aa-c3bd6525a15a']#033[00m Nov 23 05:05:03 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:03.116 2 INFO neutron.agent.securitygroups_rpc [None req-48af5a2d-b1ea-400e-a467-c239dff497de 131430ad8ac646268fcbca6e470c2ccc 
e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['83b9eb37-6d54-417f-b8aa-c3bd6525a15a']#033[00m Nov 23 05:05:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:05:04 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:04.276 2 INFO neutron.agent.securitygroups_rpc [None req-068f4094-be4e-499c-ac72-326e6af4f870 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']#033[00m Nov 23 05:05:04 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:04.542 2 INFO neutron.agent.securitygroups_rpc [None req-5e059f36-e08b-42c4-9c06-8e7d2c8a7a35 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']#033[00m Nov 23 05:05:04 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:04.743 2 INFO neutron.agent.securitygroups_rpc [None req-4232445f-3fdb-4ab4-af76-a3c636901057 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']#033[00m Nov 23 05:05:04 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:04.978 2 INFO neutron.agent.securitygroups_rpc [None req-bdf55031-0050-45a1-bc2b-6230ff544fa3 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']#033[00m Nov 23 05:05:05 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:05.151 2 INFO neutron.agent.securitygroups_rpc [None req-efdb9984-7352-4eb4-bfb4-32226524bf47 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']#033[00m Nov 23 05:05:05 localhost 
neutron_sriov_agent[256124]: 2025-11-23 10:05:05.488 2 INFO neutron.agent.securitygroups_rpc [None req-2d5ed31f-c175-4e95-8ecb-2dfb6c38fae5 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']#033[00m Nov 23 05:05:05 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:05.546 2 INFO neutron.agent.securitygroups_rpc [None req-f12b045a-e1e3-435c-8190-d496fdcf5f2d 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['d1cc26af-765b-45fa-b447-8d13d7399069']#033[00m Nov 23 05:05:06 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:06.329 2 INFO neutron.agent.securitygroups_rpc [None req-b7a801b4-8c85-482c-af67-8b6642b94666 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']#033[00m Nov 23 05:05:06 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e189 e189: 6 total, 6 up, 6 in Nov 23 05:05:06 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:06.867 2 INFO neutron.agent.securitygroups_rpc [None req-0eca1fa9-2850-4efd-aa96-731301b95192 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']#033[00m Nov 23 05:05:07 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:07.207 2 INFO neutron.agent.securitygroups_rpc [None req-b76074b6-0a53-43ed-86f9-a06ad1dd7bfb 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']#033[00m Nov 23 05:05:07 localhost nova_compute[281952]: 2025-11-23 10:05:07.508 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:05:07 localhost nova_compute[281952]: 2025-11-23 
10:05:07.510 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:07 localhost nova_compute[281952]: 2025-11-23 10:05:07.510 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:05:07 localhost nova_compute[281952]: 2025-11-23 10:05:07.510 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:05:07 localhost nova_compute[281952]: 2025-11-23 10:05:07.511 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:05:07 localhost nova_compute[281952]: 2025-11-23 10:05:07.512 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:07 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:07.585 2 INFO neutron.agent.securitygroups_rpc [None req-5025f2b5-3c8e-4a04-9825-a17ba040ab51 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['edf15f56-c99c-43ec-b56f-1cd94f59b2c7']#033[00m Nov 23 05:05:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:07.789 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:d7:84 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 
10.100.0.34/28', 'neutron:device_id': 'ovnmeta-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5143a1d3-f63b-452c-a57f-85c07d0974c0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d5a5bf10-b94f-4270-9b8b-f5b33fff78ea) old=Port_Binding(mac=['fa:16:3e:bb:d7:84 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76e6f4ab-630a-4c73-a560-1e6a5fffbdbe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6eb850a1541d4942b249428ef6092e5e', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:05:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:07.791 160439 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d5a5bf10-b94f-4270-9b8b-f5b33fff78ea in datapath 76e6f4ab-630a-4c73-a560-1e6a5fffbdbe updated#033[00m Nov 23 05:05:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:07.793 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76e6f4ab-630a-4c73-a560-1e6a5fffbdbe, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:05:07 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:07.794 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c2d7a8f6-9a5c-4947-b3b3-c22484e248a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:05:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:05:08 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:08.555 2 INFO neutron.agent.securitygroups_rpc [None req-b418740f-9100-4b44-8438-3f54c0de85da 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['ec91c804-f6c3-4a65-9ba5-93d7c528c909']#033[00m Nov 23 05:05:08 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:08.603 2 INFO neutron.agent.securitygroups_rpc [None req-d6be62a7-1074-4184-82e5-b6c7eb9c713f 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['d1cc26af-765b-45fa-b447-8d13d7399069', 'b62406ca-1ad0-471f-83b3-a7b86cb40552', '5041c083-f562-4221-8bac-acacd7a21e13']#033[00m Nov 23 05:05:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:05:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:05:08 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:08.997 263258 INFO neutron.agent.linux.ip_lib [None req-cd5cdb69-c35a-49c1-ac10-1c8ed85d09c5 - - - - - -] Device tapea48030b-3e cannot be used as it has no MAC address#033[00m Nov 23 05:05:09 localhost systemd[1]: tmp-crun.zTdsre.mount: Deactivated successfully. 
Nov 23 05:05:09 localhost podman[332791]: 2025-11-23 10:05:09.014111269 +0000 UTC m=+0.090818117 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 05:05:09 localhost podman[332791]: 2025-11-23 10:05:09.023620565 +0000 UTC m=+0.100327443 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 05:05:09 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:05:09 localhost podman[332792]: 2025-11-23 10:05:09.075848089 +0000 UTC m=+0.144856035 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:05:09 localhost nova_compute[281952]: 2025-11-23 10:05:09.077 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:09 localhost kernel: device tapea48030b-3e entered promiscuous mode Nov 23 05:05:09 localhost ovn_controller[154788]: 2025-11-23T10:05:09Z|00525|binding|INFO|Claiming lport ea48030b-3e14-4f0f-ad85-6ef79695a3f3 for this chassis. Nov 23 05:05:09 localhost ovn_controller[154788]: 2025-11-23T10:05:09Z|00526|binding|INFO|ea48030b-3e14-4f0f-ad85-6ef79695a3f3: Claiming unknown Nov 23 05:05:09 localhost NetworkManager[5975]: [1763892309.0855] manager: (tapea48030b-3e): new Generic device (/org/freedesktop/NetworkManager/Devices/84) Nov 23 05:05:09 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:09.088 2 INFO neutron.agent.securitygroups_rpc [None req-7eb231ba-0d20-4837-8fea-3273f5df7e61 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['b62406ca-1ad0-471f-83b3-a7b86cb40552', '5041c083-f562-4221-8bac-acacd7a21e13']#033[00m Nov 23 05:05:09 localhost podman[332792]: 2025-11-23 10:05:09.087588062 +0000 UTC m=+0.156596008 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true) Nov 23 05:05:09 localhost systemd-udevd[332837]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:05:09 localhost nova_compute[281952]: 2025-11-23 10:05:09.091 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:09.096 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '472899094c04472c806243e76f122a0f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c4b884e9-74a6-4227-b3c7-e09f7c6545b9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ea48030b-3e14-4f0f-ad85-6ef79695a3f3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:05:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:09.098 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ea48030b-3e14-4f0f-ad85-6ef79695a3f3 in datapath e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a bound to our chassis#033[00m Nov 23 05:05:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:09.100 160439 DEBUG 
neutron.agent.ovn.metadata.agent [-] Port 545ac66f-28c5-4366-9ba8-8d6a25ddbb6f IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:05:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:09.101 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:05:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:09.101 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[33dcdbfe-21f8-4bbd-ba42-69e916271f87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:05:09 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 05:05:09 localhost ovn_controller[154788]: 2025-11-23T10:05:09Z|00527|binding|INFO|Setting lport ea48030b-3e14-4f0f-ad85-6ef79695a3f3 ovn-installed in OVS Nov 23 05:05:09 localhost ovn_controller[154788]: 2025-11-23T10:05:09Z|00528|binding|INFO|Setting lport ea48030b-3e14-4f0f-ad85-6ef79695a3f3 up in Southbound Nov 23 05:05:09 localhost nova_compute[281952]: 2025-11-23 10:05:09.115 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:09 localhost journal[230249]: ethtool ioctl error on tapea48030b-3e: No such device Nov 23 05:05:09 localhost journal[230249]: ethtool ioctl error on tapea48030b-3e: No such device Nov 23 05:05:09 localhost journal[230249]: ethtool ioctl error on tapea48030b-3e: No such device Nov 23 05:05:09 localhost journal[230249]: ethtool ioctl error on tapea48030b-3e: No such device Nov 23 05:05:09 localhost journal[230249]: ethtool ioctl error on tapea48030b-3e: No 
such device Nov 23 05:05:09 localhost journal[230249]: ethtool ioctl error on tapea48030b-3e: No such device Nov 23 05:05:09 localhost journal[230249]: ethtool ioctl error on tapea48030b-3e: No such device Nov 23 05:05:09 localhost journal[230249]: ethtool ioctl error on tapea48030b-3e: No such device Nov 23 05:05:09 localhost nova_compute[281952]: 2025-11-23 10:05:09.151 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:09 localhost nova_compute[281952]: 2025-11-23 10:05:09.177 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:09.303 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:05:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:09.304 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:05:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:09.306 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:05:10 localhost podman[332910]: Nov 23 05:05:10 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:10.055 2 INFO neutron.agent.securitygroups_rpc [None req-dc86cddf-8096-4f55-8994-fc155127f219 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 
- - default default] Security group rule updated ['5975f3c0-fffa-4893-9c5f-a50728456ba3']#033[00m Nov 23 05:05:10 localhost podman[332910]: 2025-11-23 10:05:10.059277363 +0000 UTC m=+0.091788836 container create dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:05:10 localhost podman[332910]: 2025-11-23 10:05:10.015587797 +0000 UTC m=+0.048099300 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:05:10 localhost nova_compute[281952]: 2025-11-23 10:05:10.142 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:10 localhost systemd[1]: Started libpod-conmon-dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f.scope. Nov 23 05:05:10 localhost systemd[1]: tmp-crun.UEmnac.mount: Deactivated successfully. Nov 23 05:05:10 localhost systemd[1]: Started libcrun container. 
Nov 23 05:05:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c01c854a331124ebc60335ecf0975a3084a9f216dced94c1e2f7c47b6d63d45/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:05:10 localhost podman[332910]: 2025-11-23 10:05:10.196794255 +0000 UTC m=+0.229305688 container init dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:05:10 localhost podman[332910]: 2025-11-23 10:05:10.206423065 +0000 UTC m=+0.238934508 container start dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 23 05:05:10 localhost dnsmasq[332928]: started, version 2.85 cachesize 150 Nov 23 05:05:10 localhost dnsmasq[332928]: DNS service limited to local subnets Nov 23 05:05:10 localhost dnsmasq[332928]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:05:10 localhost dnsmasq[332928]: warning: no upstream servers 
configured Nov 23 05:05:10 localhost dnsmasq-dhcp[332928]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:05:10 localhost dnsmasq[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/addn_hosts - 0 addresses Nov 23 05:05:10 localhost dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/host Nov 23 05:05:10 localhost dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/opts Nov 23 05:05:10 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:10.353 263258 INFO neutron.agent.dhcp.agent [None req-4cdcfaab-9ae5-4d66-ae48-3a70e3221ed3 - - - - - -] DHCP configuration for ports {'1e0fd39a-d46f-4ee1-a861-10a0e76d26c2'} is completed#033[00m Nov 23 05:05:10 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:10.394 2 INFO neutron.agent.securitygroups_rpc [None req-6f895a1f-dcef-4309-9299-e6c0da113106 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['5975f3c0-fffa-4893-9c5f-a50728456ba3']#033[00m Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.809 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.810 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.840 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.841 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30e318a7-c8cd-48b0-a2e9-22580d2cb513', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.810605', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e59a21f0-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': '8e4916ee45204aa83830e5347fc65add68b211b43456cd6d623acc5a1cea4fed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.810605', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e59a34a6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': 'd3f46f8669f602fb6facec1b7b74986064d4fa0c879170bac3cb277def027e90'}]}, 'timestamp': '2025-11-23 10:05:10.841680', '_unique_id': 'f7740006a9fa48a09832577ff3102c7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.843 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.844 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.844 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.845 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b486a1bf-4393-4619-9d4c-bbd3cf593875', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.844551', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e59ab5e8-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': '8cfb74ed7113bacfe5692134079bfb81b40cdb2b5ad869c292e6e69e56d4fcc0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.844551', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e59aca06-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': 'd0d21f175a127e6f180b8bf52e3a9a7f0074c3852d93b22742e6f877aaed5d84'}]}, 'timestamp': '2025-11-23 10:05:10.845496', '_unique_id': '345ea9358e194c8ebc2c7d66e996a5d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.846 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.847 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.847 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.848 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd497f729-cf12-4389-a025-c74a6eaa2674', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.847771', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e59b355e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': 'd765b8d41424bd32b1caa2713dbe2378c8845ae46bfdf8a6bd71591b39cda49f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 
'timestamp': '2025-11-23T10:05:10.847771', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e59b45da-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': 'ffed5527062941412b8804800b9e588742392ff27782a261b852c207dc758654'}]}, 'timestamp': '2025-11-23 10:05:10.848665', '_unique_id': '5277f760838c4c30a0ea3fcda88a4eb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.849 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.850 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.862 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.863 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '815b8162-15a3-4178-b9b9-48069b720baf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.850919', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e59d7904-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.028558432, 'message_signature': '53828373298b08615ae89c21efd1a51a108be8f6451299beb347a87133e01d91'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.850919', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e59d8af2-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.028558432, 'message_signature': '41c93ae05624d78e2f56eb60ef41abb9a0b63f5071ee914a0297baa290cf3194'}]}, 'timestamp': '2025-11-23 10:05:10.863536', '_unique_id': '85afb868c8584e419fdb1f6008a1310f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.864 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.865 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b52ae5c3-0d12-453d-9fa7-9df382eae2b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.865801', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e59e7020-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': '635ea59849ac537a2d512cc0ef5e7aa98bc8370787ccd8dceed7028bcd5f9bb7'}]}, 'timestamp': '2025-11-23 10:05:10.869438', '_unique_id': '11a749c872b244e29467858ad59e1752'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.870 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.872 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.872 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76066f1b-5986-4ad8-93a7-bd81c08d44ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.872293', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e59ef1e4-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': 'e48c472844f980739305b21fa2a8be2fc8d81c39756de029618958104b9fd2db'}]}, 'timestamp': '2025-11-23 10:05:10.872755', '_unique_id': 'f4761d0f6b3b44e18cb39a637c390c6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:05:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.873 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.874 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.874 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.875 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '907ce259-1c3e-4008-b56c-e9a72406f153', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.874934', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e59f58dc-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': '5e3da032c6790ef716b267fb5373fb78562358ac3f015fdd778b9315568a6e5b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.874934', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e59f6ade-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': '8e53b4340c02f10917cc9a00014256ff844a182c555747e29af32a3a487a057c'}]}, 'timestamp': '2025-11-23 10:05:10.875825', '_unique_id': 'ee124783ac9847c484ba2a2c3e99f64e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.876 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.877 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.878 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.878 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1733a183-22c6-4786-923d-2bdb572a2011', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.878009', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e59fd226-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.028558432, 'message_signature': '5a186de4c51c80447379ac433e4defd6aeb298c0c2694843cd6acbd69108b0f3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.878009', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e59fe522-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.028558432, 'message_signature': '55aadd9d516e13323e3c17822088baa4e9db7610b942debd40b9260f71803b46'}]}, 'timestamp': '2025-11-23 10:05:10.878988', '_unique_id': '1002f5924bfc46feb8ef55eb6d91afa2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.879 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.881 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.881 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5acf63cb-808d-4613-8a79-62129796f565', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.881183', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a04d5a-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': '6028cfad1aaf2276c4a80a23fa7de7b753715605fa5b710cf28f9f4559b24827'}]}, 'timestamp': '2025-11-23 10:05:10.881650', '_unique_id': 'c5fdcc54b3a240fb9c55ef4c447f9873'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:05:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.882 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.883 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfd2b5e2-f654-4971-8532-acfd334c5eae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.883820', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a0b52e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': '783477a4d46bfa91949cb5563812cf7ba743427df333961db9b4a341a8b77892'}]}, 'timestamp': '2025-11-23 10:05:10.884304', '_unique_id': '6cdf89c0cccd474dbbfc0dafab9ceeb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:05:10.885 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:05:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.885 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.886 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.886 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c4109ca3-55c4-415d-96a1-76524af328fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.886509', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5a11cc6-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': 'c53f1cfb2c9c6a4fc65216f3430325c0f2157dc4b059f65e47a6b5455569e231'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.886509', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5a12e3c-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': 'eeed5ae09b790780c86a0fdfe3f22cb5e1a947dfa27cbbe3e23fab94f3fcda9f'}]}, 'timestamp': '2025-11-23 10:05:10.887377', '_unique_id': 'f1ded88487004b37800e70f514c90965'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.888 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.889 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 23 05:05:10 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:10.898 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:05:10Z, description=, device_id=da40d6c5-a255-43a6-9fbf-e2238a7bac71, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d3e0991f-9369-457d-bbc5-2f4ff6aa5b2e, ip_allocation=immediate, mac_address=fa:16:3e:95:5c:c7, name=, 
network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:05:06Z, description=, dns_domain=, id=e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1312470054-network, port_security_enabled=True, project_id=472899094c04472c806243e76f122a0f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11483, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2738, status=ACTIVE, subnets=['ca34f827-438d-47f2-925f-0d6f76807026'], tags=[], tenant_id=472899094c04472c806243e76f122a0f, updated_at=2025-11-23T10:05:07Z, vlan_transparent=None, network_id=e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, port_security_enabled=False, project_id=472899094c04472c806243e76f122a0f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2758, status=DOWN, tags=[], tenant_id=472899094c04472c806243e76f122a0f, updated_at=2025-11-23T10:05:10Z on network e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a#033[00m Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.907 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 17880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '96487d20-24b4-4e20-a458-3f6e75513dc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17880000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:05:10.889553', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e5a46016-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.085355213, 'message_signature': 'c7ac373053712cd9c881ad1864f9ce2a45e0f3227c247bac4e5a99ad17478213'}]}, 'timestamp': '2025-11-23 10:05:10.908333', '_unique_id': '324a4cb4f28c42e6860b08d6d78015a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 
05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.909 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 23 05:05:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.911 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f77bca34-9131-44d3-8660-9492fe6f1fc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.911148', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a4e0b8-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 
'message_signature': '81a2f2c8e5af16473a36059e0132272857365929619ad093989eccd5973ed07b'}]}, 'timestamp': '2025-11-23 10:05:10.911639', '_unique_id': '9f7f89b68e494a52a0f6fe112a5746f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.912 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.913 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.913 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b297c1d-8569-4d40-963e-a328e0cb95db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.913730', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a5454e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': 'd80b2a004fc439eb7323471bbbfea8746dc4f14283e7152884e9046cae44d513'}]}, 'timestamp': '2025-11-23 10:05:10.914247', '_unique_id': '2bbe8d04bf6040ec9412f03abf6ca941'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.915 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.916 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.916 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68e4c23c-5ec5-447e-aa4a-d9e0411ce1d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:05:10.916486', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'e5a5b150-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.085355213, 'message_signature': 'abe35384c83561b87efb893158f5fa6f6c3954c7eb50146ba241b5f66b8d7c45'}]}, 'timestamp': '2025-11-23 10:05:10.916994', '_unique_id': '5b6031355864442081664de2919a573f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.917 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.919 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e233b881-d6b4-4e0d-be44-449461b117f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.919278', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5a61ce4-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.028558432, 'message_signature': '52ce43ca2012de31c02264bf8ba9e9ef3b12d8a0befe54dd328428cb053d8c1a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.919278', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5a62e1e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.028558432, 'message_signature': 'e8142501eb5c0dc1ecad5a04af7ffdd4a32a7213ef9495dca755fe1727873854'}]}, 'timestamp': '2025-11-23 10:05:10.920145', '_unique_id': '1fd719e9277f4c308d926ed40a51761b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     yield
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:05:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:05:10.921 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:05:10.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.921 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.922 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.922 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.922 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.922 12 DEBUG 
ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35a1ca82-4183-4115-bd57-790c0efcfb82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:05:10.922453', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e5a6994e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': '94564295510375d450d797d8842800642a1d6b95e987250a303991f8e4b2fbf7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 
'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:05:10.922453', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e5a6ab6e-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12253.988229197, 'message_signature': 'e46265e0ea4c8d8efa42e969fa6320b27557e6903769d9b6e62e063d9ebbea48'}]}, 'timestamp': '2025-11-23 10:05:10.923351', '_unique_id': '2745ad5979c2445b810a05a74afc8d4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:05:10.924 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.924 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 23 05:05:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f2f80e2-5f2c-4608-a62b-96d0c143d022', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.925813', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a71eb4-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 
'message_signature': '85d8059cc3984fe45d4063a8029983559a7d42d738a29cda85ebeccf431019ef'}]}, 'timestamp': '2025-11-23 10:05:10.926396', '_unique_id': 'b9a7e5b522ed45f8ae52a2308cf20aef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:05:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.927 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.928 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd12baa51-a95a-4d09-8523-73e08b16aba2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.928261', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a77986-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': '33311baa2b38ddea0d59ed3c3d4f062efa19bdcc0e731e52fba88820b6d930c4'}]}, 'timestamp': '2025-11-23 10:05:10.928582', '_unique_id': 'a2ed77f110f84b0a832ceabc173eef68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0fe9eb5-7130-4b34-8c98-103b71658426', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.930012', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a7be28-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': 'd573b08041567423bb22d5b4db0b9a9b88ae20b301af91f5d383c41d18dfac09'}]}, 'timestamp': '2025-11-23 10:05:10.930324', '_unique_id': 'ad06fdf9ffc346b08866dfc14068a3d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.930 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8aa6766e-9ab5-4760-9f27-47664cd0d488', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:05:10.931599', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'e5a7fb18-c853-11f0-bde4-fa163e72a351', 'monotonic_time': 12254.043433441, 'message_signature': '0fd80ea625ed5ccd7ee42daa5783ff868044575fd9494cab3af34c3cf6513a82'}]}, 'timestamp': '2025-11-23 10:05:10.931920', '_unique_id': '4bd1b3ba958844f18ffffde106563e0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:05:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:05:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:05:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:05:10.932 12 ERROR oslo_messaging.notify.messaging Nov 23 05:05:11 localhost dnsmasq[332928]: 
read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/addn_hosts - 1 addresses Nov 23 05:05:11 localhost dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/host Nov 23 05:05:11 localhost dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/opts Nov 23 05:05:11 localhost podman[332945]: 2025-11-23 10:05:11.074070952 +0000 UTC m=+0.049993537 container kill dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118) Nov 23 05:05:11 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:11.338 263258 INFO neutron.agent.dhcp.agent [None req-a5227715-408f-4966-8187-45db71ac9c74 - - - - - -] DHCP configuration for ports {'d3e0991f-9369-457d-bbc5-2f4ff6aa5b2e'} is completed#033[00m Nov 23 05:05:11 localhost podman[240668]: time="2025-11-23T10:05:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:05:11 localhost podman[240668]: @ - - [23/Nov/2025:10:05:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157510 "" "Go-http-client/1.1" Nov 23 05:05:11 localhost podman[240668]: @ - - [23/Nov/2025:10:05:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19723 "" "Go-http-client/1.1" Nov 23 05:05:12 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:12.279 2 INFO neutron.agent.securitygroups_rpc [None req-9a1758e4-5d44-475c-9640-9981332a110e 
131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['2786fa44-4779-49f0-84bb-2a9d4bed5cef']#033[00m Nov 23 05:05:12 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:12.357 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:05:10Z, description=, device_id=da40d6c5-a255-43a6-9fbf-e2238a7bac71, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d3e0991f-9369-457d-bbc5-2f4ff6aa5b2e, ip_allocation=immediate, mac_address=fa:16:3e:95:5c:c7, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:05:06Z, description=, dns_domain=, id=e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1312470054-network, port_security_enabled=True, project_id=472899094c04472c806243e76f122a0f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11483, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2738, status=ACTIVE, subnets=['ca34f827-438d-47f2-925f-0d6f76807026'], tags=[], tenant_id=472899094c04472c806243e76f122a0f, updated_at=2025-11-23T10:05:07Z, vlan_transparent=None, network_id=e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, port_security_enabled=False, project_id=472899094c04472c806243e76f122a0f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2758, status=DOWN, tags=[], tenant_id=472899094c04472c806243e76f122a0f, updated_at=2025-11-23T10:05:10Z on network e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a#033[00m Nov 23 05:05:12 localhost 
nova_compute[281952]: 2025-11-23 10:05:12.542 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:12 localhost dnsmasq[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/addn_hosts - 1 addresses Nov 23 05:05:12 localhost dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/host Nov 23 05:05:12 localhost podman[332984]: 2025-11-23 10:05:12.562122926 +0000 UTC m=+0.068744521 container kill dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 23 05:05:12 localhost dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/opts Nov 23 05:05:12 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e190 e190: 6 total, 6 up, 6 in Nov 23 05:05:12 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:12.606 2 INFO neutron.agent.securitygroups_rpc [None req-52408a28-3173-4dbe-afae-862affbfdc2f 9fd6b9c1c244436ba8d5c98a9fcfa9c5 6eb850a1541d4942b249428ef6092e5e - - default default] Security group member updated ['cadd5356-9a8d-419a-ac04-589c2522a695']#033[00m Nov 23 05:05:12 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:12.799 2 INFO neutron.agent.securitygroups_rpc [None req-4dab38c7-a234-4065-b71c-d9440ee4c0cc 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['2786fa44-4779-49f0-84bb-2a9d4bed5cef']#033[00m Nov 23 05:05:12 
localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:12.866 263258 INFO neutron.agent.dhcp.agent [None req-1ee21828-7257-45e6-bf87-5d39c00e455d - - - - - -] DHCP configuration for ports {'d3e0991f-9369-457d-bbc5-2f4ff6aa5b2e'} is completed#033[00m Nov 23 05:05:13 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:05:14 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:14.059 2 INFO neutron.agent.securitygroups_rpc [None req-01487171-b358-4961-aa9c-b003fa4396a5 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['d1c18a84-65e4-4a8f-9dbe-dec1719991b1']#033[00m Nov 23 05:05:14 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:14.300 2 INFO neutron.agent.securitygroups_rpc [None req-9dfbfcf8-cd5f-4ab2-ad68-710c1a723a6d 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['d1c18a84-65e4-4a8f-9dbe-dec1719991b1']#033[00m Nov 23 05:05:14 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:14.500 2 INFO neutron.agent.securitygroups_rpc [None req-85d9c75a-2523-4c6c-82af-38821b506d6b 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['d1c18a84-65e4-4a8f-9dbe-dec1719991b1']#033[00m Nov 23 05:05:14 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:14.745 2 INFO neutron.agent.securitygroups_rpc [None req-82a95b4d-7b43-4234-8627-52e2e708ade0 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['d1c18a84-65e4-4a8f-9dbe-dec1719991b1']#033[00m Nov 23 05:05:14 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:14.943 2 INFO neutron.agent.securitygroups_rpc [None req-b4af07d0-a11f-4847-a586-dfc78999259e 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default 
default] Security group rule updated ['d1c18a84-65e4-4a8f-9dbe-dec1719991b1']#033[00m Nov 23 05:05:15 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:15.147 2 INFO neutron.agent.securitygroups_rpc [None req-5c022b4a-ac2f-4704-91ea-0edb1d6cec16 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['d1c18a84-65e4-4a8f-9dbe-dec1719991b1']#033[00m Nov 23 05:05:15 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:05:15 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1156921035' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:05:15 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:05:15 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1156921035' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:05:15 localhost neutron_sriov_agent[256124]: 2025-11-23 10:05:15.781 2 INFO neutron.agent.securitygroups_rpc [None req-f781264c-3a54-454a-a5fe-8867df4ebfe6 131430ad8ac646268fcbca6e470c2ccc e604eae6a01f4975816ace0cec6e33f8 - - default default] Security group rule updated ['acd8c1db-c86a-40f9-91ab-30bd6f26d43e']#033[00m Nov 23 05:05:17 localhost nova_compute[281952]: 2025-11-23 10:05:17.586 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:05:17 localhost nova_compute[281952]: 2025-11-23 10:05:17.587 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:17 localhost nova_compute[281952]: 2025-11-23 10:05:17.587 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:05:17 localhost nova_compute[281952]: 2025-11-23 10:05:17.587 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:05:17 localhost nova_compute[281952]: 2025-11-23 10:05:17.588 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:05:17 localhost nova_compute[281952]: 2025-11-23 10:05:17.589 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 
05:05:19 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:05:19 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2599463425' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:05:19 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:05:19 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2599463425' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:05:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:05:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:05:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 05:05:20 localhost podman[333008]: 2025-11-23 10:05:20.034028413 +0000 UTC m=+0.077377542 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=edpm) Nov 23 05:05:20 localhost podman[333008]: 2025-11-23 10:05:20.051363246 +0000 UTC m=+0.094712445 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, name=ubi9-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc.) Nov 23 05:05:20 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 05:05:20 localhost podman[333007]: 2025-11-23 10:05:20.13618416 +0000 UTC m=+0.184643633 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent) Nov 23 05:05:20 localhost podman[333007]: 2025-11-23 10:05:20.174579597 +0000 UTC 
m=+0.223039160 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:05:20 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 05:05:20 localhost podman[333006]: 2025-11-23 10:05:20.248331439 +0000 UTC m=+0.300142983 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true) Nov 23 05:05:20 localhost podman[333006]: 2025-11-23 10:05:20.290309153 +0000 UTC m=+0.342120667 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:05:20 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:05:20 localhost nova_compute[281952]: 2025-11-23 10:05:20.819 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:05:20 localhost nova_compute[281952]: 2025-11-23 10:05:20.820 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:05:20 localhost nova_compute[281952]: 2025-11-23 10:05:20.820 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 05:05:20 localhost nova_compute[281952]: 2025-11-23 10:05:20.821 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 05:05:20 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:20.895 263258 INFO neutron.agent.linux.ip_lib [None req-59dba442-d17d-4a0b-984d-3b50180a6fbc - - - - - -] Device tap16ecd221-18 cannot be used as it has no MAC address#033[00m Nov 23 05:05:20 localhost nova_compute[281952]: 2025-11-23 10:05:20.909 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 05:05:20 localhost nova_compute[281952]: 2025-11-23 10:05:20.909 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] 
Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 05:05:20 localhost nova_compute[281952]: 2025-11-23 10:05:20.910 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 05:05:20 localhost nova_compute[281952]: 2025-11-23 10:05:20.910 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 05:05:20 localhost nova_compute[281952]: 2025-11-23 10:05:20.917 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:20 localhost kernel: device tap16ecd221-18 entered promiscuous mode Nov 23 05:05:20 localhost systemd-udevd[333077]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:05:20 localhost NetworkManager[5975]: [1763892320.9297] manager: (tap16ecd221-18): new Generic device (/org/freedesktop/NetworkManager/Devices/85) Nov 23 05:05:20 localhost nova_compute[281952]: 2025-11-23 10:05:20.929 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:20 localhost ovn_controller[154788]: 2025-11-23T10:05:20Z|00529|binding|INFO|Claiming lport 16ecd221-1860-4329-907f-7a09499e197a for this chassis. 
Nov 23 05:05:20 localhost ovn_controller[154788]: 2025-11-23T10:05:20Z|00530|binding|INFO|16ecd221-1860-4329-907f-7a09499e197a: Claiming unknown Nov 23 05:05:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:20.946 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-ce1d6d57-d515-4264-bb5e-663446a4e7d2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce1d6d57-d515-4264-bb5e-663446a4e7d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a088503b43e94251822e3c0e9006a74e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d787041c-8601-4ac7-8f35-9655cbd443c2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=16ecd221-1860-4329-907f-7a09499e197a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:05:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:20.948 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 16ecd221-1860-4329-907f-7a09499e197a in datapath ce1d6d57-d515-4264-bb5e-663446a4e7d2 bound to our chassis#033[00m Nov 23 05:05:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:20.950 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
ce1d6d57-d515-4264-bb5e-663446a4e7d2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:05:20 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:20.952 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd9da97-4c93-496f-bb7a-b811220a5cf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:05:20 localhost ovn_controller[154788]: 2025-11-23T10:05:20Z|00531|binding|INFO|Setting lport 16ecd221-1860-4329-907f-7a09499e197a ovn-installed in OVS Nov 23 05:05:20 localhost ovn_controller[154788]: 2025-11-23T10:05:20Z|00532|binding|INFO|Setting lport 16ecd221-1860-4329-907f-7a09499e197a up in Southbound Nov 23 05:05:20 localhost nova_compute[281952]: 2025-11-23 10:05:20.980 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:21 localhost nova_compute[281952]: 2025-11-23 10:05:21.014 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:21 localhost nova_compute[281952]: 2025-11-23 10:05:21.046 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:21 localhost nova_compute[281952]: 2025-11-23 10:05:21.381 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 05:05:21 localhost nova_compute[281952]: 2025-11-23 10:05:21.399 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 05:05:21 localhost nova_compute[281952]: 2025-11-23 10:05:21.399 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 05:05:21 localhost nova_compute[281952]: 2025-11-23 10:05:21.400 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:05:21 localhost nova_compute[281952]: 2025-11-23 10:05:21.400 281956 DEBUG 
oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:05:21 localhost nova_compute[281952]: 2025-11-23 10:05:21.401 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:05:21 localhost nova_compute[281952]: 2025-11-23 10:05:21.401 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 05:05:21 localhost podman[333132]: Nov 23 05:05:21 localhost podman[333132]: 2025-11-23 10:05:21.88002896 +0000 UTC m=+0.078582738 container create 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:05:21 localhost systemd[1]: Started libpod-conmon-93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce.scope. Nov 23 05:05:21 localhost podman[333132]: 2025-11-23 10:05:21.834276712 +0000 UTC m=+0.032830450 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:05:21 localhost systemd[1]: Started libcrun container. 
Nov 23 05:05:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abd93c1c89432cbaa2ac778bf0b96cd2b95c4cd061e47c6333d9a0aa2bd7ba03/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:05:21 localhost podman[333132]: 2025-11-23 10:05:21.951521904 +0000 UTC m=+0.150075602 container init 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 23 05:05:21 localhost podman[333132]: 2025-11-23 10:05:21.957827064 +0000 UTC m=+0.156380752 container start 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:05:21 localhost dnsmasq[333150]: started, version 2.85 cachesize 150 Nov 23 05:05:21 localhost dnsmasq[333150]: DNS service limited to local subnets Nov 23 05:05:21 localhost dnsmasq[333150]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:05:21 localhost dnsmasq[333150]: warning: no upstream servers 
configured Nov 23 05:05:21 localhost dnsmasq-dhcp[333150]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 23 05:05:21 localhost dnsmasq[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/addn_hosts - 0 addresses Nov 23 05:05:21 localhost dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/host Nov 23 05:05:21 localhost dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/opts Nov 23 05:05:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:22.010 263258 INFO neutron.agent.dhcp.agent [None req-59dba442-d17d-4a0b-984d-3b50180a6fbc - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:05:20Z, description=, device_id=292abd12-0f11-4ce1-b8f1-44a94fd7bb57, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=aeaf694d-46ce-4fce-8629-1cf86abe6f8c, ip_allocation=immediate, mac_address=fa:16:3e:db:08:77, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:05:18Z, description=, dns_domain=, id=ce1d6d57-d515-4264-bb5e-663446a4e7d2, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1513913109, port_security_enabled=True, project_id=a088503b43e94251822e3c0e9006a74e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=2605, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2814, status=ACTIVE, subnets=['60594d6c-cb2d-4d1f-a4ca-7034647d3116'], tags=[], tenant_id=a088503b43e94251822e3c0e9006a74e, updated_at=2025-11-23T10:05:19Z, vlan_transparent=None, network_id=ce1d6d57-d515-4264-bb5e-663446a4e7d2, port_security_enabled=False, 
project_id=a088503b43e94251822e3c0e9006a74e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2829, status=DOWN, tags=[], tenant_id=a088503b43e94251822e3c0e9006a74e, updated_at=2025-11-23T10:05:20Z on network ce1d6d57-d515-4264-bb5e-663446a4e7d2#033[00m Nov 23 05:05:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:22.064 263258 INFO neutron.agent.dhcp.agent [None req-667050e8-d47e-4830-885d-d7c9660deeb5 - - - - - -] DHCP configuration for ports {'b062f273-d4d6-4ada-a8e8-ed6a5f68566e'} is completed#033[00m Nov 23 05:05:22 localhost systemd[1]: tmp-crun.yM89WP.mount: Deactivated successfully. Nov 23 05:05:22 localhost podman[333168]: 2025-11-23 10:05:22.187367878 +0000 UTC m=+0.068015529 container kill 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:05:22 localhost dnsmasq[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/addn_hosts - 1 addresses Nov 23 05:05:22 localhost dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/host Nov 23 05:05:22 localhost dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/opts Nov 23 05:05:22 localhost nova_compute[281952]: 2025-11-23 10:05:22.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:05:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:22.338 263258 INFO neutron.agent.dhcp.agent [None req-59dba442-d17d-4a0b-984d-3b50180a6fbc - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:05:20Z, description=, device_id=292abd12-0f11-4ce1-b8f1-44a94fd7bb57, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=aeaf694d-46ce-4fce-8629-1cf86abe6f8c, ip_allocation=immediate, mac_address=fa:16:3e:db:08:77, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:05:18Z, description=, dns_domain=, id=ce1d6d57-d515-4264-bb5e-663446a4e7d2, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1513913109, port_security_enabled=True, project_id=a088503b43e94251822e3c0e9006a74e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=2605, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2814, status=ACTIVE, subnets=['60594d6c-cb2d-4d1f-a4ca-7034647d3116'], tags=[], tenant_id=a088503b43e94251822e3c0e9006a74e, updated_at=2025-11-23T10:05:19Z, vlan_transparent=None, network_id=ce1d6d57-d515-4264-bb5e-663446a4e7d2, port_security_enabled=False, project_id=a088503b43e94251822e3c0e9006a74e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2829, status=DOWN, tags=[], tenant_id=a088503b43e94251822e3c0e9006a74e, updated_at=2025-11-23T10:05:20Z on network ce1d6d57-d515-4264-bb5e-663446a4e7d2#033[00m Nov 23 05:05:22 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command 
mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:05:22 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2974038667' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:05:22 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:05:22 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2974038667' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:05:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:22.448 263258 INFO neutron.agent.dhcp.agent [None req-1dc986a0-ecf1-47e1-bc88-f4b04ec83302 - - - - - -] DHCP configuration for ports {'aeaf694d-46ce-4fce-8629-1cf86abe6f8c'} is completed#033[00m Nov 23 05:05:22 localhost dnsmasq[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/addn_hosts - 1 addresses Nov 23 05:05:22 localhost dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/host Nov 23 05:05:22 localhost dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/opts Nov 23 05:05:22 localhost podman[333204]: 2025-11-23 10:05:22.509282416 +0000 UTC m=+0.049744790 container kill 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 05:05:22 localhost nova_compute[281952]: 2025-11-23 
10:05:22.627 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:22 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:22.841 263258 INFO neutron.agent.dhcp.agent [None req-0a07625c-9932-4fdb-99aa-828a4cfb891b - - - - - -] DHCP configuration for ports {'aeaf694d-46ce-4fce-8629-1cf86abe6f8c'} is completed#033[00m Nov 23 05:05:23 localhost systemd[1]: tmp-crun.g3OrDg.mount: Deactivated successfully. Nov 23 05:05:23 localhost nova_compute[281952]: 2025-11-23 10:05:23.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:05:23 localhost nova_compute[281952]: 2025-11-23 10:05:23.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:05:23 localhost nova_compute[281952]: 2025-11-23 10:05:23.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:05:23 localhost nova_compute[281952]: 2025-11-23 10:05:23.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:05:23 localhost nova_compute[281952]: 2025-11-23 10:05:23.234 281956 DEBUG oslo_concurrency.lockutils [None 
req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:05:23 localhost nova_compute[281952]: 2025-11-23 10:05:23.235 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:05:23 localhost nova_compute[281952]: 2025-11-23 10:05:23.235 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 05:05:23 localhost nova_compute[281952]: 2025-11-23 10:05:23.236 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:05:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:05:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:05:23 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/597883095' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:05:23 localhost nova_compute[281952]: 2025-11-23 10:05:23.700 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:05:23 localhost dnsmasq[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/addn_hosts - 0 addresses Nov 23 05:05:23 localhost podman[333261]: 2025-11-23 10:05:23.709066737 +0000 UTC m=+0.055533803 container kill 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:05:23 localhost dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/host Nov 23 05:05:23 localhost dnsmasq-dhcp[333150]: read /var/lib/neutron/dhcp/ce1d6d57-d515-4264-bb5e-663446a4e7d2/opts Nov 23 05:05:23 localhost nova_compute[281952]: 2025-11-23 10:05:23.772 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:05:23 localhost nova_compute[281952]: 2025-11-23 10:05:23.773 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] 
skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:05:23 localhost ovn_controller[154788]: 2025-11-23T10:05:23Z|00533|binding|INFO|Releasing lport 16ecd221-1860-4329-907f-7a09499e197a from this chassis (sb_readonly=0) Nov 23 05:05:23 localhost ovn_controller[154788]: 2025-11-23T10:05:23Z|00534|binding|INFO|Setting lport 16ecd221-1860-4329-907f-7a09499e197a down in Southbound Nov 23 05:05:23 localhost nova_compute[281952]: 2025-11-23 10:05:23.934 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:23 localhost kernel: device tap16ecd221-18 left promiscuous mode Nov 23 05:05:23 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:23.944 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-ce1d6d57-d515-4264-bb5e-663446a4e7d2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce1d6d57-d515-4264-bb5e-663446a4e7d2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a088503b43e94251822e3c0e9006a74e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=d787041c-8601-4ac7-8f35-9655cbd443c2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=16ecd221-1860-4329-907f-7a09499e197a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:05:23 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:23.946 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 16ecd221-1860-4329-907f-7a09499e197a in datapath ce1d6d57-d515-4264-bb5e-663446a4e7d2 unbound from our chassis#033[00m Nov 23 05:05:23 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:23.947 160439 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ce1d6d57-d515-4264-bb5e-663446a4e7d2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 23 05:05:23 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:23.948 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[757e03c1-cef9-48e7-afca-497e276fefc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:05:23 localhost nova_compute[281952]: 2025-11-23 10:05:23.955 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:24 localhost nova_compute[281952]: 2025-11-23 10:05:24.023 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 05:05:24 localhost nova_compute[281952]: 2025-11-23 10:05:24.024 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11109MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 05:05:24 localhost nova_compute[281952]: 2025-11-23 10:05:24.024 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:05:24 localhost nova_compute[281952]: 2025-11-23 10:05:24.024 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:05:24 localhost nova_compute[281952]: 2025-11-23 10:05:24.100 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 05:05:24 localhost nova_compute[281952]: 2025-11-23 10:05:24.100 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 05:05:24 localhost nova_compute[281952]: 2025-11-23 10:05:24.100 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 05:05:24 localhost nova_compute[281952]: 2025-11-23 10:05:24.145 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:05:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e191 e191: 6 total, 6 up, 6 in Nov 23 05:05:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:05:24 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/2996125607' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:05:24 localhost nova_compute[281952]: 2025-11-23 10:05:24.650 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:05:24 localhost nova_compute[281952]: 2025-11-23 10:05:24.657 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 05:05:24 localhost nova_compute[281952]: 2025-11-23 10:05:24.671 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 05:05:24 localhost nova_compute[281952]: 2025-11-23 10:05:24.673 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 05:05:24 localhost nova_compute[281952]: 2025-11-23 10:05:24.674 281956 DEBUG 
oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:05:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:05:24 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/620567410' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:05:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:05:24 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/620567410' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:05:26 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e192 e192: 6 total, 6 up, 6 in Nov 23 05:05:27 localhost nova_compute[281952]: 2025-11-23 10:05:27.677 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:05:27 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3318661139' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:05:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:05:27 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3318661139' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:05:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e193 e193: 6 total, 6 up, 6 in Nov 23 05:05:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:05:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:05:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 05:05:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e194 e194: 6 total, 6 up, 6 in Nov 23 05:05:29 localhost systemd[1]: tmp-crun.0crp4V.mount: Deactivated successfully. Nov 23 05:05:29 localhost dnsmasq[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/addn_hosts - 0 addresses Nov 23 05:05:29 localhost dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/host Nov 23 05:05:29 localhost dnsmasq-dhcp[332670]: read /var/lib/neutron/dhcp/7b3a7ba3-e63c-4e55-a6fc-444dc25aaece/opts Nov 23 05:05:29 localhost podman[333348]: 2025-11-23 10:05:29.157498659 +0000 UTC m=+0.074797303 container kill 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2) Nov 23 05:05:29 localhost podman[333312]: 2025-11-23 
10:05:29.158584792 +0000 UTC m=+0.164274199 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:05:29 localhost podman[333312]: 2025-11-23 10:05:29.196274417 +0000 UTC m=+0.201963784 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 
(image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Nov 23 05:05:29 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 05:05:29 localhost podman[333313]: 2025-11-23 10:05:29.290323751 +0000 UTC m=+0.292743700 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 05:05:29 localhost podman[333313]: 2025-11-23 10:05:29.298706093 +0000 UTC m=+0.301126052 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 05:05:29 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 05:05:29 localhost ovn_controller[154788]: 2025-11-23T10:05:29Z|00535|binding|INFO|Releasing lport b2609925-5134-4a13-ba79-45e02839b8f7 from this chassis (sb_readonly=0) Nov 23 05:05:29 localhost ovn_controller[154788]: 2025-11-23T10:05:29Z|00536|binding|INFO|Setting lport b2609925-5134-4a13-ba79-45e02839b8f7 down in Southbound Nov 23 05:05:29 localhost kernel: device tapb2609925-51 left promiscuous mode Nov 23 05:05:29 localhost nova_compute[281952]: 2025-11-23 10:05:29.344 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:29 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:29.354 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ba426e81cfe149da986575955289d04b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e76005b-d8d7-445f-b11a-34d1e82ffc8b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b2609925-5134-4a13-ba79-45e02839b8f7) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:05:29 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:29.356 160439 INFO neutron.agent.ovn.metadata.agent [-] Port b2609925-5134-4a13-ba79-45e02839b8f7 in datapath 7b3a7ba3-e63c-4e55-a6fc-444dc25aaece unbound from our chassis#033[00m Nov 23 05:05:29 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:29.358 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:05:29 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:29.359 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[a5fb5d14-41a3-42e7-94e8-7c05e9f79f0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:05:29 localhost nova_compute[281952]: 2025-11-23 10:05:29.365 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:29 localhost dnsmasq[333150]: exiting on receipt of SIGTERM Nov 23 05:05:29 localhost podman[333405]: 2025-11-23 10:05:29.480596462 +0000 UTC m=+0.061417101 container kill 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 05:05:29 localhost systemd[1]: 
libpod-93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce.scope: Deactivated successfully. Nov 23 05:05:29 localhost podman[333420]: 2025-11-23 10:05:29.550970632 +0000 UTC m=+0.054402020 container died 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2) Nov 23 05:05:29 localhost podman[333420]: 2025-11-23 10:05:29.583057108 +0000 UTC m=+0.086488456 container cleanup 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 23 05:05:29 localhost systemd[1]: libpod-conmon-93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce.scope: Deactivated successfully. 
Nov 23 05:05:29 localhost podman[333421]: 2025-11-23 10:05:29.628019043 +0000 UTC m=+0.126423059 container remove 93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce1d6d57-d515-4264-bb5e-663446a4e7d2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:05:29 localhost nova_compute[281952]: 2025-11-23 10:05:29.669 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:05:29 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:29.875 263258 INFO neutron.agent.dhcp.agent [None req-10282890-f679-4351-a9be-ff391504968e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:05:29 localhost openstack_network_exporter[242668]: ERROR 10:05:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:05:29 localhost openstack_network_exporter[242668]: ERROR 10:05:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:05:29 localhost openstack_network_exporter[242668]: ERROR 10:05:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:05:29 localhost openstack_network_exporter[242668]: ERROR 10:05:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:05:29 localhost openstack_network_exporter[242668]: Nov 23 
05:05:29 localhost openstack_network_exporter[242668]: ERROR 10:05:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:05:29 localhost openstack_network_exporter[242668]: Nov 23 05:05:30 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e195 e195: 6 total, 6 up, 6 in Nov 23 05:05:30 localhost systemd[1]: tmp-crun.xy8Mmi.mount: Deactivated successfully. Nov 23 05:05:30 localhost systemd[1]: var-lib-containers-storage-overlay-abd93c1c89432cbaa2ac778bf0b96cd2b95c4cd061e47c6333d9a0aa2bd7ba03-merged.mount: Deactivated successfully. Nov 23 05:05:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-93756a51dedf5bf60f605d70f18593769214beff2c52582589d2c36160575fce-userdata-shm.mount: Deactivated successfully. Nov 23 05:05:30 localhost systemd[1]: run-netns-qdhcp\x2dce1d6d57\x2dd515\x2d4264\x2dbb5e\x2d663446a4e7d2.mount: Deactivated successfully. Nov 23 05:05:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:30.539 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:05:30 localhost ovn_controller[154788]: 2025-11-23T10:05:30Z|00537|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:05:30 localhost nova_compute[281952]: 2025-11-23 10:05:30.847 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:31 localhost ovn_controller[154788]: 2025-11-23T10:05:31Z|00538|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:05:31 localhost nova_compute[281952]: 2025-11-23 10:05:31.246 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:31 localhost dnsmasq[332670]: exiting on receipt of SIGTERM Nov 23 05:05:31 localhost systemd[1]: 
libpod-2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf.scope: Deactivated successfully. Nov 23 05:05:31 localhost podman[333466]: 2025-11-23 10:05:31.887400313 +0000 UTC m=+0.061603986 container kill 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2) Nov 23 05:05:31 localhost podman[333480]: 2025-11-23 10:05:31.956514525 +0000 UTC m=+0.058750001 container died 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 23 05:05:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:05:31 localhost podman[333480]: 2025-11-23 10:05:31.987764647 +0000 UTC m=+0.090000083 container cleanup 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 05:05:31 localhost systemd[1]: libpod-conmon-2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf.scope: Deactivated successfully. Nov 23 05:05:32 localhost podman[333487]: 2025-11-23 10:05:32.038377542 +0000 UTC m=+0.126826073 container remove 2690246c3ffa3d14c567ff88e37790b15764de662aca539930b0e115ba34f6cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7b3a7ba3-e63c-4e55-a6fc-444dc25aaece, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 23 05:05:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:32.142 263258 INFO neutron.agent.dhcp.agent [None req-25b19362-8d07-456c-bea7-27fcff6e65aa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:05:32 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e196 e196: 6 total, 6 up, 6 in Nov 23 05:05:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:32.295 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, 
action_kwargs: {}#033[00m Nov 23 05:05:32 localhost nova_compute[281952]: 2025-11-23 10:05:32.719 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:32 localhost nova_compute[281952]: 2025-11-23 10:05:32.720 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:32 localhost systemd[1]: var-lib-containers-storage-overlay-deefc43655ec5efae4438c1bba8af5217f7f3c949c030f1c1d0f69134d512516-merged.mount: Deactivated successfully. Nov 23 05:05:32 localhost systemd[1]: run-netns-qdhcp\x2d7b3a7ba3\x2de63c\x2d4e55\x2da6fc\x2d444dc25aaece.mount: Deactivated successfully. Nov 23 05:05:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e197 e197: 6 total, 6 up, 6 in Nov 23 05:05:33 localhost dnsmasq[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/addn_hosts - 0 addresses Nov 23 05:05:33 localhost dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/host Nov 23 05:05:33 localhost dnsmasq-dhcp[332928]: read /var/lib/neutron/dhcp/e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a/opts Nov 23 05:05:33 localhost podman[333527]: 2025-11-23 10:05:33.283066015 +0000 UTC m=+0.060570696 container kill dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 05:05:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e197 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:05:33 localhost ovn_controller[154788]: 2025-11-23T10:05:33Z|00539|binding|INFO|Releasing lport ea48030b-3e14-4f0f-ad85-6ef79695a3f3 from this chassis (sb_readonly=0) Nov 23 05:05:33 localhost kernel: device tapea48030b-3e left promiscuous mode Nov 23 05:05:33 localhost ovn_controller[154788]: 2025-11-23T10:05:33Z|00540|binding|INFO|Setting lport ea48030b-3e14-4f0f-ad85-6ef79695a3f3 down in Southbound Nov 23 05:05:33 localhost nova_compute[281952]: 2025-11-23 10:05:33.467 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:33 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:33.479 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '472899094c04472c806243e76f122a0f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c4b884e9-74a6-4227-b3c7-e09f7c6545b9, chassis=[], tunnel_key=2, 
gateway_chassis=[], requested_chassis=[], logical_port=ea48030b-3e14-4f0f-ad85-6ef79695a3f3) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:05:33 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:33.481 160439 INFO neutron.agent.ovn.metadata.agent [-] Port ea48030b-3e14-4f0f-ad85-6ef79695a3f3 in datapath e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a unbound from our chassis#033[00m Nov 23 05:05:33 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:33.482 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:05:33 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:33.483 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[49c4bdc5-f2e9-409e-b9d6-f039fee9c6d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:05:33 localhost nova_compute[281952]: 2025-11-23 10:05:33.494 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e198 e198: 6 total, 6 up, 6 in Nov 23 05:05:35 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e199 e199: 6 total, 6 up, 6 in Nov 23 05:05:35 localhost ovn_controller[154788]: 2025-11-23T10:05:35Z|00541|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:05:35 localhost nova_compute[281952]: 2025-11-23 10:05:35.322 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:36 localhost dnsmasq[332928]: exiting on receipt of SIGTERM Nov 23 05:05:36 localhost systemd[1]: 
tmp-crun.fRByvl.mount: Deactivated successfully. Nov 23 05:05:36 localhost systemd[1]: libpod-dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f.scope: Deactivated successfully. Nov 23 05:05:36 localhost podman[333564]: 2025-11-23 10:05:36.200740943 +0000 UTC m=+0.059749611 container kill dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true) Nov 23 05:05:36 localhost podman[333578]: 2025-11-23 10:05:36.266850185 +0000 UTC m=+0.059189404 container died dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:05:36 localhost podman[333578]: 2025-11-23 10:05:36.295003103 +0000 UTC m=+0.087342252 container cleanup dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:05:36 localhost systemd[1]: libpod-conmon-dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f.scope: Deactivated successfully. Nov 23 05:05:36 localhost podman[333585]: 2025-11-23 10:05:36.352701751 +0000 UTC m=+0.127084860 container remove dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e8c74d81-0df4-43eb-8b7e-f7ad7da5cf4a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS) Nov 23 05:05:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:36.376 263258 INFO neutron.agent.dhcp.agent [None req-d327e57b-6666-40b3-b749-83f50dc9721d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:05:36 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:05:36.426 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:05:37 localhost systemd[1]: var-lib-containers-storage-overlay-0c01c854a331124ebc60335ecf0975a3084a9f216dced94c1e2f7c47b6d63d45-merged.mount: Deactivated successfully. Nov 23 05:05:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dba578f7346d754f40f7f4a6910cbe82c2619a681e02e9d670d10d7b927d6d3f-userdata-shm.mount: Deactivated successfully. Nov 23 05:05:37 localhost systemd[1]: run-netns-qdhcp\x2de8c74d81\x2d0df4\x2d43eb\x2d8b7e\x2df7ad7da5cf4a.mount: Deactivated successfully. 
Nov 23 05:05:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e200 e200: 6 total, 6 up, 6 in Nov 23 05:05:37 localhost nova_compute[281952]: 2025-11-23 10:05:37.756 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:38 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e201 e201: 6 total, 6 up, 6 in Nov 23 05:05:38 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:05:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e202 e202: 6 total, 6 up, 6 in Nov 23 05:05:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:05:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:05:40 localhost podman[333607]: 2025-11-23 10:05:40.027996902 +0000 UTC m=+0.081156426 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 05:05:40 localhost podman[333608]: 2025-11-23 10:05:40.096141044 +0000 UTC m=+0.148149414 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118) Nov 23 05:05:40 localhost podman[333608]: 2025-11-23 
10:05:40.111366562 +0000 UTC m=+0.163374962 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 05:05:40 localhost podman[333607]: 2025-11-23 10:05:40.111682412 +0000 UTC m=+0.164841936 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 05:05:40 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:05:40 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 05:05:41 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e203 e203: 6 total, 6 up, 6 in Nov 23 05:05:41 localhost podman[240668]: time="2025-11-23T10:05:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:05:41 localhost podman[240668]: @ - - [23/Nov/2025:10:05:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 05:05:41 localhost podman[240668]: @ - - [23/Nov/2025:10:05:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18782 "" "Go-http-client/1.1" Nov 23 05:05:42 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e204 e204: 6 total, 6 up, 6 in Nov 23 05:05:42 localhost nova_compute[281952]: 2025-11-23 10:05:42.758 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:05:42 localhost nova_compute[281952]: 2025-11-23 10:05:42.760 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:05:42 localhost nova_compute[281952]: 2025-11-23 10:05:42.760 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:05:42 localhost nova_compute[281952]: 2025-11-23 10:05:42.761 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:05:42 localhost nova_compute[281952]: 2025-11-23 10:05:42.793 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:42 localhost nova_compute[281952]: 2025-11-23 10:05:42.794 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:05:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e205 e205: 6 total, 6 up, 6 in Nov 23 05:05:43 localhost podman[333760]: 2025-11-23 10:05:43.26366251 +0000 UTC m=+0.102132578 container exec 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, release=553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.buildah.version=1.33.12) Nov 23 05:05:43 localhost podman[333760]: 2025-11-23 10:05:43.410437681 +0000 UTC m=+0.248907749 container exec_died 2c583037be189f447885042ee4835e4c157c2410e99e19ee493251b1c8c46bee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-46550e70-79cb-5f55-bf6d-1204b97e083b-crash-np0005532585, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, version=7, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_CLEAN=True, RELEASE=main, ceph=True, architecture=x86_64, release=553) Nov 23 05:05:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:05:43 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/149050981' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:05:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:05:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:05:43 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/149050981' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:05:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:05:43 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2425458825' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:05:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:05:43 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2425458825' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:05:44 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e206 e206: 6 total, 6 up, 6 in Nov 23 05:05:44 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:05:44 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:05:44 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:05:44 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:05:44 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:05:44 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:05:44 localhost sshd[333935]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:05:45 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch Nov 23 05:05:45 localhost 
ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/23db4718-314d-42ce-b54b-c4702b2fa362/bb499d3a-704f-460c-a9a3-9c6909b5fa80", "osd", "allow rw pool=manila_data namespace=fsvolumens_23db4718-314d-42ce-b54b-c4702b2fa362", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:05:45 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/23db4718-314d-42ce-b54b-c4702b2fa362/bb499d3a-704f-460c-a9a3-9c6909b5fa80", "osd", "allow rw pool=manila_data namespace=fsvolumens_23db4718-314d-42ce-b54b-c4702b2fa362", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:05:45 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 23 05:05:45 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 23 05:05:45 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 23 05:05:45 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 23 05:05:45 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 23 05:05:45 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' 
entity='mgr.np0005532584.naxwxy' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 23 05:05:45 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 05:05:45 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:05:46 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:46.115 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:05:46 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:46.119 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 05:05:46 localhost nova_compute[281952]: 2025-11-23 10:05:46.159 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:46 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e207 e207: 6 total, 6 up, 6 in Nov 23 05:05:46 localhost ceph-mon[300199]: Adjusting osd_memory_target on np0005532584.localdomain to 836.6M Nov 23 05:05:46 localhost ceph-mon[300199]: Adjusting osd_memory_target on np0005532585.localdomain to 836.6M Nov 23 05:05:46 localhost ceph-mon[300199]: Adjusting osd_memory_target on 
np0005532586.localdomain to 836.6M Nov 23 05:05:46 localhost ceph-mon[300199]: Unable to set osd_memory_target on np0005532584.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 23 05:05:46 localhost ceph-mon[300199]: Unable to set osd_memory_target on np0005532586.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Nov 23 05:05:46 localhost ceph-mon[300199]: Unable to set osd_memory_target on np0005532585.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 23 05:05:47 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e208 e208: 6 total, 6 up, 6 in Nov 23 05:05:47 localhost nova_compute[281952]: 2025-11-23 10:05:47.795 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:47 localhost nova_compute[281952]: 2025-11-23 10:05:47.800 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e209 e209: 6 total, 6 up, 6 in Nov 23 05:05:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e209 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:05:48 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch Nov 23 05:05:48 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/23db4718-314d-42ce-b54b-c4702b2fa362/bb499d3a-704f-460c-a9a3-9c6909b5fa80", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_23db4718-314d-42ce-b54b-c4702b2fa362", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:05:48 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/23db4718-314d-42ce-b54b-c4702b2fa362/bb499d3a-704f-460c-a9a3-9c6909b5fa80", "osd", "allow rw pool=manila_data namespace=fsvolumens_23db4718-314d-42ce-b54b-c4702b2fa362", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:05:49 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:05:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:05:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:05:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:05:51 localhost podman[333971]: 2025-11-23 10:05:51.033928055 +0000 UTC m=+0.082410823 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9) Nov 23 05:05:51 localhost 
podman[333970]: 2025-11-23 10:05:51.076063775 +0000 UTC m=+0.127208373 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 23 05:05:51 localhost podman[333971]: 2025-11-23 10:05:51.101797681 +0000 UTC m=+0.150280499 container 
exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=) Nov 23 05:05:51 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 05:05:51 localhost podman[333970]: 2025-11-23 10:05:51.157311422 +0000 UTC m=+0.208456090 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 23 05:05:51 localhost systemd[1]: 
9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:05:51 localhost podman[333969]: 2025-11-23 10:05:51.245133468 +0000 UTC m=+0.298140362 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 23 05:05:51 localhost podman[333969]: 2025-11-23 10:05:51.275460932 +0000 UTC m=+0.328467836 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251118) Nov 23 05:05:51 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 05:05:51 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:05:51 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1438458364' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:05:51 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:05:51 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1438458364' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:05:52 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch Nov 23 05:05:52 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Nov 23 05:05:52 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished Nov 23 05:05:52 localhost nova_compute[281952]: 2025-11-23 10:05:52.802 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:05:52 localhost nova_compute[281952]: 2025-11-23 10:05:52.804 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:05:52 localhost nova_compute[281952]: 2025-11-23 10:05:52.804 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:05:52 localhost nova_compute[281952]: 2025-11-23 10:05:52.804 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:05:52 localhost nova_compute[281952]: 2025-11-23 10:05:52.835 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:52 localhost nova_compute[281952]: 2025-11-23 10:05:52.836 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:05:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e210 e210: 6 total, 6 up, 6 in Nov 23 05:05:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:05:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e211 e211: 6 total, 6 up, 6 in Nov 23 05:05:55 localhost ovn_metadata_agent[160434]: 2025-11-23 10:05:55.121 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 05:05:56 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Nov 23 05:05:56 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/23db4718-314d-42ce-b54b-c4702b2fa362/bb499d3a-704f-460c-a9a3-9c6909b5fa80", "osd", "allow rw pool=manila_data namespace=fsvolumens_23db4718-314d-42ce-b54b-c4702b2fa362", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:05:56 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/23db4718-314d-42ce-b54b-c4702b2fa362/bb499d3a-704f-460c-a9a3-9c6909b5fa80", "osd", "allow rw pool=manila_data namespace=fsvolumens_23db4718-314d-42ce-b54b-c4702b2fa362", "mon", "allow r"], "format": "json"}]': 
finished Nov 23 05:05:57 localhost nova_compute[281952]: 2025-11-23 10:05:57.837 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:05:57 localhost nova_compute[281952]: 2025-11-23 10:05:57.839 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:05:57 localhost nova_compute[281952]: 2025-11-23 10:05:57.839 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:05:57 localhost nova_compute[281952]: 2025-11-23 10:05:57.839 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:05:57 localhost nova_compute[281952]: 2025-11-23 10:05:57.882 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:05:57 localhost nova_compute[281952]: 2025-11-23 10:05:57.883 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:05:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:05:59 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e212 e212: 6 total, 6 up, 6 in Nov 23 05:05:59 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Nov 23 05:05:59 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": 
"client.eve47"} : dispatch Nov 23 05:05:59 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished Nov 23 05:05:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:05:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 05:05:59 localhost openstack_network_exporter[242668]: ERROR 10:05:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:05:59 localhost openstack_network_exporter[242668]: ERROR 10:05:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:05:59 localhost openstack_network_exporter[242668]: ERROR 10:05:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:05:59 localhost openstack_network_exporter[242668]: ERROR 10:05:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:05:59 localhost openstack_network_exporter[242668]: Nov 23 05:05:59 localhost openstack_network_exporter[242668]: ERROR 10:05:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:05:59 localhost openstack_network_exporter[242668]: Nov 23 05:06:00 localhost podman[334033]: 2025-11-23 10:06:00.047675767 +0000 UTC m=+0.098481078 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 05:06:00 localhost podman[334033]: 2025-11-23 10:06:00.05976449 +0000 UTC m=+0.110569811 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', 
'--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 05:06:00 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 05:06:00 localhost systemd[1]: tmp-crun.N0meMN.mount: Deactivated successfully. Nov 23 05:06:00 localhost podman[334032]: 2025-11-23 10:06:00.146529784 +0000 UTC m=+0.200973395 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:06:00 localhost podman[334032]: 2025-11-23 10:06:00.159255988 +0000 UTC m=+0.213699639 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 23 05:06:00 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 05:06:02 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:06:02 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1789904746' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:06:02 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:06:02 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1789904746' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:06:02 localhost nova_compute[281952]: 2025-11-23 10:06:02.886 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:06:02 localhost nova_compute[281952]: 2025-11-23 10:06:02.888 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:06:02 localhost nova_compute[281952]: 2025-11-23 10:06:02.888 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:06:02 localhost nova_compute[281952]: 2025-11-23 10:06:02.888 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:06:02 localhost nova_compute[281952]: 2025-11-23 10:06:02.916 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:02 localhost nova_compute[281952]: 2025-11-23 10:06:02.916 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:06:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e213 e213: 6 total, 6 up, 6 in Nov 23 05:06:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e213 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0. 
Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.547517) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49 Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892363547588, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2879, "num_deletes": 276, "total_data_size": 5425586, "memory_usage": 5595936, "flush_reason": "Manual Compaction"} Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892363576276, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 3552406, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26982, "largest_seqno": 29856, "table_properties": {"data_size": 3540546, "index_size": 7725, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27806, "raw_average_key_size": 22, "raw_value_size": 3515998, "raw_average_value_size": 2879, "num_data_blocks": 323, "num_entries": 1221, "num_filter_entries": 1221, "num_deletions": 276, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892236, "oldest_key_time": 1763892236, "file_creation_time": 1763892363, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}} Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 28824 microseconds, and 9925 cpu microseconds. Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.576342) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 3552406 bytes OK Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.576372) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.579462) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.579487) EVENT_LOG_v1 {"time_micros": 1763892363579480, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.579513) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 5412117, prev total WAL file 
size 5412117, number of live WAL files 2. Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.580631) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. '7061786F73003132383032' seq:0, type:0; will stop at (end) Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(3469KB)], [48(14MB)] Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892363580686, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 19054683, "oldest_snapshot_seqno": -1} Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 13245 keys, 17876505 bytes, temperature: kUnknown Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892363670047, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 17876505, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17800781, "index_size": 41511, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33157, "raw_key_size": 356686, "raw_average_key_size": 26, "raw_value_size": 
17574981, "raw_average_value_size": 1326, "num_data_blocks": 1553, "num_entries": 13245, "num_filter_entries": 13245, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892363, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}} Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.670358) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 17876505 bytes Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.672351) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 213.0 rd, 199.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 14.8 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(10.4) write-amplify(5.0) OK, records in: 13810, records dropped: 565 output_compression: NoCompression Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.672386) EVENT_LOG_v1 {"time_micros": 1763892363672372, "job": 28, "event": "compaction_finished", "compaction_time_micros": 89459, "compaction_time_cpu_micros": 52678, "output_level": 6, "num_output_files": 1, "total_output_size": 17876505, "num_input_records": 13810, "num_output_records": 13245, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892363673155, "job": 28, "event": "table_file_deletion", "file_number": 50} Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892363675721, 
"job": 28, "event": "table_file_deletion", "file_number": 48} Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.580544) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.675848) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.675857) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.675860) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.675863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:06:03 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:06:03.675867) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:06:04 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch Nov 23 05:06:04 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Nov 23 05:06:04 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished Nov 23 05:06:05 localhost sshd[334073]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:06:06 localhost ovn_controller[154788]: 2025-11-23T10:06:06Z|00542|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Nov 23 05:06:07 localhost 
nova_compute[281952]: 2025-11-23 10:06:07.917 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:06:07 localhost nova_compute[281952]: 2025-11-23 10:06:07.919 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:06:07 localhost nova_compute[281952]: 2025-11-23 10:06:07.920 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:06:07 localhost nova_compute[281952]: 2025-11-23 10:06:07.920 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:06:07 localhost nova_compute[281952]: 2025-11-23 10:06:07.953 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:07 localhost nova_compute[281952]: 2025-11-23 10:06:07.954 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:06:07 localhost nova_compute[281952]: 2025-11-23 10:06:07.957 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e214 e214: 6 total, 6 up, 6 in Nov 23 05:06:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:06:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:06:08 localhost ceph-mon[300199]: 
log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1601258234' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:06:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:06:08 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1601258234' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:06:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:06:09.303 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:06:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:06:09.304 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:06:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:06:09.304 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:06:10 localhost sshd[334075]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:06:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:06:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 05:06:11 localhost podman[334077]: 2025-11-23 10:06:11.040135561 +0000 UTC m=+0.087039923 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 23 05:06:11 localhost podman[334077]: 2025-11-23 10:06:11.076569368 +0000 UTC m=+0.123473770 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 23 05:06:11 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 05:06:11 localhost podman[334076]: 2025-11-23 10:06:11.081638751 +0000 UTC m=+0.132142741 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 05:06:11 localhost podman[334076]: 2025-11-23 10:06:11.165307202 +0000 UTC m=+0.215811162 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 05:06:11 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:06:11 localhost podman[240668]: time="2025-11-23T10:06:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:06:11 localhost podman[240668]: @ - - [23/Nov/2025:10:06:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 05:06:11 localhost podman[240668]: @ - - [23/Nov/2025:10:06:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18787 "" "Go-http-client/1.1" Nov 23 05:06:12 localhost nova_compute[281952]: 2025-11-23 10:06:12.984 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:06:12 localhost nova_compute[281952]: 2025-11-23 10:06:12.986 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:12 localhost nova_compute[281952]: 2025-11-23 10:06:12.986 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5029 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:06:12 localhost nova_compute[281952]: 2025-11-23 10:06:12.986 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:06:12 localhost nova_compute[281952]: 2025-11-23 10:06:12.987 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:06:12 localhost nova_compute[281952]: 2025-11-23 10:06:12.990 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:13 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:06:13 localhost sshd[334118]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:06:14 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1. Nov 23 05:06:17 localhost nova_compute[281952]: 2025-11-23 10:06:17.988 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:06:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 23 05:06:18 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1442410396' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 23 05:06:19 localhost nova_compute[281952]: 2025-11-23 10:06:19.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:06:19 localhost nova_compute[281952]: 2025-11-23 10:06:19.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 05:06:19 localhost nova_compute[281952]: 2025-11-23 10:06:19.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 05:06:19 localhost nova_compute[281952]: 2025-11-23 10:06:19.324 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 05:06:19 localhost nova_compute[281952]: 2025-11-23 10:06:19.324 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 05:06:19 localhost nova_compute[281952]: 2025-11-23 10:06:19.325 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 05:06:19 localhost nova_compute[281952]: 2025-11-23 10:06:19.325 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 05:06:19 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-710186636", "format": "json"} : dispatch Nov 23 05:06:19 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-710186636", "caps": ["mds", "allow rw path=/volumes/_nogroup/b0303be3-5e23-424d-935b-a23f10085dfe/5ae76225-892a-444c-90ca-4662bf6a5b64", "osd", "allow rw pool=manila_data namespace=fsvolumens_b0303be3-5e23-424d-935b-a23f10085dfe", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:06:19 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-710186636", "caps": ["mds", "allow rw path=/volumes/_nogroup/b0303be3-5e23-424d-935b-a23f10085dfe/5ae76225-892a-444c-90ca-4662bf6a5b64", "osd", "allow rw pool=manila_data namespace=fsvolumens_b0303be3-5e23-424d-935b-a23f10085dfe", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:06:19 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-710186636", "format": "json"} : dispatch Nov 23 05:06:19 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-710186636"} : dispatch 
Nov 23 05:06:19 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-710186636"}]': finished Nov 23 05:06:19 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e215 e215: 6 total, 6 up, 6 in Nov 23 05:06:19 localhost nova_compute[281952]: 2025-11-23 10:06:19.804 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 05:06:19 localhost nova_compute[281952]: 2025-11-23 10:06:19.826 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock 
"refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 05:06:19 localhost nova_compute[281952]: 2025-11-23 10:06:19.827 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 05:06:20 localhost nova_compute[281952]: 2025-11-23 10:06:20.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:06:20 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e216 e216: 6 total, 6 up, 6 in Nov 23 05:06:21 localhost nova_compute[281952]: 2025-11-23 10:06:21.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:06:21 localhost nova_compute[281952]: 2025-11-23 10:06:21.212 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:06:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:06:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 05:06:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:06:22 localhost podman[334122]: 2025-11-23 10:06:22.043588218 +0000 UTC m=+0.091646292 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7) Nov 23 05:06:22 localhost podman[334122]: 2025-11-23 10:06:22.057326202 +0000 UTC m=+0.105384266 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 05:06:22 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:06:22 localhost podman[334120]: 2025-11-23 10:06:22.077569542 +0000 UTC m=+0.133389259 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.license=GPLv2) Nov 23 05:06:22 localhost podman[334120]: 2025-11-23 10:06:22.11733794 +0000 UTC m=+0.173157697 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 05:06:22 localhost podman[334121]: 2025-11-23 10:06:22.133378453 +0000 UTC m=+0.185915202 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:06:22 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:06:22 localhost podman[334121]: 2025-11-23 10:06:22.142134677 +0000 UTC m=+0.194671416 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:06:22 localhost systemd[1]: 
9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:06:22 localhost nova_compute[281952]: 2025-11-23 10:06:22.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:06:22 localhost nova_compute[281952]: 2025-11-23 10:06:22.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 05:06:22 localhost nova_compute[281952]: 2025-11-23 10:06:22.991 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:06:22 localhost nova_compute[281952]: 2025-11-23 10:06:22.993 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:06:22 localhost nova_compute[281952]: 2025-11-23 10:06:22.994 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:06:22 localhost nova_compute[281952]: 2025-11-23 10:06:22.994 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:06:23 localhost nova_compute[281952]: 2025-11-23 10:06:23.025 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:23 localhost nova_compute[281952]: 2025-11-23 10:06:23.026 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE 
_transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:06:23 localhost nova_compute[281952]: 2025-11-23 10:06:23.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:06:23 localhost nova_compute[281952]: 2025-11-23 10:06:23.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:06:23 localhost nova_compute[281952]: 2025-11-23 10:06:23.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:06:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:06:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e217 e217: 6 total, 6 up, 6 in Nov 23 05:06:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e218 e218: 6 total, 6 up, 6 in Nov 23 05:06:25 localhost nova_compute[281952]: 2025-11-23 10:06:25.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:06:25 localhost nova_compute[281952]: 2025-11-23 10:06:25.231 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:06:25 localhost nova_compute[281952]: 2025-11-23 10:06:25.232 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:06:25 localhost nova_compute[281952]: 2025-11-23 10:06:25.232 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:06:25 localhost nova_compute[281952]: 2025-11-23 10:06:25.232 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 05:06:25 localhost nova_compute[281952]: 2025-11-23 10:06:25.233 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:06:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e219 e219: 6 total, 6 up, 6 in Nov 23 05:06:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:06:25 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/932944577' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:06:25 localhost nova_compute[281952]: 2025-11-23 10:06:25.729 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:06:25 localhost nova_compute[281952]: 2025-11-23 10:06:25.812 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:06:25 localhost nova_compute[281952]: 2025-11-23 10:06:25.812 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:06:25 localhost nova_compute[281952]: 2025-11-23 10:06:25.987 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 05:06:25 localhost nova_compute[281952]: 2025-11-23 10:06:25.988 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11104MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 05:06:25 localhost nova_compute[281952]: 2025-11-23 10:06:25.989 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:06:25 localhost nova_compute[281952]: 2025-11-23 10:06:25.989 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:06:26 localhost nova_compute[281952]: 2025-11-23 10:06:26.063 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 05:06:26 localhost nova_compute[281952]: 2025-11-23 10:06:26.064 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 05:06:26 localhost nova_compute[281952]: 2025-11-23 10:06:26.064 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 05:06:26 localhost nova_compute[281952]: 2025-11-23 10:06:26.114 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:06:26 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:06:26 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/2425650944' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:06:26 localhost nova_compute[281952]: 2025-11-23 10:06:26.567 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:06:26 localhost nova_compute[281952]: 2025-11-23 10:06:26.574 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 05:06:26 localhost nova_compute[281952]: 2025-11-23 10:06:26.589 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 05:06:26 localhost nova_compute[281952]: 2025-11-23 10:06:26.592 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 05:06:26 localhost nova_compute[281952]: 2025-11-23 10:06:26.592 281956 DEBUG 
oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:06:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e220 e220: 6 total, 6 up, 6 in Nov 23 05:06:28 localhost nova_compute[281952]: 2025-11-23 10:06:28.027 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:06:28 localhost nova_compute[281952]: 2025-11-23 10:06:28.029 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:06:28 localhost nova_compute[281952]: 2025-11-23 10:06:28.029 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:06:28 localhost nova_compute[281952]: 2025-11-23 10:06:28.029 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:06:28 localhost nova_compute[281952]: 2025-11-23 10:06:28.058 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:28 localhost nova_compute[281952]: 2025-11-23 10:06:28.059 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:06:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:06:28 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1242299826' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:06:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:06:28 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1242299826' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:06:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:06:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e221 e221: 6 total, 6 up, 6 in Nov 23 05:06:29 localhost ovn_metadata_agent[160434]: 2025-11-23 10:06:29.801 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:06:29 localhost ovn_metadata_agent[160434]: 2025-11-23 10:06:29.802 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 05:06:29 localhost nova_compute[281952]: 2025-11-23 10:06:29.802 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:29 localhost 
openstack_network_exporter[242668]: ERROR 10:06:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:06:29 localhost openstack_network_exporter[242668]: ERROR 10:06:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:06:29 localhost openstack_network_exporter[242668]: ERROR 10:06:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:06:29 localhost openstack_network_exporter[242668]: ERROR 10:06:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:06:29 localhost openstack_network_exporter[242668]: Nov 23 05:06:29 localhost openstack_network_exporter[242668]: ERROR 10:06:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:06:29 localhost openstack_network_exporter[242668]: Nov 23 05:06:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:06:30.152 263258 INFO neutron.agent.linux.ip_lib [None req-9b2a92cf-79b1-4b60-8412-cb888fcbdc3e - - - - - -] Device tap6d2dff40-04 cannot be used as it has no MAC address#033[00m Nov 23 05:06:30 localhost nova_compute[281952]: 2025-11-23 10:06:30.173 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:30 localhost kernel: device tap6d2dff40-04 entered promiscuous mode Nov 23 05:06:30 localhost NetworkManager[5975]: [1763892390.1823] manager: (tap6d2dff40-04): new Generic device (/org/freedesktop/NetworkManager/Devices/86) Nov 23 05:06:30 localhost nova_compute[281952]: 2025-11-23 10:06:30.181 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:30 localhost ovn_controller[154788]: 2025-11-23T10:06:30Z|00543|binding|INFO|Claiming lport 6d2dff40-048d-4175-8e87-c4c88e21141f for 
this chassis. Nov 23 05:06:30 localhost ovn_controller[154788]: 2025-11-23T10:06:30Z|00544|binding|INFO|6d2dff40-048d-4175-8e87-c4c88e21141f: Claiming unknown Nov 23 05:06:30 localhost systemd-udevd[334236]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:06:30 localhost ovn_metadata_agent[160434]: 2025-11-23 10:06:30.193 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-1ebb6643-dd69-425a-84e7-f74c46a69f9f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ebb6643-dd69-425a-84e7-f74c46a69f9f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33c4eecf43aa413a9f282206f9e9a55b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6e2548aa-bcaa-4071-886c-d49df70c86b7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6d2dff40-048d-4175-8e87-c4c88e21141f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:06:30 localhost ovn_metadata_agent[160434]: 2025-11-23 10:06:30.195 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 6d2dff40-048d-4175-8e87-c4c88e21141f in datapath 1ebb6643-dd69-425a-84e7-f74c46a69f9f bound to our chassis#033[00m Nov 23 05:06:30 localhost 
ovn_metadata_agent[160434]: 2025-11-23 10:06:30.197 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 7ef8df20-b914-4e91-b454-846b497693d6 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:06:30 localhost ovn_metadata_agent[160434]: 2025-11-23 10:06:30.197 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1ebb6643-dd69-425a-84e7-f74c46a69f9f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:06:30 localhost ovn_metadata_agent[160434]: 2025-11-23 10:06:30.198 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[939ffe9c-a6de-49d8-82e1-ea2e3976c71f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:06:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:06:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:06:30 localhost journal[230249]: ethtool ioctl error on tap6d2dff40-04: No such device Nov 23 05:06:30 localhost ovn_controller[154788]: 2025-11-23T10:06:30Z|00545|binding|INFO|Setting lport 6d2dff40-048d-4175-8e87-c4c88e21141f ovn-installed in OVS Nov 23 05:06:30 localhost ovn_controller[154788]: 2025-11-23T10:06:30Z|00546|binding|INFO|Setting lport 6d2dff40-048d-4175-8e87-c4c88e21141f up in Southbound Nov 23 05:06:30 localhost nova_compute[281952]: 2025-11-23 10:06:30.229 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:30 localhost journal[230249]: ethtool ioctl error on tap6d2dff40-04: No such device Nov 23 05:06:30 localhost journal[230249]: ethtool ioctl error on tap6d2dff40-04: No such device Nov 23 05:06:30 localhost journal[230249]: ethtool ioctl error on tap6d2dff40-04: No such device Nov 23 05:06:30 localhost journal[230249]: ethtool ioctl error on tap6d2dff40-04: No such device Nov 23 05:06:30 localhost journal[230249]: ethtool ioctl error on tap6d2dff40-04: No such device Nov 23 05:06:30 localhost journal[230249]: ethtool ioctl error on tap6d2dff40-04: No such device Nov 23 05:06:30 localhost journal[230249]: ethtool ioctl error on tap6d2dff40-04: No such device Nov 23 05:06:30 localhost nova_compute[281952]: 2025-11-23 10:06:30.273 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:30 localhost nova_compute[281952]: 2025-11-23 10:06:30.301 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:30 localhost systemd[1]: tmp-crun.Cy72aH.mount: Deactivated successfully. 
Nov 23 05:06:30 localhost podman[334238]: 2025-11-23 10:06:30.323738333 +0000 UTC m=+0.109886941 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 05:06:30 localhost podman[334239]: 2025-11-23 10:06:30.356117518 +0000 UTC m=+0.139627057 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Nov 23 05:06:30 localhost podman[334238]: 2025-11-23 10:06:30.389311719 +0000 UTC m=+0.175460377 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 05:06:30 localhost podman[334239]: 2025-11-23 10:06:30.392500194 +0000 UTC m=+0.176009733 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 05:06:30 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 05:06:30 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully.
Nov 23 05:06:30 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e222 e222: 6 total, 6 up, 6 in
Nov 23 05:06:31 localhost podman[334349]:
Nov 23 05:06:31 localhost podman[334349]: 2025-11-23 10:06:31.176514771 +0000 UTC m=+0.079547188 container create 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 23 05:06:31 localhost systemd[1]: Started libpod-conmon-3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5.scope.
Nov 23 05:06:31 localhost podman[334349]: 2025-11-23 10:06:31.134959759 +0000 UTC m=+0.037992206 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 23 05:06:31 localhost systemd[1]: Started libcrun container.
Nov 23 05:06:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3bbb7d3b4428d8f687dacc74829ec1d2b43a77a0b9743c4105e027e24960cc16/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 05:06:31 localhost podman[334349]: 2025-11-23 10:06:31.258043327 +0000 UTC m=+0.161075744 container init 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 05:06:31 localhost podman[334349]: 2025-11-23 10:06:31.265627906 +0000 UTC m=+0.168660323 container start 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 23 05:06:31 localhost dnsmasq[334367]: started, version 2.85 cachesize 150
Nov 23 05:06:31 localhost dnsmasq[334367]: DNS service limited to local subnets
Nov 23 05:06:31 localhost dnsmasq[334367]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 23 05:06:31 localhost dnsmasq[334367]: warning: no upstream servers configured
Nov 23 05:06:31 localhost dnsmasq-dhcp[334367]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 23 05:06:31 localhost dnsmasq[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/addn_hosts - 0 addresses
Nov 23 05:06:31 localhost dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/host
Nov 23 05:06:31 localhost dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/opts
Nov 23 05:06:31 localhost nova_compute[281952]: 2025-11-23 10:06:31.349 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:06:31 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:06:31.513 263258 INFO neutron.agent.dhcp.agent [None req-76e000ab-2e09-48cd-82d9-c82e540c898d - - - - - -] DHCP configuration for ports {'38e3101c-d0ed-4881-8ac3-7f252b0ff879'} is completed#033[00m
Nov 23 05:06:31 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e223 e223: 6 total, 6 up, 6 in
Nov 23 05:06:31 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:06:31.952 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:06:31Z, description=, device_id=48f6cfc5-8b79-494b-95ec-92da5372a95b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ceb30ad9-170b-4e72-94b3-c6a64398b0a8, ip_allocation=immediate, mac_address=fa:16:3e:a7:c9:b7, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:06:27Z, description=, dns_domain=, id=1ebb6643-dd69-425a-84e7-f74c46a69f9f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1404220004-network, port_security_enabled=True, project_id=33c4eecf43aa413a9f282206f9e9a55b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=22666, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3191, status=ACTIVE, subnets=['eee79f80-643c-4af9-96bb-e3244dbe6f93'], tags=[], tenant_id=33c4eecf43aa413a9f282206f9e9a55b, updated_at=2025-11-23T10:06:28Z, vlan_transparent=None, network_id=1ebb6643-dd69-425a-84e7-f74c46a69f9f, port_security_enabled=False, project_id=33c4eecf43aa413a9f282206f9e9a55b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3221, status=DOWN, tags=[], tenant_id=33c4eecf43aa413a9f282206f9e9a55b, updated_at=2025-11-23T10:06:31Z on network 1ebb6643-dd69-425a-84e7-f74c46a69f9f#033[00m
Nov 23 05:06:32 localhost dnsmasq[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/addn_hosts - 1 addresses
Nov 23 05:06:32 localhost dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/host
Nov 23 05:06:32 localhost dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/opts
Nov 23 05:06:32 localhost podman[334385]: 2025-11-23 10:06:32.166010728 +0000 UTC m=+0.056546054 container kill 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 05:06:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:06:32.449 263258 INFO neutron.agent.dhcp.agent [None req-4223f2e0-8720-4d35-b49f-5ca212290d91 - - - - - -] DHCP configuration for ports {'ceb30ad9-170b-4e72-94b3-c6a64398b0a8'} is completed#033[00m
Nov 23 05:06:32 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:06:32.558 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:06:31Z, description=, device_id=48f6cfc5-8b79-494b-95ec-92da5372a95b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ceb30ad9-170b-4e72-94b3-c6a64398b0a8, ip_allocation=immediate, mac_address=fa:16:3e:a7:c9:b7, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:06:27Z, description=, dns_domain=, id=1ebb6643-dd69-425a-84e7-f74c46a69f9f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1404220004-network, port_security_enabled=True, project_id=33c4eecf43aa413a9f282206f9e9a55b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=22666, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3191, status=ACTIVE, subnets=['eee79f80-643c-4af9-96bb-e3244dbe6f93'], tags=[], tenant_id=33c4eecf43aa413a9f282206f9e9a55b, updated_at=2025-11-23T10:06:28Z, vlan_transparent=None, network_id=1ebb6643-dd69-425a-84e7-f74c46a69f9f, port_security_enabled=False, project_id=33c4eecf43aa413a9f282206f9e9a55b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3221, status=DOWN, tags=[], tenant_id=33c4eecf43aa413a9f282206f9e9a55b, updated_at=2025-11-23T10:06:31Z on network 1ebb6643-dd69-425a-84e7-f74c46a69f9f#033[00m
Nov 23 05:06:32 localhost dnsmasq[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/addn_hosts - 1 addresses
Nov 23 05:06:32 localhost dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/host
Nov 23 05:06:32 localhost podman[334422]: 2025-11-23 10:06:32.769201767 +0000 UTC m=+0.063526173 container kill 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 05:06:32 localhost dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/opts
Nov 23 05:06:33 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:06:33.001 263258 INFO neutron.agent.dhcp.agent [None req-f1c471c9-e3d3-465f-be83-59dc2ba0ec87 - - - - - -] DHCP configuration for ports {'ceb30ad9-170b-4e72-94b3-c6a64398b0a8'} is completed#033[00m
Nov 23 05:06:33 localhost nova_compute[281952]: 2025-11-23 10:06:33.085 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:06:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e224 e224: 6 total, 6 up, 6 in
Nov 23 05:06:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e224 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 05:06:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e225 e225: 6 total, 6 up, 6 in
Nov 23 05:06:34 localhost nova_compute[281952]: 2025-11-23 10:06:34.547 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:06:35 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e226 e226: 6 total, 6 up, 6 in
Nov 23 05:06:35 localhost ovn_metadata_agent[160434]: 2025-11-23 10:06:35.804 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 05:06:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e227 e227: 6 total, 6 up, 6 in
Nov 23 05:06:38 localhost nova_compute[281952]: 2025-11-23 10:06:38.126 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:06:38 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e228 e228: 6 total, 6 up, 6 in
Nov 23 05:06:38 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e228 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 05:06:38 localhost nova_compute[281952]: 2025-11-23 10:06:38.695 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:06:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e229 e229: 6 total, 6 up, 6 in
Nov 23 05:06:40 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e230 e230: 6 total, 6 up, 6 in
Nov 23 05:06:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 05:06:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 05:06:41 localhost podman[334444]: 2025-11-23 10:06:41.783115501 +0000 UTC m=+0.087585629 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 23 05:06:41 localhost podman[334444]: 2025-11-23 10:06:41.795080401 +0000 UTC m=+0.099550539 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 23 05:06:41 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 05:06:41 localhost podman[240668]: time="2025-11-23T10:06:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 05:06:41 localhost podman[334445]: 2025-11-23 10:06:41.905001523 +0000 UTC m=+0.208267655 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 05:06:41 localhost podman[240668]: @ - - [23/Nov/2025:10:06:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1"
Nov 23 05:06:42 localhost podman[334445]: 2025-11-23 10:06:42.037729031 +0000 UTC m=+0.340995193 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 05:06:42 localhost podman[240668]: @ - - [23/Nov/2025:10:06:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19263 "" "Go-http-client/1.1"
Nov 23 05:06:42 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully.
Nov 23 05:06:43 localhost nova_compute[281952]: 2025-11-23 10:06:43.162 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:06:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e231 e231: 6 total, 6 up, 6 in
Nov 23 05:06:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 05:06:45 localhost sshd[334501]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 05:06:46 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 05:06:46 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 05:06:46 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 05:06:46 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 05:06:46 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 05:06:46 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 05:06:47 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 23 05:06:47 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 05:06:48 localhost ovn_controller[154788]: 2025-11-23T10:06:48Z|00547|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0)
Nov 23 05:06:48 localhost nova_compute[281952]: 2025-11-23 10:06:48.107 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:06:48 localhost nova_compute[281952]: 2025-11-23 10:06:48.165 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:06:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e232 e232: 6 total, 6 up, 6 in
Nov 23 05:06:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 05:06:49 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy'
Nov 23 05:06:50 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e233 e233: 6 total, 6 up, 6 in
Nov 23 05:06:51 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e234 e234: 6 total, 6 up, 6 in
Nov 23 05:06:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.
Nov 23 05:06:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.
Nov 23 05:06:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.
Nov 23 05:06:53 localhost systemd[1]: tmp-crun.VilIgc.mount: Deactivated successfully.
Nov 23 05:06:53 localhost podman[334634]: 2025-11-23 10:06:53.06781045 +0000 UTC m=+0.104726835 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible)
Nov 23 05:06:53 localhost podman[334633]: 2025-11-23 10:06:53.086319718 +0000 UTC m=+0.128012167 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 23 05:06:53 localhost podman[334634]: 2025-11-23 10:06:53.106329471 +0000 UTC m=+0.143245826 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 23 05:06:53 localhost podman[334633]: 2025-11-23 10:06:53.119154868 +0000 UTC m=+0.160847307 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro',
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2) Nov 23 05:06:53 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:06:53 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:06:53 localhost nova_compute[281952]: 2025-11-23 10:06:53.166 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:06:53 localhost nova_compute[281952]: 2025-11-23 10:06:53.168 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:06:53 localhost nova_compute[281952]: 2025-11-23 10:06:53.168 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:06:53 localhost nova_compute[281952]: 2025-11-23 10:06:53.169 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:06:53 localhost nova_compute[281952]: 2025-11-23 10:06:53.208 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:53 localhost nova_compute[281952]: 2025-11-23 10:06:53.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:06:53 localhost podman[334632]: 2025-11-23 10:06:53.213988823 +0000 UTC m=+0.256881479 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3) Nov 23 05:06:53 localhost podman[334632]: 2025-11-23 10:06:53.25931939 +0000 UTC m=+0.302212005 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118) Nov 23 05:06:53 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:06:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e235 e235: 6 total, 6 up, 6 in Nov 23 05:06:53 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:06:53 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:06:53 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:06:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:06:54 localhost systemd[1]: tmp-crun.3H9WOk.mount: Deactivated successfully. 
Nov 23 05:06:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e236 e236: 6 total, 6 up, 6 in Nov 23 05:06:55 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e237 e237: 6 total, 6 up, 6 in Nov 23 05:06:57 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:06:57 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 23 05:06:57 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 23 05:06:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e238 e238: 6 total, 6 up, 6 in Nov 23 05:06:58 localhost nova_compute[281952]: 2025-11-23 10:06:58.210 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:06:58 localhost nova_compute[281952]: 2025-11-23 10:06:58.212 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:06:58 localhost nova_compute[281952]: 2025-11-23 10:06:58.213 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:06:58 localhost nova_compute[281952]: 2025-11-23 10:06:58.213 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:06:58 localhost nova_compute[281952]: 2025-11-23 10:06:58.250 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:06:58 
localhost nova_compute[281952]: 2025-11-23 10:06:58.251 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:06:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:06:59 localhost openstack_network_exporter[242668]: ERROR 10:06:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:06:59 localhost openstack_network_exporter[242668]: ERROR 10:06:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:06:59 localhost openstack_network_exporter[242668]: ERROR 10:06:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:06:59 localhost openstack_network_exporter[242668]: ERROR 10:06:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:06:59 localhost openstack_network_exporter[242668]: Nov 23 05:06:59 localhost openstack_network_exporter[242668]: ERROR 10:06:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:06:59 localhost openstack_network_exporter[242668]: Nov 23 05:07:00 localhost sshd[334694]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:07:00 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:07:00 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data 
namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:07:00 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:07:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:07:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 05:07:01 localhost podman[334696]: 2025-11-23 10:07:01.03739777 +0000 UTC m=+0.088435415 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118) Nov 23 05:07:01 localhost podman[334696]: 2025-11-23 10:07:01.054423093 +0000 UTC m=+0.105460728 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 05:07:01 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 05:07:01 localhost podman[334697]: 2025-11-23 10:07:01.138194166 +0000 UTC m=+0.186787988 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': 
'/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 05:07:01 localhost podman[334697]: 2025-11-23 10:07:01.152502677 +0000 UTC m=+0.201096519 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 05:07:01 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 05:07:02 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:07:02 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2643180548' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:07:02 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:07:02 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2643180548' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:07:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e239 e239: 6 total, 6 up, 6 in Nov 23 05:07:03 localhost nova_compute[281952]: 2025-11-23 10:07:03.252 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:07:03 localhost nova_compute[281952]: 2025-11-23 10:07:03.254 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:07:03 localhost nova_compute[281952]: 2025-11-23 10:07:03.254 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:07:03 localhost nova_compute[281952]: 2025-11-23 10:07:03.255 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:07:03 localhost nova_compute[281952]: 2025-11-23 10:07:03.291 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:03 localhost nova_compute[281952]: 2025-11-23 
10:07:03.292 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:07:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:07:04 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:07:04 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 23 05:07:04 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 23 05:07:06 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e240 e240: 6 total, 6 up, 6 in Nov 23 05:07:07 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e241 e241: 6 total, 6 up, 6 in Nov 23 05:07:07 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:07:07 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:07:07 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", 
"allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0. Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.191681) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52 Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892428191732, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 1726, "num_deletes": 271, "total_data_size": 2337193, "memory_usage": 2372272, "flush_reason": "Manual Compaction"} Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892428201419, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 1529660, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29861, "largest_seqno": 31582, "table_properties": {"data_size": 1522091, "index_size": 4398, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 18108, "raw_average_key_size": 21, "raw_value_size": 1506165, "raw_average_value_size": 1821, "num_data_blocks": 185, "num_entries": 827, "num_filter_entries": 827, "num_deletions": 271, 
"num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892363, "oldest_key_time": 1763892363, "file_creation_time": 1763892428, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}} Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 9787 microseconds, and 4425 cpu microseconds. Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.201467) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 1529660 bytes OK Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.201489) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.203335) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.203356) EVENT_LOG_v1 {"time_micros": 1763892428203349, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.203375) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 2328704, prev total WAL file size 2329028, number of live WAL files 2. Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.204177) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323736' seq:72057594037927935, type:22 .. 
'6C6F676D0034353330' seq:0, type:0; will stop at (end) Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(1493KB)], [51(17MB)] Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892428204363, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 19406165, "oldest_snapshot_seqno": -1} Nov 23 05:07:08 localhost nova_compute[281952]: 2025-11-23 10:07:08.293 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:07:08 localhost nova_compute[281952]: 2025-11-23 10:07:08.297 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:07:08 localhost nova_compute[281952]: 2025-11-23 10:07:08.297 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:07:08 localhost nova_compute[281952]: 2025-11-23 10:07:08.297 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 13509 keys, 18794195 bytes, temperature: kUnknown Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892428316716, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 18794195, "file_checksum": "", 
"file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18715615, "index_size": 43727, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33797, "raw_key_size": 364104, "raw_average_key_size": 26, "raw_value_size": 18483977, "raw_average_value_size": 1368, "num_data_blocks": 1631, "num_entries": 13509, "num_filter_entries": 13509, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892428, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}} Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.317096) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 18794195 bytes Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.318665) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 172.6 rd, 167.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 17.0 +0.0 blob) out(17.9 +0.0 blob), read-write-amplify(25.0) write-amplify(12.3) OK, records in: 14072, records dropped: 563 output_compression: NoCompression Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.318699) EVENT_LOG_v1 {"time_micros": 1763892428318684, "job": 30, "event": "compaction_finished", "compaction_time_micros": 112423, "compaction_time_cpu_micros": 54497, "output_level": 6, "num_output_files": 1, "total_output_size": 18794195, "num_input_records": 14072, "num_output_records": 13509, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892428319080, "job": 30, "event": "table_file_deletion", "file_number": 53} Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892428322152, 
"job": 30, "event": "table_file_deletion", "file_number": 51} Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.204046) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.322256) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.322265) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.322268) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.322271) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:07:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:07:08.322274) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:07:08 localhost nova_compute[281952]: 2025-11-23 10:07:08.325 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:08 localhost nova_compute[281952]: 2025-11-23 10:07:08.325 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:07:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:07:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:09.304 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:07:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:09.305 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:07:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:09.305 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:07:10 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:07:10 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 23 05:07:10 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.809 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 
'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.811 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.834 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 18510000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '702ebf00-67e7-4667-b41c-b1bfe98b23c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18510000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:07:10.811212', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '2d1fc278-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.011661606, 'message_signature': '6fd29692fa780f32ad13bc269a42be5b9cc44c12359b2f3af77a261b399354cf'}]}, 'timestamp': '2025-11-23 10:07:10.835244', '_unique_id': 'ea68ad94b3964feebe712461762ee582'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.836 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.838 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '910cdb68-81be-4af0-985c-3547252c47e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.838151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d24e24e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': 'cc82390c85ff8115669df78cc4a28826c0ef0f1ebb9aca20e7c8a57bcb0cbec8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.838151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d24f69e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': '3a6e80cf3caec83700c8307a71372773639739e60455c6a7f3a26fc06e051733'}]}, 'timestamp': '2025-11-23 10:07:10.869299', '_unique_id': '18ef625ddb704309b7681e70f1991b15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.870 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.872 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.872 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1944a8c-7b1c-4ebb-ba6f-735d2c71c82b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.872469', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d2587ee-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': 'c2c5533fb367025416b885fce924189890239146f91dd77344ac75081f97fa2d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.872469', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d25a38c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': '9ea18120944125e828210bf579eb1f6cd4e912697c01b766c1abce5361422e7b'}]}, 'timestamp': '2025-11-23 10:07:10.873808', '_unique_id': '348a85024f1e4d87bc46d6a5c5cbc952'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.874 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification 
to notifications. Payload={'message_id': '37713d6b-58c9-4d97-bd7b-e507c0bf326b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.876230', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2699b8-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': 'd4999ae9c3a1b73ebfd803a147b86f813f0b0f3872cfed562f94573231cfe609'}]}, 'timestamp': '2025-11-23 10:07:10.880053', '_unique_id': '89407cbdff294aa196340eefc534feea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:07:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.880 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.882 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.882 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.882 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd659421a-2eb8-4054-9bf0-3634abc4db0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.882544', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d270e84-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': '326568e96ba3ddf239dcfbe78c573fb81bd5d76b7ad564eefdd255d7a927a507'}]}, 'timestamp': '2025-11-23 10:07:10.883095', '_unique_id': '6f041623dfcc4d9fb6536a1e987d86a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:07:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.884 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.885 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '836d5f87-8bfd-478d-84e0-cbbcd4245b4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.885342', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d277d38-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': 'cca6dc8c1f1bfac62f7089d95e65a3c6ad62102fb288c1cf758f7e80b0c870b9'}]}, 'timestamp': '2025-11-23 10:07:10.885869', '_unique_id': '79938a5c2d584720a8b6fc6d3cb7d3ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 
05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.886 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.888 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.888 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.888 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fdb3af6b-e771-483e-89b9-c910da77379c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.888219', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d27ee12-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': 'c29253a0a1f37f78ab961042fd733126f6a9186cea07ede5fd719cfd433492ef'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.888219', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d2800f0-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': 'd8ffde8b82395dc1efc63967d1772b4da0e47a7e8491ceaa9550a345f412fcf9'}]}, 'timestamp': '2025-11-23 10:07:10.889205', '_unique_id': '137489f9b15244da902aa3d1dd066cfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.890 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.891 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.891 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.891 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.892 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff88234c-74c7-44d4-978d-ab8a1e9488c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.891863', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d287c1a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': '1803a46c35ce7cd8167258a6cf0160675df80da67db7dac6fe6f64104a768bf4'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.891863', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d28938a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': '17328311758274f0ccb1c7d24335d3e50b7325d7709fc94105e340dd03bb2429'}]}, 'timestamp': '2025-11-23 10:07:10.893048', '_unique_id': '37531b99d83043f1b8f7389d63db4211'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.894 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.895 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.908 12 DEBUG ceilometer.compute.pollsters [-] 
355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.908 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1d87fde-9430-498c-978e-8151fd66e6c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.895318', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 
'2d2af40e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.072954702, 'message_signature': '799dcfc4040890a05d24f9575006b444c9dd61b51988e5bf69565a98062807a8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.895318', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d2b0fe8-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.072954702, 'message_signature': 'f2a5484b774d529d25085012e6b8ccf0e6f7cb9e8f54fab1c1f62e0d541cd778'}]}, 'timestamp': '2025-11-23 10:07:10.909258', '_unique_id': '58a340c9dbb247c4a4c8218410ca8aa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 
05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.910 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 05:07:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.911 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8cdfebb-0aae-47cc-bbd4-7165dfa454c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.911755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d2b8446-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': '8cd5327536e1cdbb24e1fc94e11432c19ece66307cb0fbf0ede1700037f1b349'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.911755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d2b9490-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': 'c868d0227d31e5c8230f5110a5ff712cb0fd210820c9a7a9036b2b11481636ae'}]}, 'timestamp': '2025-11-23 10:07:10.912677', '_unique_id': 'fa06a6803f9244699b00e94c6036f89a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:07:10.913 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.913 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.914 12 INFO 
ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.914 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1abd3eab-8554-4d28-9aa5-aef99a232b95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.914867', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d2bfdea-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': '0baa8921a4a50c16ea1f131a4606edbe7a045d8c6a924da7caeddc66b1330ef9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.914867', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d2c0dee-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.015790251, 'message_signature': 'f869ccb597f5a9e81187f5245150f89677356378882cba1c65b8629e29f1e961'}]}, 'timestamp': '2025-11-23 10:07:10.915732', '_unique_id': '8df1a57963584ff195d243ecbc82a7b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most 
recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging Nov 
23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.916 12 ERROR oslo_messaging.notify.messaging Nov 23 
05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.918 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c246764-65c4-4ce8-80ae-26d6346bb8ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.918274', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 
'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2c82b0-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': '15ae8ca5a53c61d8eb179e5ddcfdd78c3b511287e966e69629ab75bf1b1144f2'}]}, 'timestamp': '2025-11-23 10:07:10.918752', '_unique_id': '5235d88f1b794a0d951a9a349a693b04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:07:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.919 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bd6003a-fde6-416b-968e-b96587292186', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.921574', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2d03f2-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': '665fa84ff51cf2059a6eee7e698bfb2ef370dc5986fad9587a48f5901655563e'}]}, 'timestamp': '2025-11-23 10:07:10.922162', '_unique_id': 'c31fbbb92ebd40279b8188f6c3c78dd8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.922 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c62e230c-dc70-4b79-8952-f057f0a94d62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.923779', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2d5834-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': '6724ef73740848bcdd8172b20eb246307ff130ee0f823ed2f1957d636c9ea65a'}]}, 'timestamp': '2025-11-23 10:07:10.924180', '_unique_id': 'aa2541e71c20446f9ad3d8e4ccebe222'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.924 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.925 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d25186a-47d9-4d71-a994-9eba4d134104', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.925635', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2d9e3e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': 'cdcbe87e64549613eaf91b45a41fc11a8bb15f3be3d8486055d70cbdb549c891'}]}, 'timestamp': '2025-11-23 10:07:10.925945', '_unique_id': '7ec1f9eaa1ca4173b6e39453f6410da2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:07:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.927 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.927 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15f88fea-9dfb-4381-a82a-5413fa23e2be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.927450', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d2de51a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.072954702, 'message_signature': '49a6b9ce1a787d9ae5dc5f945d061bf25ace37eb456d58eb2b9b1b006471e349'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.927450', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d2def2e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.072954702, 'message_signature': '3dfcdbcaf280193bf5a1a6ceb0a8f80fc6602c59ea9992fd82989600efdad4b9'}]}, 'timestamp': '2025-11-23 10:07:10.928001', '_unique_id': '9dfcce2296024a72905adb94bfd2df40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.928 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb7e998d-1fdc-429d-b548-914528c9bd59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.929367', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2e2fde-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': 'c6d9cec669f13139885dd38d49866edbad00b5f915fb3721a3bd5317f28ea52f'}]}, 'timestamp': '2025-11-23 10:07:10.929672', '_unique_id': 'd83d542fcac046299064dbc0fa5369f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.930 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7dc709d0-d6f1-49ec-a733-9069dd02599e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.931135', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2e7638-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': 'bc2c59b51f11d7c7cdad482c996cbcad033530d6f44f65d54bc5343ac6780487'}]}, 'timestamp': '2025-11-23 10:07:10.931467', '_unique_id': 'd44260105f9949069dae2d64690f3ffe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.932 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e321060-4e55-4221-8e42-cb2211aacbcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:07:10.933011', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '2d2ebe40-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.053873827, 'message_signature': '4c735549a849b89e608c274814dcd845eb397d2dde795a09f6a66b388366304a'}]}, 'timestamp': '2025-11-23 10:07:10.933297', '_unique_id': '0a32465bec49487388b1b146d9cd8e6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.933 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.935 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8dae0b80-dec5-4c6b-b326-afc1d0d2f4b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:07:10.934770', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2d2f03aa-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.072954702, 'message_signature': '60c37a78afba2c2267fc8858a21a81c8408ea2dd991e983d8f13322a8829cbd5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:07:10.934770', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2d2f0db4-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.072954702, 'message_signature': '95d6cbae914e000e25b46ea751a6d73fa3f3346a3f3408eb8320c901948227a2'}]}, 'timestamp': '2025-11-23 10:07:10.935361', '_unique_id': 'fdb7e5b9cd9546c38c75bb6392c90255'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:07:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:07:10.936 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.936 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '629130f0-a4a2-4821-9179-9b552bbbbac7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:07:10.936853', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '2d2f5562-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12374.011661606, 'message_signature': '9b9aa5fe46d45c3dcdea27c85801d419ed3d01ec5793793ce52bd67874add10e'}]}, 'timestamp': '2025-11-23 10:07:10.937156', '_unique_id': '1a2bbfe107544d51ae8ef7281a8e3e98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 
05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:07:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:07:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:07:10.937 12 ERROR oslo_messaging.notify.messaging Nov 23 05:07:11 localhost podman[240668]: time="2025-11-23T10:07:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:07:11 localhost podman[240668]: @ - - 
[23/Nov/2025:10:07:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1" Nov 23 05:07:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:07:11 localhost podman[240668]: @ - - [23/Nov/2025:10:07:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19274 "" "Go-http-client/1.1" Nov 23 05:07:12 localhost systemd[1]: tmp-crun.UmYHJv.mount: Deactivated successfully. Nov 23 05:07:12 localhost podman[334738]: 2025-11-23 10:07:12.039869767 +0000 UTC m=+0.095380964 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 05:07:12 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:07:12 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/217994253' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:07:12 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:07:12 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/217994253' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:07:12 localhost podman[334738]: 2025-11-23 10:07:12.077421808 +0000 UTC m=+0.132932965 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 05:07:12 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:07:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 05:07:12 localhost podman[334759]: 2025-11-23 10:07:12.196439853 +0000 UTC m=+0.080046362 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:07:12 localhost podman[334759]: 2025-11-23 10:07:12.232809749 +0000 UTC m=+0.116416278 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm) Nov 23 05:07:12 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 05:07:13 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e242 e242: 6 total, 6 up, 6 in Nov 23 05:07:13 localhost nova_compute[281952]: 2025-11-23 10:07:13.326 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:07:13 localhost nova_compute[281952]: 2025-11-23 10:07:13.329 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:07:13 localhost nova_compute[281952]: 2025-11-23 10:07:13.329 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:07:13 localhost nova_compute[281952]: 2025-11-23 10:07:13.329 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:07:13 localhost nova_compute[281952]: 2025-11-23 10:07:13.350 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:13 localhost nova_compute[281952]: 2025-11-23 10:07:13.351 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:07:13 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:07:14 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:07:14 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": 
"client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:07:14 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:07:17 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:07:17 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 23 05:07:17 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 23 05:07:18 localhost nova_compute[281952]: 2025-11-23 10:07:18.352 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:07:18 localhost nova_compute[281952]: 2025-11-23 10:07:18.353 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:07:18 localhost nova_compute[281952]: 2025-11-23 10:07:18.354 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:07:18 
localhost nova_compute[281952]: 2025-11-23 10:07:18.354 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:07:18 localhost nova_compute[281952]: 2025-11-23 10:07:18.383 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:18 localhost nova_compute[281952]: 2025-11-23 10:07:18.384 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:07:18 localhost nova_compute[281952]: 2025-11-23 10:07:18.467 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:19 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:07:20 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:07:20 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:07:20 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:07:20 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e243 e243: 6 total, 6 up, 6 in Nov 23 05:07:21 localhost nova_compute[281952]: 2025-11-23 10:07:21.265 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:21 localhost nova_compute[281952]: 2025-11-23 10:07:21.594 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:07:21 localhost nova_compute[281952]: 2025-11-23 10:07:21.594 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 05:07:21 localhost nova_compute[281952]: 2025-11-23 10:07:21.595 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 05:07:21 localhost nova_compute[281952]: 2025-11-23 10:07:21.669 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 05:07:21 localhost nova_compute[281952]: 2025-11-23 10:07:21.670 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock 
"refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 05:07:21 localhost nova_compute[281952]: 2025-11-23 10:07:21.670 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 05:07:21 localhost nova_compute[281952]: 2025-11-23 10:07:21.671 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 05:07:22 localhost nova_compute[281952]: 2025-11-23 10:07:22.131 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": 
"d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 05:07:22 localhost nova_compute[281952]: 2025-11-23 10:07:22.149 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 05:07:22 localhost nova_compute[281952]: 2025-11-23 10:07:22.150 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 05:07:22 localhost nova_compute[281952]: 2025-11-23 10:07:22.151 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:07:22 localhost nova_compute[281952]: 2025-11-23 10:07:22.151 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:07:22 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e244 e244: 6 total, 6 up, 6 in Nov 23 05:07:23 localhost nova_compute[281952]: 2025-11-23 10:07:23.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:07:23 localhost nova_compute[281952]: 2025-11-23 10:07:23.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:07:23 localhost nova_compute[281952]: 2025-11-23 10:07:23.426 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:07:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:07:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 05:07:24 localhost podman[334781]: 2025-11-23 10:07:24.049105489 +0000 UTC m=+0.095504229 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., architecture=x86_64) Nov 23 05:07:24 localhost podman[334779]: 2025-11-23 10:07:24.09631773 +0000 UTC m=+0.146746581 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true) Nov 23 05:07:24 localhost podman[334779]: 2025-11-23 10:07:24.144353018 +0000 UTC m=+0.194781929 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:07:24 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:07:24 localhost podman[334781]: 2025-11-23 10:07:24.166727422 +0000 UTC m=+0.213126192 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red 
Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 05:07:24 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:07:24 localhost podman[334780]: 2025-11-23 10:07:24.148364298 +0000 UTC m=+0.197013445 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:07:24 localhost nova_compute[281952]: 2025-11-23 10:07:24.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:07:24 localhost nova_compute[281952]: 2025-11-23 10:07:24.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:07:24 localhost nova_compute[281952]: 2025-11-23 10:07:24.213 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 05:07:24 localhost podman[334780]: 2025-11-23 10:07:24.232477892 +0000 UTC m=+0.281126939 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:07:24 
localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:07:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:07:24 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:07:24 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 23 05:07:24 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 23 05:07:25 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:07:25.078 263258 INFO neutron.agent.linux.ip_lib [None req-f341419d-469a-487c-8bc9-f2728a88ac44 - - - - - -] Device tap93604dfc-74 cannot be used as it has no MAC address#033[00m Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.136 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:25 localhost kernel: device tap93604dfc-74 entered promiscuous mode Nov 23 05:07:25 localhost NetworkManager[5975]: [1763892445.1451] manager: (tap93604dfc-74): new Generic device (/org/freedesktop/NetworkManager/Devices/87) Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.144 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:25 localhost ovn_controller[154788]: 2025-11-23T10:07:25Z|00548|binding|INFO|Claiming lport 93604dfc-7461-4299-8236-45aa7b97320e for this chassis. 
Nov 23 05:07:25 localhost ovn_controller[154788]: 2025-11-23T10:07:25Z|00549|binding|INFO|93604dfc-7461-4299-8236-45aa7b97320e: Claiming unknown Nov 23 05:07:25 localhost systemd-udevd[334850]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:07:25 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:25.157 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-47f35c3f-de9a-4c96-9b45-f5782c8e0808', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47f35c3f-de9a-4c96-9b45-f5782c8e0808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7f48fa865c4047a080902678e51be06e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5f5ef48e-aaab-411a-9f8a-ffb4deb9829f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=93604dfc-7461-4299-8236-45aa7b97320e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:07:25 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:25.160 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 93604dfc-7461-4299-8236-45aa7b97320e in datapath 47f35c3f-de9a-4c96-9b45-f5782c8e0808 bound to our chassis#033[00m Nov 23 05:07:25 localhost ovn_metadata_agent[160434]: 
2025-11-23 10:07:25.163 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port eece88c9-9692-4eda-92a8-4540577b0aac IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:07:25 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:25.164 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 47f35c3f-de9a-4c96-9b45-f5782c8e0808, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:07:25 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:25.165 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[d8f9aa14-17fa-49e2-abdb-26c8af7fa6ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:07:25 localhost journal[230249]: ethtool ioctl error on tap93604dfc-74: No such device Nov 23 05:07:25 localhost ovn_controller[154788]: 2025-11-23T10:07:25Z|00550|binding|INFO|Setting lport 93604dfc-7461-4299-8236-45aa7b97320e ovn-installed in OVS Nov 23 05:07:25 localhost ovn_controller[154788]: 2025-11-23T10:07:25Z|00551|binding|INFO|Setting lport 93604dfc-7461-4299-8236-45aa7b97320e up in Southbound Nov 23 05:07:25 localhost journal[230249]: ethtool ioctl error on tap93604dfc-74: No such device Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.186 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:25 localhost journal[230249]: ethtool ioctl error on tap93604dfc-74: No such device Nov 23 05:07:25 localhost journal[230249]: ethtool ioctl error on tap93604dfc-74: No such device Nov 23 05:07:25 localhost journal[230249]: ethtool ioctl error on tap93604dfc-74: No such device Nov 23 05:07:25 localhost journal[230249]: ethtool ioctl error on tap93604dfc-74: No 
such device Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:07:25 localhost journal[230249]: ethtool ioctl error on tap93604dfc-74: No such device Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:07:25 localhost journal[230249]: ethtool ioctl error on tap93604dfc-74: No such device Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.227 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.233 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.234 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.235 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.235 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.259 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:07:25 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/872234812' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.690 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:07:25 localhost dnsmasq[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/addn_hosts - 0 addresses Nov 23 05:07:25 localhost dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/host Nov 23 05:07:25 localhost dnsmasq-dhcp[334367]: read /var/lib/neutron/dhcp/1ebb6643-dd69-425a-84e7-f74c46a69f9f/opts Nov 23 05:07:25 localhost systemd[1]: tmp-crun.NCFrQw.mount: Deactivated successfully. Nov 23 05:07:25 localhost podman[334930]: 2025-11-23 10:07:25.735326372 +0000 UTC m=+0.072980579 container kill 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.781 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:07:25 localhost nova_compute[281952]: 2025-11-23 10:07:25.782 281956 DEBUG 
nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.013 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.015 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11061MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", 
"numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.015 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.016 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.086 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.086 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.086 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.147 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.209 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:26 localhost ovn_controller[154788]: 2025-11-23T10:07:26Z|00552|binding|INFO|Releasing lport 6d2dff40-048d-4175-8e87-c4c88e21141f from this chassis (sb_readonly=0) Nov 23 05:07:26 localhost ovn_controller[154788]: 2025-11-23T10:07:26Z|00553|binding|INFO|Setting lport 6d2dff40-048d-4175-8e87-c4c88e21141f down in Southbound Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.227 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 
05:07:26 localhost kernel: device tap6d2dff40-04 left promiscuous mode Nov 23 05:07:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:26.237 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-1ebb6643-dd69-425a-84e7-f74c46a69f9f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ebb6643-dd69-425a-84e7-f74c46a69f9f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '33c4eecf43aa413a9f282206f9e9a55b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6e2548aa-bcaa-4071-886c-d49df70c86b7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6d2dff40-048d-4175-8e87-c4c88e21141f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:07:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:26.239 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 6d2dff40-048d-4175-8e87-c4c88e21141f in datapath 1ebb6643-dd69-425a-84e7-f74c46a69f9f unbound from our chassis#033[00m Nov 23 05:07:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:26.242 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 
1ebb6643-dd69-425a-84e7-f74c46a69f9f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:07:26 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:26.243 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ab7b47-5436-4c23-b7b7-2e7472829035]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.248 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:26 localhost podman[334983]: Nov 23 05:07:26 localhost podman[334983]: 2025-11-23 10:07:26.343371049 +0000 UTC m=+0.078412793 container create d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:07:26 localhost systemd[1]: Started libpod-conmon-d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790.scope. Nov 23 05:07:26 localhost systemd[1]: Started libcrun container. 
Nov 23 05:07:26 localhost podman[334983]: 2025-11-23 10:07:26.299435555 +0000 UTC m=+0.034477289 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:07:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f836387aefaba3bec188b51689961ff9b6a2c957f76e955e6437c8b363a115f6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:07:26 localhost podman[334983]: 2025-11-23 10:07:26.412617334 +0000 UTC m=+0.147659078 container init d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118) Nov 23 05:07:26 localhost podman[334983]: 2025-11-23 10:07:26.421547513 +0000 UTC m=+0.156589257 container start d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:07:26 localhost dnsmasq[335020]: started, version 2.85 cachesize 150 Nov 23 05:07:26 localhost dnsmasq[335020]: DNS service limited to local subnets Nov 23 05:07:26 localhost dnsmasq[335020]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:07:26 localhost dnsmasq[335020]: warning: no upstream servers configured Nov 23 05:07:26 localhost dnsmasq-dhcp[335020]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:07:26 localhost dnsmasq[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/addn_hosts - 0 addresses Nov 23 05:07:26 localhost dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/host Nov 23 05:07:26 localhost dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/opts Nov 23 05:07:26 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:07:26 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1410162980' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:07:26 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:07:26.569 263258 INFO neutron.agent.dhcp.agent [None req-5e7a3a60-401e-44ac-9642-6923c08b11e5 - - - - - -] DHCP configuration for ports {'94b64606-a070-4d9f-a082-720cda79ef20'} is completed#033[00m Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.582 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.588 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 05:07:26 localhost 
nova_compute[281952]: 2025-11-23 10:07:26.604 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.627 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 05:07:26 localhost nova_compute[281952]: 2025-11-23 10:07:26.627 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:07:27 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:07:27.119 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:07:26Z, description=, device_id=911d76ef-8c35-4e07-a4bf-332c60f42359, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], 
id=757091d9-2f57-42df-833a-3480faaa4574, ip_allocation=immediate, mac_address=fa:16:3e:34:8e:7d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:07:22Z, description=, dns_domain=, id=47f35c3f-de9a-4c96-9b45-f5782c8e0808, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-2143275487-network, port_security_enabled=True, project_id=7f48fa865c4047a080902678e51be06e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58425, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3388, status=ACTIVE, subnets=['8c56fce8-3a69-4c2c-9d2a-3dee5aae6f6a'], tags=[], tenant_id=7f48fa865c4047a080902678e51be06e, updated_at=2025-11-23T10:07:23Z, vlan_transparent=None, network_id=47f35c3f-de9a-4c96-9b45-f5782c8e0808, port_security_enabled=False, project_id=7f48fa865c4047a080902678e51be06e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3411, status=DOWN, tags=[], tenant_id=7f48fa865c4047a080902678e51be06e, updated_at=2025-11-23T10:07:26Z on network 47f35c3f-de9a-4c96-9b45-f5782c8e0808#033[00m Nov 23 05:07:27 localhost dnsmasq[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/addn_hosts - 1 addresses Nov 23 05:07:27 localhost podman[335039]: 2025-11-23 10:07:27.324620477 +0000 UTC m=+0.054659368 container kill d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 
Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:07:27 localhost dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/host Nov 23 05:07:27 localhost dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/opts Nov 23 05:07:27 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:07:27.514 263258 INFO neutron.agent.dhcp.agent [None req-650de9cb-e1f6-4cee-ae94-d084be5bb08b - - - - - -] DHCP configuration for ports {'757091d9-2f57-42df-833a-3480faaa4574'} is completed#033[00m Nov 23 05:07:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:07:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:07:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:07:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e245 e245: 6 total, 6 up, 6 in Nov 23 05:07:28 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:07:28.225 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, 
binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:07:26Z, description=, device_id=911d76ef-8c35-4e07-a4bf-332c60f42359, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=757091d9-2f57-42df-833a-3480faaa4574, ip_allocation=immediate, mac_address=fa:16:3e:34:8e:7d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:07:22Z, description=, dns_domain=, id=47f35c3f-de9a-4c96-9b45-f5782c8e0808, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-2143275487-network, port_security_enabled=True, project_id=7f48fa865c4047a080902678e51be06e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58425, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3388, status=ACTIVE, subnets=['8c56fce8-3a69-4c2c-9d2a-3dee5aae6f6a'], tags=[], tenant_id=7f48fa865c4047a080902678e51be06e, updated_at=2025-11-23T10:07:23Z, vlan_transparent=None, network_id=47f35c3f-de9a-4c96-9b45-f5782c8e0808, port_security_enabled=False, project_id=7f48fa865c4047a080902678e51be06e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3411, status=DOWN, tags=[], tenant_id=7f48fa865c4047a080902678e51be06e, updated_at=2025-11-23T10:07:26Z on network 47f35c3f-de9a-4c96-9b45-f5782c8e0808#033[00m Nov 23 05:07:28 localhost systemd[1]: tmp-crun.gIjIuB.mount: Deactivated successfully. 
Nov 23 05:07:28 localhost nova_compute[281952]: 2025-11-23 10:07:28.464 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:28 localhost dnsmasq[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/addn_hosts - 1 addresses Nov 23 05:07:28 localhost podman[335077]: 2025-11-23 10:07:28.467128383 +0000 UTC m=+0.089018332 container kill d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:07:28 localhost dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/host Nov 23 05:07:28 localhost dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/opts Nov 23 05:07:28 localhost sshd[335099]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:07:28 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:07:28.814 263258 INFO neutron.agent.dhcp.agent [None req-7b6982cd-4091-40a7-bb11-ef619302e77f - - - - - -] DHCP configuration for ports {'757091d9-2f57-42df-833a-3480faaa4574'} is completed#033[00m Nov 23 05:07:29 localhost ovn_controller[154788]: 2025-11-23T10:07:29Z|00554|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:07:29 localhost nova_compute[281952]: 2025-11-23 10:07:29.091 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:29 localhost 
ceph-mon[300199]: mon.np0005532585@1(peon).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:07:29 localhost podman[335116]: 2025-11-23 10:07:29.700267729 +0000 UTC m=+0.064318988 container kill 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:07:29 localhost systemd[1]: tmp-crun.ZxfNHE.mount: Deactivated successfully. Nov 23 05:07:29 localhost dnsmasq[334367]: exiting on receipt of SIGTERM Nov 23 05:07:29 localhost systemd[1]: libpod-3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5.scope: Deactivated successfully. 
Nov 23 05:07:29 localhost podman[335128]: 2025-11-23 10:07:29.769786423 +0000 UTC m=+0.056471833 container died 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:07:29 localhost podman[335128]: 2025-11-23 10:07:29.810694725 +0000 UTC m=+0.097380095 container cleanup 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:07:29 localhost systemd[1]: libpod-conmon-3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5.scope: Deactivated successfully. 
Nov 23 05:07:29 localhost podman[335130]: 2025-11-23 10:07:29.849554116 +0000 UTC m=+0.127836422 container remove 3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ebb6643-dd69-425a-84e7-f74c46a69f9f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 23 05:07:29 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:07:29.876 263258 INFO neutron.agent.dhcp.agent [None req-dd374a4a-4bcb-4679-9310-025822dfd3c4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:07:29 localhost openstack_network_exporter[242668]: ERROR 10:07:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:07:29 localhost openstack_network_exporter[242668]: ERROR 10:07:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:07:29 localhost openstack_network_exporter[242668]: Nov 23 05:07:29 localhost openstack_network_exporter[242668]: ERROR 10:07:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:07:29 localhost openstack_network_exporter[242668]: ERROR 10:07:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:07:29 localhost openstack_network_exporter[242668]: ERROR 10:07:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:07:29 localhost openstack_network_exporter[242668]: Nov 23 05:07:30 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:07:30.064 263258 INFO neutron.agent.dhcp.agent [-] 
Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:07:30 localhost systemd[1]: var-lib-containers-storage-overlay-3bbb7d3b4428d8f687dacc74829ec1d2b43a77a0b9743c4105e027e24960cc16-merged.mount: Deactivated successfully. Nov 23 05:07:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3501eecc07cadf66c2b5df826bc6b39207caa11b46cac92ce4cab5960a8e22c5-userdata-shm.mount: Deactivated successfully. Nov 23 05:07:30 localhost systemd[1]: run-netns-qdhcp\x2d1ebb6643\x2ddd69\x2d425a\x2d84e7\x2df74c46a69f9f.mount: Deactivated successfully. Nov 23 05:07:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:07:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 23 05:07:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 23 05:07:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:07:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:07:32 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:32.033 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:07:32 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:32.034 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 05:07:32 localhost systemd[1]: tmp-crun.ajT5gZ.mount: Deactivated successfully. 
Nov 23 05:07:32 localhost nova_compute[281952]: 2025-11-23 10:07:32.069 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:32 localhost podman[335157]: 2025-11-23 10:07:32.110197434 +0000 UTC m=+0.156397893 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 05:07:32 localhost podman[335157]: 2025-11-23 10:07:32.123849515 +0000 UTC m=+0.170049954 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 05:07:32 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 05:07:32 localhost podman[335156]: 2025-11-23 10:07:32.075959332 +0000 UTC m=+0.122005506 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd) Nov 23 05:07:32 localhost podman[335156]: 2025-11-23 10:07:32.206445052 +0000 UTC m=+0.252491176 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:07:32 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 05:07:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e246 e246: 6 total, 6 up, 6 in Nov 23 05:07:33 localhost nova_compute[281952]: 2025-11-23 10:07:33.505 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:33 localhost nova_compute[281952]: 2025-11-23 10:07:33.624 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:07:34 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:07:34 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:07:34 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:07:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e246 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:07:34 localhost ceph-mon[300199]: 
mon.np0005532585@1(peon).osd e247 e247: 6 total, 6 up, 6 in Nov 23 05:07:35 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:07:35 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2326622654' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:07:35 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:07:35 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2326622654' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:07:37 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:37.036 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 05:07:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:07:37 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/491875440' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:07:37 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:07:37 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/491875440' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:07:37 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:07:37 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 23 05:07:37 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 23 05:07:37 localhost sshd[335198]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:07:38 localhost nova_compute[281952]: 2025-11-23 10:07:38.507 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:07:38 localhost nova_compute[281952]: 2025-11-23 10:07:38.509 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:07:38 localhost nova_compute[281952]: 2025-11-23 10:07:38.509 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:07:38 localhost nova_compute[281952]: 2025-11-23 10:07:38.509 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:07:38 localhost nova_compute[281952]: 2025-11-23 10:07:38.543 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:38 localhost nova_compute[281952]: 2025-11-23 10:07:38.544 281956 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:07:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:07:39 localhost ovn_controller[154788]: 2025-11-23T10:07:39Z|00555|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:07:39 localhost nova_compute[281952]: 2025-11-23 10:07:39.652 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:40 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2. Nov 23 05:07:40 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:07:40 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:07:40 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:07:41 
localhost podman[335217]: 2025-11-23 10:07:41.608027606 +0000 UTC m=+0.064951767 container kill d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 23 05:07:41 localhost dnsmasq[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/addn_hosts - 0 addresses Nov 23 05:07:41 localhost dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/host Nov 23 05:07:41 localhost dnsmasq-dhcp[335020]: read /var/lib/neutron/dhcp/47f35c3f-de9a-4c96-9b45-f5782c8e0808/opts Nov 23 05:07:41 localhost systemd[1]: tmp-crun.JNEBJD.mount: Deactivated successfully. 
Nov 23 05:07:41 localhost kernel: device tap93604dfc-74 left promiscuous mode Nov 23 05:07:41 localhost ovn_controller[154788]: 2025-11-23T10:07:41Z|00556|binding|INFO|Releasing lport 93604dfc-7461-4299-8236-45aa7b97320e from this chassis (sb_readonly=0) Nov 23 05:07:41 localhost ovn_controller[154788]: 2025-11-23T10:07:41Z|00557|binding|INFO|Setting lport 93604dfc-7461-4299-8236-45aa7b97320e down in Southbound Nov 23 05:07:41 localhost nova_compute[281952]: 2025-11-23 10:07:41.798 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:41 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:41.808 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-47f35c3f-de9a-4c96-9b45-f5782c8e0808', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47f35c3f-de9a-4c96-9b45-f5782c8e0808', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7f48fa865c4047a080902678e51be06e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5f5ef48e-aaab-411a-9f8a-ffb4deb9829f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=93604dfc-7461-4299-8236-45aa7b97320e) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:07:41 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:41.811 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 93604dfc-7461-4299-8236-45aa7b97320e in datapath 47f35c3f-de9a-4c96-9b45-f5782c8e0808 unbound from our chassis#033[00m Nov 23 05:07:41 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:41.814 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 47f35c3f-de9a-4c96-9b45-f5782c8e0808, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:07:41 localhost ovn_metadata_agent[160434]: 2025-11-23 10:07:41.815 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[d39f61b2-4da1-40ff-9ccd-ee4ad4923457]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:07:41 localhost nova_compute[281952]: 2025-11-23 10:07:41.820 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:41 localhost nova_compute[281952]: 2025-11-23 10:07:41.822 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:41 localhost podman[240668]: time="2025-11-23T10:07:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:07:41 localhost podman[240668]: @ - - [23/Nov/2025:10:07:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1" Nov 23 05:07:41 localhost podman[240668]: @ - - [23/Nov/2025:10:07:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19261 "" "Go-http-client/1.1" Nov 23 05:07:42 
localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:07:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:07:42 localhost ovn_controller[154788]: 2025-11-23T10:07:42Z|00558|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:07:43 localhost nova_compute[281952]: 2025-11-23 10:07:43.060 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:43 localhost podman[335240]: 2025-11-23 10:07:43.066632404 +0000 UTC m=+0.125159202 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 05:07:43 localhost podman[335241]: 2025-11-23 10:07:43.105356421 +0000 UTC m=+0.160659531 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 
(image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:07:43 localhost podman[335241]: 2025-11-23 10:07:43.144281273 +0000 UTC m=+0.199584373 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, 
org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 23 05:07:43 localhost podman[335240]: 2025-11-23 10:07:43.155397958 +0000 UTC m=+0.213924766 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 05:07:43 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 05:07:43 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:07:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e248 e248: 6 total, 6 up, 6 in Nov 23 05:07:43 localhost dnsmasq[335020]: exiting on receipt of SIGTERM Nov 23 05:07:43 localhost podman[335299]: 2025-11-23 10:07:43.43135151 +0000 UTC m=+0.064415761 container kill d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 23 05:07:43 localhost systemd[1]: libpod-d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790.scope: Deactivated successfully. 
Nov 23 05:07:43 localhost podman[335311]: 2025-11-23 10:07:43.50338691 +0000 UTC m=+0.060606167 container died d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 05:07:43 localhost podman[335311]: 2025-11-23 10:07:43.533822697 +0000 UTC m=+0.091041914 container cleanup d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 23 05:07:43 localhost systemd[1]: libpod-conmon-d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790.scope: Deactivated successfully. 
Nov 23 05:07:43 localhost nova_compute[281952]: 2025-11-23 10:07:43.546 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:43 localhost nova_compute[281952]: 2025-11-23 10:07:43.548 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:07:43 localhost podman[335313]: 2025-11-23 10:07:43.591318099 +0000 UTC m=+0.140616407 container remove d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-47f35c3f-de9a-4c96-9b45-f5782c8e0808, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:07:43 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:07:43.615 263258 INFO neutron.agent.dhcp.agent [None req-8522ce37-688f-49e9-b944-3c116d7c1182 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:07:43 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:07:43.616 263258 INFO neutron.agent.dhcp.agent [None req-8522ce37-688f-49e9-b944-3c116d7c1182 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:07:44 localhost systemd[1]: var-lib-containers-storage-overlay-f836387aefaba3bec188b51689961ff9b6a2c957f76e955e6437c8b363a115f6-merged.mount: Deactivated successfully. Nov 23 05:07:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d64194948074e2bd82cdf82b556332a0a68240959bab5755063e152e9cb18790-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:07:44 localhost systemd[1]: run-netns-qdhcp\x2d47f35c3f\x2dde9a\x2d4c96\x2d9b45\x2df5782c8e0808.mount: Deactivated successfully. Nov 23 05:07:44 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e249 e249: 6 total, 6 up, 6 in Nov 23 05:07:44 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:07:44 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 23 05:07:44 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 23 05:07:44 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:07:46 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e250 e250: 6 total, 6 up, 6 in Nov 23 05:07:47 localhost sshd[335375]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:07:47 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:07:47 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:07:47 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth 
get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:07:48 localhost nova_compute[281952]: 2025-11-23 10:07:48.548 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:07:48 localhost nova_compute[281952]: 2025-11-23 10:07:48.551 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:48 localhost nova_compute[281952]: 2025-11-23 10:07:48.551 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:07:48 localhost nova_compute[281952]: 2025-11-23 10:07:48.552 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:07:48 localhost nova_compute[281952]: 2025-11-23 10:07:48.553 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:07:48 localhost nova_compute[281952]: 2025-11-23 10:07:48.556 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:48 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 05:07:48 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:07:49 localhost ceph-mon[300199]: 
mon.np0005532585@1(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:07:49 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:07:50 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:07:50 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 23 05:07:50 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 23 05:07:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e251 e251: 6 total, 6 up, 6 in Nov 23 05:07:53 localhost nova_compute[281952]: 2025-11-23 10:07:53.553 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:54 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:07:54 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:07:54 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": 
"client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:07:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:07:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:07:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:07:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:07:55 localhost systemd[1]: tmp-crun.DnZIWP.mount: Deactivated successfully. Nov 23 05:07:55 localhost podman[335428]: 2025-11-23 10:07:55.094060467 +0000 UTC m=+0.144586235 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Nov 23 05:07:55 localhost podman[335428]: 2025-11-23 10:07:55.103319257 +0000 UTC m=+0.153845045 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 23 05:07:55 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:07:55 localhost podman[335427]: 2025-11-23 10:07:55.186876314 +0000 UTC m=+0.237161595 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:07:55 localhost podman[335429]: 2025-11-23 10:07:55.057144765 +0000 UTC m=+0.103493168 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, managed_by=edpm_ansible) Nov 23 05:07:55 localhost podman[335429]: 2025-11-23 10:07:55.243452628 +0000 UTC m=+0.289801081 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., 
name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7) Nov 23 05:07:55 localhost podman[335427]: 2025-11-23 10:07:55.25649776 +0000 UTC m=+0.306783001 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 23 05:07:55 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:07:55 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 05:07:56 localhost systemd[1]: tmp-crun.DkAK4M.mount: Deactivated successfully. 
Nov 23 05:07:57 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:07:57 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 23 05:07:57 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 23 05:07:58 localhost nova_compute[281952]: 2025-11-23 10:07:58.557 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:07:58 localhost nova_compute[281952]: 2025-11-23 10:07:58.559 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:07:58 localhost nova_compute[281952]: 2025-11-23 10:07:58.559 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:07:58 localhost nova_compute[281952]: 2025-11-23 10:07:58.560 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:07:58 localhost nova_compute[281952]: 2025-11-23 10:07:58.592 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:07:58 localhost nova_compute[281952]: 2025-11-23 10:07:58.593 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:07:59 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e252 e252: 
6 total, 6 up, 6 in Nov 23 05:07:59 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:07:59 localhost openstack_network_exporter[242668]: ERROR 10:07:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:07:59 localhost openstack_network_exporter[242668]: ERROR 10:07:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:07:59 localhost openstack_network_exporter[242668]: ERROR 10:07:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:07:59 localhost openstack_network_exporter[242668]: ERROR 10:07:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:07:59 localhost openstack_network_exporter[242668]: Nov 23 05:07:59 localhost openstack_network_exporter[242668]: ERROR 10:07:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:07:59 localhost openstack_network_exporter[242668]: Nov 23 05:08:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:08:00 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3643344025' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:08:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:08:00 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3643344025' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:08:00 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0. Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.376458) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55 Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892480376490, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1381, "num_deletes": 254, "total_data_size": 1935404, "memory_usage": 1959552, "flush_reason": "Manual Compaction"} Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892480386164, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 1268743, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31587, "largest_seqno": 32963, "table_properties": {"data_size": 1262817, "index_size": 3076, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 15367, "raw_average_key_size": 21, "raw_value_size": 1249959, "raw_average_value_size": 1775, 
"num_data_blocks": 133, "num_entries": 704, "num_filter_entries": 704, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892428, "oldest_key_time": 1763892428, "file_creation_time": 1763892480, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}} Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 9762 microseconds, and 4767 cpu microseconds. Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.386215) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 1268743 bytes OK Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.386239) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.388782) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.388806) EVENT_LOG_v1 {"time_micros": 1763892480388799, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.388826) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 1928387, prev total WAL file size 1928997, number of live WAL files 2. Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.391716) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. 
'7061786F73003133303533' seq:0, type:0; will stop at (end) Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(1239KB)], [54(17MB)] Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892480391872, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 20062938, "oldest_snapshot_seqno": -1} Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 13678 keys, 18455108 bytes, temperature: kUnknown Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892480489842, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 18455108, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18375768, "index_size": 44039, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34245, "raw_key_size": 368943, "raw_average_key_size": 26, "raw_value_size": 18141616, "raw_average_value_size": 1326, "num_data_blocks": 1637, "num_entries": 13678, "num_filter_entries": 13678, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892480, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}} Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.490228) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 18455108 bytes Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.493873) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 204.6 rd, 188.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 17.9 +0.0 blob) out(17.6 +0.0 blob), read-write-amplify(30.4) write-amplify(14.5) OK, records in: 14213, records dropped: 535 output_compression: NoCompression Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.493969) EVENT_LOG_v1 {"time_micros": 1763892480493953, "job": 32, "event": "compaction_finished", "compaction_time_micros": 98060, "compaction_time_cpu_micros": 56004, "output_level": 6, "num_output_files": 1, "total_output_size": 18455108, "num_input_records": 14213, "num_output_records": 13678, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005532585/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892480494376, "job": 32, "event": "table_file_deletion", "file_number": 56} Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892480497231, "job": 32, "event": "table_file_deletion", "file_number": 54} Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.391514) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.497268) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.497275) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.497278) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.497281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:08:00 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:08:00.497284) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:08:01 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", 
"allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:08:01 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:08:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 05:08:02 localhost podman[335490]: 2025-11-23 10:08:02.296967343 +0000 UTC m=+0.088442926 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', 
'--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 05:08:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:08:02 localhost podman[335490]: 2025-11-23 10:08:02.3347391 +0000 UTC m=+0.126214633 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, 
maintainer=The Prometheus Authors ) Nov 23 05:08:02 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 05:08:02 localhost systemd[1]: tmp-crun.pzolgs.mount: Deactivated successfully. Nov 23 05:08:02 localhost podman[335512]: 2025-11-23 10:08:02.405079149 +0000 UTC m=+0.083416893 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, 
org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:08:02 localhost podman[335512]: 2025-11-23 10:08:02.417305448 +0000 UTC m=+0.095643182 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:08:02 localhost systemd[1]: 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 05:08:03 localhost nova_compute[281952]: 2025-11-23 10:08:03.594 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:03 localhost nova_compute[281952]: 2025-11-23 10:08:03.596 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:03 localhost nova_compute[281952]: 2025-11-23 10:08:03.596 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:08:03 localhost nova_compute[281952]: 2025-11-23 10:08:03.596 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:03 localhost nova_compute[281952]: 2025-11-23 10:08:03.634 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:08:03 localhost nova_compute[281952]: 2025-11-23 10:08:03.635 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:04 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:08:04 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 23 05:08:04 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice 
bob"}]': finished Nov 23 05:08:04 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:08:07 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:08:07 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:08:07 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:08:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e253 e253: 6 total, 6 up, 6 in Nov 23 05:08:08 localhost nova_compute[281952]: 2025-11-23 10:08:08.636 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:08 localhost nova_compute[281952]: 2025-11-23 10:08:08.637 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:08 localhost nova_compute[281952]: 2025-11-23 10:08:08.638 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending 
inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:08:08 localhost nova_compute[281952]: 2025-11-23 10:08:08.638 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:08 localhost nova_compute[281952]: 2025-11-23 10:08:08.659 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:08:08 localhost nova_compute[281952]: 2025-11-23 10:08:08.660 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:09 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e254 e254: 6 total, 6 up, 6 in Nov 23 05:08:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:08:09.305 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:08:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:08:09.306 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:08:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:08:09.306 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:08:09 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 
343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:08:11 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:08:11 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 23 05:08:11 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 23 05:08:11 localhost podman[240668]: time="2025-11-23T10:08:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:08:11 localhost podman[240668]: @ - - [23/Nov/2025:10:08:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 05:08:11 localhost podman[240668]: @ - - [23/Nov/2025:10:08:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18781 "" "Go-http-client/1.1" Nov 23 05:08:13 localhost nova_compute[281952]: 2025-11-23 10:08:13.660 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:13 localhost nova_compute[281952]: 2025-11-23 10:08:13.662 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:13 localhost nova_compute[281952]: 2025-11-23 10:08:13.662 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:08:13 localhost nova_compute[281952]: 2025-11-23 10:08:13.662 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:13 localhost nova_compute[281952]: 2025-11-23 10:08:13.689 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:08:13 localhost nova_compute[281952]: 2025-11-23 10:08:13.690 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:13 localhost ovn_controller[154788]: 2025-11-23T10:08:13Z|00559|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory Nov 23 05:08:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:08:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:08:14 localhost podman[335531]: 2025-11-23 10:08:14.04834855 +0000 UTC m=+0.104308384 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 05:08:14 localhost podman[335532]: 2025-11-23 10:08:14.088412416 +0000 UTC m=+0.140592466 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 23 05:08:14 
localhost podman[335532]: 2025-11-23 10:08:14.095205211 +0000 UTC m=+0.147385211 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible) Nov 23 05:08:14 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 05:08:14 localhost podman[335531]: 2025-11-23 10:08:14.112328126 +0000 UTC m=+0.168287960 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 05:08:14 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. 
Nov 23 05:08:14 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:08:14 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:08:14 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:08:14 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:08:17 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:08:17 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 23 05:08:17 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 23 05:08:18 localhost nova_compute[281952]: 2025-11-23 10:08:18.690 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:18 localhost nova_compute[281952]: 2025-11-23 10:08:18.692 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:18 localhost nova_compute[281952]: 2025-11-23 10:08:18.693 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:08:18 localhost nova_compute[281952]: 2025-11-23 10:08:18.693 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:18 localhost nova_compute[281952]: 2025-11-23 10:08:18.721 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:08:18 localhost nova_compute[281952]: 2025-11-23 10:08:18.722 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e255 e255: 6 total, 6 up, 6 in Nov 23 05:08:19 localhost nova_compute[281952]: 2025-11-23 10:08:19.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:08:19 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:08:20 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", 
"format": "json"} : dispatch Nov 23 05:08:20 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:08:20 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:08:21 localhost nova_compute[281952]: 2025-11-23 10:08:21.223 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:08:22 localhost nova_compute[281952]: 2025-11-23 10:08:22.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:08:22 localhost nova_compute[281952]: 2025-11-23 10:08:22.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 05:08:22 localhost nova_compute[281952]: 2025-11-23 10:08:22.214 281956 DEBUG nova.compute.manager [None 
req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 05:08:22 localhost nova_compute[281952]: 2025-11-23 10:08:22.358 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 05:08:22 localhost nova_compute[281952]: 2025-11-23 10:08:22.359 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 05:08:22 localhost nova_compute[281952]: 2025-11-23 10:08:22.360 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 05:08:22 localhost nova_compute[281952]: 2025-11-23 10:08:22.360 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 05:08:22 localhost nova_compute[281952]: 2025-11-23 10:08:22.853 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": 
"192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 05:08:22 localhost nova_compute[281952]: 2025-11-23 10:08:22.875 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 05:08:22 localhost nova_compute[281952]: 2025-11-23 10:08:22.876 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 05:08:22 localhost nova_compute[281952]: 2025-11-23 10:08:22.876 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m 
Nov 23 05:08:23 localhost nova_compute[281952]: 2025-11-23 10:08:23.722 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:23 localhost nova_compute[281952]: 2025-11-23 10:08:23.760 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:23 localhost nova_compute[281952]: 2025-11-23 10:08:23.760 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:08:23 localhost nova_compute[281952]: 2025-11-23 10:08:23.760 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:23 localhost nova_compute[281952]: 2025-11-23 10:08:23.761 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:08:23 localhost nova_compute[281952]: 2025-11-23 10:08:23.761 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:24 localhost nova_compute[281952]: 2025-11-23 10:08:24.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:08:24 localhost nova_compute[281952]: 2025-11-23 10:08:24.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 
05:08:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e256 e256: 6 total, 6 up, 6 in Nov 23 05:08:24 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:08:24 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 23 05:08:24 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 23 05:08:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 10:08:25.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 10:08:25.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 10:08:25.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 10:08:25.239 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 10:08:25.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 10:08:25.240 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 10:08:25.240 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 
10:08:25.241 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:08:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:08:25 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/371791075' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 10:08:25.689 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 10:08:25.748 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 10:08:25.748 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 10:08:25.933 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 10:08:25.934 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11067MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 10:08:25.935 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:08:25 localhost nova_compute[281952]: 2025-11-23 10:08:25.935 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:08:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:08:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:08:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 05:08:26 localhost podman[335597]: 2025-11-23 10:08:26.043240681 +0000 UTC m=+0.093756725 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:08:26 localhost podman[335597]: 2025-11-23 10:08:26.075668158 +0000 UTC m=+0.126184202 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 23 05:08:26 localhost systemd[1]: tmp-crun.snWwBl.mount: Deactivated successfully. Nov 23 05:08:26 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:08:26 localhost podman[335598]: 2025-11-23 10:08:26.094273258 +0000 UTC m=+0.140693369 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:08:26 localhost podman[335598]: 2025-11-23 10:08:26.126294553 +0000 UTC 
m=+0.172714634 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:08:26 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 05:08:26 localhost nova_compute[281952]: 2025-11-23 10:08:26.174 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 05:08:26 localhost nova_compute[281952]: 2025-11-23 10:08:26.174 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 05:08:26 localhost nova_compute[281952]: 2025-11-23 10:08:26.175 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 05:08:26 localhost podman[335599]: 2025-11-23 10:08:26.200043855 +0000 UTC m=+0.245116425 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 
'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal) Nov 23 05:08:26 localhost podman[335599]: 2025-11-23 10:08:26.220963685 +0000 UTC m=+0.266036265 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 23 05:08:26 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 05:08:26 localhost nova_compute[281952]: 2025-11-23 10:08:26.404 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:08:26 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:08:26 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2136971538' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:08:26 localhost nova_compute[281952]: 2025-11-23 10:08:26.892 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:08:26 localhost nova_compute[281952]: 2025-11-23 10:08:26.898 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 05:08:26 localhost nova_compute[281952]: 2025-11-23 10:08:26.915 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 
'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 05:08:26 localhost nova_compute[281952]: 2025-11-23 10:08:26.951 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 05:08:26 localhost nova_compute[281952]: 2025-11-23 10:08:26.951 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.016s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:08:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:08:27 localhost nova_compute[281952]: 2025-11-23 10:08:27.952 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:08:27 localhost nova_compute[281952]: 2025-11-23 10:08:27.953 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:08:28 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:08:28 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:08:28 localhost nova_compute[281952]: 2025-11-23 10:08:28.763 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:28 localhost nova_compute[281952]: 2025-11-23 10:08:28.765 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:28 localhost nova_compute[281952]: 2025-11-23 10:08:28.765 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:08:28 localhost nova_compute[281952]: 2025-11-23 10:08:28.765 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:28 localhost nova_compute[281952]: 2025-11-23 10:08:28.801 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:08:28 localhost nova_compute[281952]: 2025-11-23 10:08:28.802 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:08:29 localhost openstack_network_exporter[242668]: ERROR 10:08:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:08:29 localhost openstack_network_exporter[242668]: ERROR 10:08:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:08:29 localhost openstack_network_exporter[242668]: ERROR 10:08:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:08:29 localhost openstack_network_exporter[242668]: ERROR 10:08:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:08:29 localhost openstack_network_exporter[242668]: Nov 23 05:08:29 localhost openstack_network_exporter[242668]: ERROR 10:08:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:08:29 localhost openstack_network_exporter[242668]: Nov 23 05:08:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:08:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 23 05:08:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 23 05:08:32 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", 
"format": "json"} : dispatch Nov 23 05:08:32 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/f99c71c2-29eb-4c61-ab24-07800f6e4b24/458725ea-f0f9-482a-b03b-0e03f2cd7019", "osd", "allow rw pool=manila_data namespace=fsvolumens_f99c71c2-29eb-4c61-ab24-07800f6e4b24", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:08:32 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/f99c71c2-29eb-4c61-ab24-07800f6e4b24/458725ea-f0f9-482a-b03b-0e03f2cd7019", "osd", "allow rw pool=manila_data namespace=fsvolumens_f99c71c2-29eb-4c61-ab24-07800f6e4b24", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:08:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:08:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 05:08:33 localhost systemd[1]: tmp-crun.3fmRYD.mount: Deactivated successfully. 
Nov 23 05:08:33 localhost podman[335682]: 2025-11-23 10:08:33.03482567 +0000 UTC m=+0.084003631 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2) Nov 23 05:08:33 localhost podman[335683]: 2025-11-23 10:08:33.100196389 +0000 UTC m=+0.147922246 container health_status 
a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 05:08:33 localhost podman[335682]: 2025-11-23 10:08:33.128313156 +0000 UTC m=+0.177491147 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118) Nov 23 05:08:33 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 05:08:33 localhost podman[335683]: 2025-11-23 10:08:33.185061486 +0000 UTC m=+0.232787363 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 05:08:33 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 05:08:33 localhost nova_compute[281952]: 2025-11-23 10:08:33.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:08:33 localhost nova_compute[281952]: 2025-11-23 10:08:33.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 23 05:08:33 localhost nova_compute[281952]: 2025-11-23 10:08:33.232 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 23 05:08:33 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 e257: 6 total, 6 up, 6 in Nov 23 05:08:33 localhost nova_compute[281952]: 2025-11-23 10:08:33.804 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:33 localhost nova_compute[281952]: 2025-11-23 10:08:33.806 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:33 localhost nova_compute[281952]: 2025-11-23 10:08:33.806 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:08:33 localhost nova_compute[281952]: 2025-11-23 10:08:33.806 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:33 localhost nova_compute[281952]: 2025-11-23 10:08:33.847 281956 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:08:33 localhost nova_compute[281952]: 2025-11-23 10:08:33.848 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:34 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:08:34 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:08:34 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:08:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:08:36 localhost nova_compute[281952]: 2025-11-23 10:08:36.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 
05:08:36 localhost nova_compute[281952]: 2025-11-23 10:08:36.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 23 05:08:36 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch Nov 23 05:08:36 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"} : dispatch Nov 23 05:08:36 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"}]': finished Nov 23 05:08:38 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:08:38 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 23 05:08:38 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 23 05:08:38 localhost nova_compute[281952]: 2025-11-23 10:08:38.849 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:38 localhost nova_compute[281952]: 2025-11-23 10:08:38.851 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:38 localhost nova_compute[281952]: 
2025-11-23 10:08:38.851 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:08:38 localhost nova_compute[281952]: 2025-11-23 10:08:38.851 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:38 localhost nova_compute[281952]: 2025-11-23 10:08:38.880 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:08:38 localhost nova_compute[281952]: 2025-11-23 10:08:38.881 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:08:41 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:08:41 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:08:41 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:08:41 localhost podman[240668]: time="2025-11-23T10:08:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:08:41 localhost podman[240668]: @ - - [23/Nov/2025:10:08:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 05:08:41 localhost podman[240668]: @ - - [23/Nov/2025:10:08:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18786 "" "Go-http-client/1.1" Nov 23 05:08:42 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch Nov 23 05:08:42 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/0066f586-ad80-42ee-9cb5-57bd65fc15e2/7acd4543-b091-4fe9-abe4-01b783b9e751", "osd", "allow rw pool=manila_data namespace=fsvolumens_0066f586-ad80-42ee-9cb5-57bd65fc15e2", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:08:42 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/0066f586-ad80-42ee-9cb5-57bd65fc15e2/7acd4543-b091-4fe9-abe4-01b783b9e751", "osd", "allow rw pool=manila_data namespace=fsvolumens_0066f586-ad80-42ee-9cb5-57bd65fc15e2", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:08:43 
localhost nova_compute[281952]: 2025-11-23 10:08:43.881 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:43 localhost nova_compute[281952]: 2025-11-23 10:08:43.910 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:43 localhost nova_compute[281952]: 2025-11-23 10:08:43.911 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:08:43 localhost nova_compute[281952]: 2025-11-23 10:08:43.911 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:43 localhost nova_compute[281952]: 2025-11-23 10:08:43.912 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:43 localhost nova_compute[281952]: 2025-11-23 10:08:43.915 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:08:44 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:08:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:08:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:08:45 localhost systemd[1]: tmp-crun.8snAH1.mount: Deactivated successfully. 
Nov 23 05:08:45 localhost podman[335723]: 2025-11-23 10:08:45.043682448 +0000 UTC m=+0.092415435 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 05:08:45 localhost podman[335723]: 2025-11-23 10:08:45.080476256 +0000 UTC m=+0.129209263 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 05:08:45 localhost systemd[1]: tmp-crun.wQRhyl.mount: Deactivated successfully. Nov 23 05:08:45 localhost podman[335724]: 2025-11-23 10:08:45.097969413 +0000 UTC m=+0.143437002 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 23 05:08:45 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:08:45 localhost podman[335724]: 2025-11-23 10:08:45.133562965 +0000 UTC m=+0.179030554 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3) Nov 23 05:08:45 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 05:08:45 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:08:45 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 23 05:08:45 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 23 05:08:46 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch Nov 23 05:08:46 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"} : dispatch Nov 23 05:08:46 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"}]': finished Nov 23 05:08:46 localhost ovn_metadata_agent[160434]: 2025-11-23 10:08:46.895 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 
'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:08:46 localhost ovn_metadata_agent[160434]: 2025-11-23 10:08:46.897 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 05:08:46 localhost nova_compute[281952]: 2025-11-23 10:08:46.899 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:08:48 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:08:48 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:08:48 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:08:48 localhost nova_compute[281952]: 2025-11-23 10:08:48.937 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:08:48 localhost 
nova_compute[281952]: 2025-11-23 10:08:48.944 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:08:49 localhost sshd[335832]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:08:49 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:08:50 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 05:08:50 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:08:50 localhost ovn_metadata_agent[160434]: 2025-11-23 10:08:50.899 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 05:08:51 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:08:51 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 23 05:08:51 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 23 05:08:51 localhost sshd[335852]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:08:53 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": 
"auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch Nov 23 05:08:53 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/01ed1422-dec9-4d44-991d-9583c95296ac/7b7be716-7e6a-49c4-a977-3707f1cc08b5", "osd", "allow rw pool=manila_data namespace=fsvolumens_01ed1422-dec9-4d44-991d-9583c95296ac", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:08:53 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/01ed1422-dec9-4d44-991d-9583c95296ac/7b7be716-7e6a-49c4-a977-3707f1cc08b5", "osd", "allow rw pool=manila_data namespace=fsvolumens_01ed1422-dec9-4d44-991d-9583c95296ac", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:08:53 localhost nova_compute[281952]: 2025-11-23 10:08:53.969 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:53 localhost nova_compute[281952]: 2025-11-23 10:08:53.971 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:53 localhost nova_compute[281952]: 2025-11-23 10:08:53.972 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5026 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:08:53 localhost nova_compute[281952]: 2025-11-23 10:08:53.972 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:53 localhost nova_compute[281952]: 2025-11-23 10:08:53.973 
281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:53 localhost nova_compute[281952]: 2025-11-23 10:08:53.977 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:08:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:08:54 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:08:54 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:08:54 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:08:54 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:08:54 localhost sshd[335854]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:08:56 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": 
"auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch Nov 23 05:08:56 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"} : dispatch Nov 23 05:08:56 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"}]': finished Nov 23 05:08:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:08:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:08:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:08:57 localhost podman[335857]: 2025-11-23 10:08:57.041261722 +0000 UTC m=+0.090424934 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Nov 23 05:08:57 localhost podman[335857]: 2025-11-23 10:08:57.050279625 +0000 UTC m=+0.099442877 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:08:57 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:08:57 localhost podman[335858]: 2025-11-23 10:08:57.139984167 +0000 UTC m=+0.185901172 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, build-date=2025-08-20T13:12:41) Nov 23 05:08:57 localhost podman[335858]: 2025-11-23 10:08:57.152671389 +0000 UTC m=+0.198588404 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 23 05:08:57 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 05:08:57 localhost podman[335856]: 2025-11-23 10:08:57.243573537 +0000 UTC m=+0.292547984 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller) Nov 23 05:08:57 localhost podman[335856]: 2025-11-23 10:08:57.284388286 +0000 UTC m=+0.333362773 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 23 05:08:57 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:08:58 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:08:58 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 23 05:08:58 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 23 05:08:58 localhost nova_compute[281952]: 2025-11-23 10:08:58.978 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:58 localhost nova_compute[281952]: 2025-11-23 10:08:58.980 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:08:58 localhost nova_compute[281952]: 2025-11-23 10:08:58.980 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:08:58 localhost nova_compute[281952]: 2025-11-23 10:08:58.980 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:59 localhost nova_compute[281952]: 2025-11-23 10:08:59.013 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:08:59 localhost nova_compute[281952]: 2025-11-23 10:08:59.013 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:08:59 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:08:59 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch Nov 23 05:08:59 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:08:59 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:08:59 localhost openstack_network_exporter[242668]: ERROR 10:08:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:08:59 localhost openstack_network_exporter[242668]: ERROR 10:08:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:08:59 localhost openstack_network_exporter[242668]: ERROR 10:08:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:08:59 localhost openstack_network_exporter[242668]: ERROR 10:08:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:08:59 localhost 
openstack_network_exporter[242668]: Nov 23 05:08:59 localhost openstack_network_exporter[242668]: ERROR 10:08:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:08:59 localhost openstack_network_exporter[242668]: Nov 23 05:09:01 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:09:01 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:09:01 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:09:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:09:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:09:03 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch Nov 23 05:09:03 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"} : dispatch Nov 23 05:09:03 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"}]': finished Nov 23 05:09:04 localhost podman[335919]: 2025-11-23 10:09:04.007833627 +0000 UTC m=+0.067820783 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:09:04 localhost nova_compute[281952]: 2025-11-23 10:09:04.014 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:04 localhost nova_compute[281952]: 2025-11-23 10:09:04.016 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:04 localhost nova_compute[281952]: 2025-11-23 10:09:04.016 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:09:04 localhost nova_compute[281952]: 2025-11-23 10:09:04.016 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:04 localhost nova_compute[281952]: 2025-11-23 10:09:04.050 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:04 localhost nova_compute[281952]: 2025-11-23 10:09:04.051 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:04 localhost podman[335919]: 2025-11-23 10:09:04.078504777 +0000 UTC m=+0.138491883 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS) Nov 23 05:09:04 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 05:09:04 localhost systemd[1]: tmp-crun.swIPAv.mount: Deactivated successfully. 
Nov 23 05:09:04 localhost podman[335920]: 2025-11-23 10:09:04.116599944 +0000 UTC m=+0.173250720 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 05:09:04 localhost podman[335920]: 2025-11-23 10:09:04.128335257 +0000 UTC m=+0.184985993 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 05:09:04 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 05:09:04 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:09:05 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 23 05:09:05 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 23 05:09:05 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 23 05:09:05 localhost sshd[335962]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:09:07 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch Nov 23 05:09:07 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:09:07 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"}]': 
finished Nov 23 05:09:08 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:09:08 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:09:08 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:09:09 localhost nova_compute[281952]: 2025-11-23 10:09:09.051 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:09 localhost nova_compute[281952]: 2025-11-23 10:09:09.053 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:09 localhost nova_compute[281952]: 2025-11-23 10:09:09.054 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:09:09 localhost nova_compute[281952]: 2025-11-23 10:09:09.054 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:09 localhost nova_compute[281952]: 2025-11-23 10:09:09.094 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:09 localhost nova_compute[281952]: 2025-11-23 10:09:09.095 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:09:09.306 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:09:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:09:09.306 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:09:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:09:09.307 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:09:09 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:09:10 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch Nov 23 05:09:10 localhost ceph-mon[300199]: from='mgr.44369 
172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"} : dispatch
Nov 23 05:09:10 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"}]': finished
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.810 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.813 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.818 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '7777f726-5b83-4975-a13b-d0701770fe49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.813354', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74a3f7b8-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '6bdf39f3e705e56e3131f71ac15541bc29580bbd5536506e95120fc04071f1e4'}]}, 'timestamp': '2025-11-23 10:09:10.819993', '_unique_id': 'cdb8060dc1124f03809c862e76acebd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:09:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:09:10
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.822 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.824 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.824 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4b5d7f6-cda6-42c2-aa54-34fbe2436d95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.824426', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None,
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74a4bfc2-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': 'de46a22337f6e2ee85d7b4f55d9efbea32efd98ea289bec831b100d8a8491970'}]}, 'timestamp': '2025-11-23 10:09:10.825101', '_unique_id': 'a7e547c913ff4676a5eed98dc42291dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR
oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]:
2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:09:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23
05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.826 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.827 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.828 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'c7e996a5-897d-42b0-bb12-56e23cc23b6f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.828011', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74a5491a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '61e2661dfe0a164d5562f1fdff7194074ac26c7968524846ae5df14b172958a5'}]}, 'timestamp': '2025-11-23 10:09:10.828522', '_unique_id': 'bd1e859dd82d4d03aec2d8277dcbcdc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:09:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.829 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.830 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.844 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.845 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '507c2d21-ca30-4654-a611-98267928dfd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.830919', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74a7dbbc-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.008587864, 'message_signature': '45b7173f89095abe1c469b89d8801933bfc301096a56c8bb80c4e43b8a1fdaec'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.830919', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74a7f0b6-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.008587864, 'message_signature': 'c6144a47c7d69f825d31a1c8505cc245c013d821fad7f490d9dec7ad908f0c3b'}]}, 'timestamp': '2025-11-23 10:09:10.845876', '_unique_id': 'f87e30ebdfe24629b90f4b501cc2bb57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.847 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.848 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.848 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.848 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '320f096e-dd22-4467-9ced-2a073c55432f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.848367', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74a86406-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.008587864, 'message_signature': '899616b4e36d2686f8047ca76f516d5cee0928f140b2bfe5a52e77a27a89b832'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.848367', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74a8761c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.008587864, 'message_signature': '47fc14b2fd8f167602892495a7459577e0311cb72c63e36260d4874784127a53'}]}, 'timestamp': '2025-11-23 10:09:10.849282', '_unique_id': '726bb5a44bce462397b88ffa707b92f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:09:10.850 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.850 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.851 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.851 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6174530-1ac4-4d3e-a7ad-d60d0335ee47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.851928', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74a8ef02-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '221bacf79790c5cb81697acafbd618900a9fc8f775beea10449de1cec69ff410'}]}, 'timestamp': '2025-11-23 10:09:10.852410', '_unique_id': '386c6a9a530a42acb3b7482cca055262'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.853 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.854 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.854 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75015848-d116-4dd4-8605-206fcc60be53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.854684', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74a95be0-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '980b6bd66e80b76f05761846a05b379597e6f05b37e7aec5d7359a0c259ab405'}]}, 'timestamp': '2025-11-23 10:09:10.855199', '_unique_id': '962b6cb628fd4b9dba1ccb487a5b317c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.856 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.857 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c445cc9f-cfc7-4d1e-a3c2-07e14eb245ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:09:10.857450', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '74ac44a4-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.051080084, 'message_signature': 'b65eac0ff552f12d7b9aaf84aea93dbe3b3a2fa6b895ae15b4d6992aa340ac47'}]}, 'timestamp': '2025-11-23 10:09:10.874338', '_unique_id': 'acb7f7b625464afc98bd3c843971345b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:09:10 localhost
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.875 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 05:09:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6c10830-feab-4a4b-80ff-a152d25b84b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.877052', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74acc438-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 
'message_signature': '1b568509196e656a030f32949643259f7a17ee3de3848c3aab5fc3df12de2319'}]}, 'timestamp': '2025-11-23 10:09:10.877581', '_unique_id': '2148469aedc6490db3cc51d04214e7b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.880 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'dda24e44-2c50-40b6-b245-519249754180', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.880303', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74ad437c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '429d6a5df70f19f001322f1200fa1beb55787fbe1dba36c3c143f0851889325d'}]}, 'timestamp': '2025-11-23 10:09:10.880793', '_unique_id': '56baef8356c14bea89c6b6744b66d75f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.882 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.883 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.883 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '93c3a3f5-5148-4598-a1b2-b87d71aa7eb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.883392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74adbbb8-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.008587864, 'message_signature': 'e11df3f07916355c95af7629e6b9a8bfbf5938f5f169d86716c67998926e4382'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.883392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74adce32-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.008587864, 'message_signature': '03ba519e9f7521726f21315546d2646a8090e738707a65a2259905f7fdf6dc9c'}]}, 'timestamp': '2025-11-23 10:09:10.884306', '_unique_id': '82c151f722054962841189e0472b358f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:09:10.885 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:09:10.885 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.885 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.886 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.886 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:09:10.913 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '856a24cb-7a89-4e08-a5d9-1bcda583e8b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.886755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74b22fcc-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': '16793775dfaae36bb33f5bccfe40b5b7d4c0660010c21bb1f2aacd1d45cea2f7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 
'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.886755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74b2482c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': '8850c42f846f7ea84cdf45fe720fb2feb77b7e8710e9307238d4e2c82f5be8d3'}]}, 'timestamp': '2025-11-23 10:09:10.913684', '_unique_id': 'f475d5fe816546f39d71eed3286afa5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in 
connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR 
oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:09:10.915 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.915 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.916 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.917 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.917 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57ade1fa-7593-40ba-8e59-0e9840c829ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.917097', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74b2e110-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': 
'975c37f3df41b55a6989d5cef2c85e8336a8a7d546a9571d65ef05c6b4c50490'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.917097', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74b2f79a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': 'd7f7fab5f55970dda02982caa22116d1304d73ebac1e8805c9c4f1b2bb448a08'}]}, 'timestamp': '2025-11-23 10:09:10.918201', '_unique_id': 'a687bbf5bca54438af003ea76d849525'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:09:10.919 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.919 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.920 12 
DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.921 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4bd14e3-101c-4363-b4e5-4df8f82bc4c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.920930', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 
'message_id': '74b37756-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': 'c034db4930dbd26fff653c9198feb4116ca7ff56530c0d88307ecd4bade5105b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.920930', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74b38d22-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': 'c9221814fad82b6a122fef89666e80fe48007adc578cfd1804ad730b352a90da'}]}, 'timestamp': '2025-11-23 10:09:10.922018', '_unique_id': 'c3f71d8dee7b48c3b7405835ce2c39fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:09:10.923 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging return 
rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.923 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.924 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.write.bytes in the context of pollsters Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1281d4d-0c9e-4163-b914-cb216c275362', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.924788', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74b41396-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': '3aeeb77ddb35ad73d4bc583aaffe063303899ce3641a0254d6d564c59bc82469'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.924788', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74b42b7e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': '7c93112dc8a47b0070c065d4f7ba32101d8cc706282716e4f610050cc32060a6'}]}, 'timestamp': '2025-11-23 10:09:10.926110', '_unique_id': '00ed2e286dc34cae8aee061e01ca553e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.927 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.928 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 19160000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '901f8448-4523-4919-964a-9a10c8a3449f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19160000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:09:10.929129', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '74b4b828-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.051080084, 'message_signature': 'd31991ddb85b33220d067d5b04520f0d861e2f0b38536c9e48f0a9e1257772a5'}]}, 'timestamp': '2025-11-23 10:09:10.929653', '_unique_id': 'a90b1e80c9dd4328a9aadfcf103ae2da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.932 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.932 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7ad922ec-2a2f-40e1-b41f-9078ea7b8f8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.932222', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74b531b8-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': 'cb175cd43986f18a0bd6f2fadadb2f6c9848ce28937d6709472dd0960e732187'}]}, 'timestamp': '2025-11-23 10:09:10.932789', '_unique_id': 'cfe541e6826c4ef1bfbf22d712c07e1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.933 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.935 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.935 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.935 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b3f7c93-da73-4033-9592-0cc9d382c56b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.935292', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74b5a74c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': 'e44d4925c3793e27aa56fc1d60a632a5ed27f54803d4e0f5fb36d75df117920f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.935292', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74b5bbe2-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': '89d1168bf4bd01c8fce6fbc922b4248a2f105e5ad0f28ea8e0780ae94d2d99a0'}]}, 'timestamp': '2025-11-23 10:09:10.936287', '_unique_id': '6e700391597242718a5ac47d2d9acffe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.937 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.938 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.939 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a74ef4f-a9f2-4d48-9c62-070bc54ab63f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.939122', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74b63d06-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '930ccd4bc1201414ec7cb9aaf860918fa7771719a60d1ff494959738a266e79b'}]}, 'timestamp': '2025-11-23 10:09:10.939618', '_unique_id': 'e96f68bb3e2142d2b91a96b5f88f92ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.940 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.942 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.942 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.942 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '637b73e3-e30a-4d35-8556-f65697835a55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:09:10.942380', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '74b6bc2c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': 'dce375e97a805a841e3b1d086f3381e10a0480d39f27030ea95ba46ed0531977'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:09:10.942380', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '74b6d0e0-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12494.064405495, 'message_signature': 'eed851b50249efdb75074f112dce89ea9a6f03c268f6aa4cab493db68ad32df4'}]}, 'timestamp': '2025-11-23 10:09:10.943379', '_unique_id': '2d5d04b18d4343edbf341525f56ad6d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.944 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.945 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.946 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '80128c08-c122-4e5d-a878-d53208dd6fd7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:09:10.946072', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '74b74caa-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12493.991232381, 'message_signature': '2d99af144745d7427ce0c80fdc622256b32564875b1e9736c8626df796afcbc5'}]}, 'timestamp': '2025-11-23 10:09:10.946652', '_unique_id': 'a1a4bc49764a4ad282cb35b2a21e899e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:09:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:09:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:09:10.947 12 ERROR oslo_messaging.notify.messaging Nov 23 05:09:11 localhost ceph-mon[300199]: 
from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:09:11 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 23 05:09:11 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 23 05:09:11 localhost podman[240668]: time="2025-11-23T10:09:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:09:11 localhost podman[240668]: @ - - [23/Nov/2025:10:09:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 05:09:11 localhost podman[240668]: @ - - [23/Nov/2025:10:09:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18789 "" "Go-http-client/1.1" Nov 23 05:09:13 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch Nov 23 05:09:13 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:09:13 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": 
"client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:09:14 localhost nova_compute[281952]: 2025-11-23 10:09:14.095 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:14 localhost nova_compute[281952]: 2025-11-23 10:09:14.097 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:14 localhost nova_compute[281952]: 2025-11-23 10:09:14.098 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:09:14 localhost nova_compute[281952]: 2025-11-23 10:09:14.098 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:14 localhost nova_compute[281952]: 2025-11-23 10:09:14.130 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:14 localhost nova_compute[281952]: 2025-11-23 10:09:14.131 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:14 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:09:14 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", 
"format": "json"} : dispatch Nov 23 05:09:14 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:09:14 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:09:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:09:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:09:16 localhost systemd[1]: tmp-crun.piUVU2.mount: Deactivated successfully. 
Nov 23 05:09:16 localhost podman[335964]: 2025-11-23 10:09:16.037168858 +0000 UTC m=+0.091243159 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 05:09:16 localhost podman[335964]: 2025-11-23 10:09:16.074495843 +0000 UTC m=+0.128570234 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 05:09:16 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:09:16 localhost podman[335965]: 2025-11-23 10:09:16.086409542 +0000 UTC m=+0.136444802 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 23 05:09:16 localhost podman[335965]: 2025-11-23 10:09:16.099310391 +0000 UTC m=+0.149345651 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=edpm) Nov 23 05:09:16 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 05:09:16 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch Nov 23 05:09:16 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"} : dispatch Nov 23 05:09:16 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"}]': finished Nov 23 05:09:17 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 23 05:09:17 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 23 05:09:17 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 23 05:09:19 localhost nova_compute[281952]: 2025-11-23 10:09:19.132 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:19 localhost nova_compute[281952]: 2025-11-23 10:09:19.134 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:19 localhost nova_compute[281952]: 2025-11-23 10:09:19.134 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:09:19 localhost nova_compute[281952]: 2025-11-23 10:09:19.134 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:19 localhost nova_compute[281952]: 2025-11-23 10:09:19.176 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:19 localhost nova_compute[281952]: 2025-11-23 10:09:19.177 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:19 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:09:20 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch Nov 23 05:09:20 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:09:20 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1431575460", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/561ab685-42f2-4920-b91d-2420296f93f0/e7620f1a-9169-43a2-9a7b-ca15e95e1997", "osd", "allow rw pool=manila_data namespace=fsvolumens_561ab685-42f2-4920-b91d-2420296f93f0", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:09:21 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:09:21 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:09:21 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:09:23 localhost nova_compute[281952]: 2025-11-23 10:09:23.231 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:09:23 localhost nova_compute[281952]: 2025-11-23 10:09:23.231 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 05:09:23 
localhost nova_compute[281952]: 2025-11-23 10:09:23.232 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 05:09:23 localhost nova_compute[281952]: 2025-11-23 10:09:23.651 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 05:09:23 localhost nova_compute[281952]: 2025-11-23 10:09:23.652 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 05:09:23 localhost nova_compute[281952]: 2025-11-23 10:09:23.652 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 05:09:23 localhost nova_compute[281952]: 2025-11-23 10:09:23.653 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 05:09:24 localhost nova_compute[281952]: 2025-11-23 10:09:24.160 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": 
"bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 05:09:24 localhost nova_compute[281952]: 2025-11-23 10:09:24.178 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:24 localhost nova_compute[281952]: 2025-11-23 10:09:24.182 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 05:09:24 localhost nova_compute[281952]: 2025-11-23 10:09:24.183 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 05:09:24 
localhost nova_compute[281952]: 2025-11-23 10:09:24.183 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:24 localhost nova_compute[281952]: 2025-11-23 10:09:24.186 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:09:24 localhost nova_compute[281952]: 2025-11-23 10:09:24.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:09:24 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1431575460", "format": "json"} : dispatch Nov 23 05:09:24 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"} : dispatch Nov 23 05:09:24 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1431575460"}]': finished Nov 23 05:09:24 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:09:24 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 23 05:09:24 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' 
cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 23 05:09:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:09:25 localhost nova_compute[281952]: 2025-11-23 10:09:25.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:09:25 localhost nova_compute[281952]: 2025-11-23 10:09:25.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:09:25 localhost nova_compute[281952]: 2025-11-23 10:09:25.242 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:09:25 localhost nova_compute[281952]: 2025-11-23 10:09:25.243 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:09:25 localhost nova_compute[281952]: 2025-11-23 10:09:25.244 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:09:25 localhost nova_compute[281952]: 2025-11-23 10:09:25.244 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 05:09:25 localhost nova_compute[281952]: 2025-11-23 10:09:25.245 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:09:25 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:09:25 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/2374868706' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:09:25 localhost nova_compute[281952]: 2025-11-23 10:09:25.683 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:09:25 localhost nova_compute[281952]: 2025-11-23 10:09:25.761 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:09:25 localhost nova_compute[281952]: 2025-11-23 10:09:25.762 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.012 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.015 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11047MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.015 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.016 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.093 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.094 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.094 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.114 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing inventories for resource provider dae70d62-10f4-474c-9782-8c926a3641d5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.142 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating ProviderTree inventory for provider dae70d62-10f4-474c-9782-8c926a3641d5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 23 
05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.143 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Updating inventory in ProviderTree for provider dae70d62-10f4-474c-9782-8c926a3641d5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.168 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing aggregate associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.192 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Refreshing trait associations for resource provider dae70d62-10f4-474c-9782-8c926a3641d5, traits: 
COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,HW_CPU_X86_SSE42,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AESNI,HW_CPU_X86_SSSE3,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,HW_CPU_X86_MMX _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.247 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:09:26 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": 
"json"} v 0) Nov 23 05:09:26 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/610067357' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.713 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.719 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.738 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.740 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 05:09:26 localhost nova_compute[281952]: 2025-11-23 10:09:26.740 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.724s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:09:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:09:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:09:27 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow r pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:09:27 localhost nova_compute[281952]: 2025-11-23 10:09:27.735 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:09:27 localhost nova_compute[281952]: 2025-11-23 10:09:27.736 281956 DEBUG 
oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:09:27 localhost nova_compute[281952]: 2025-11-23 10:09:27.736 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:09:27 localhost nova_compute[281952]: 2025-11-23 10:09:27.737 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 05:09:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:09:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:09:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:09:28 localhost systemd[1]: tmp-crun.SkKjzu.mount: Deactivated successfully. 
Nov 23 05:09:28 localhost podman[336050]: 2025-11-23 10:09:28.038393823 +0000 UTC m=+0.087703463 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2) Nov 23 05:09:28 localhost podman[336052]: 2025-11-23 10:09:28.062281674 +0000 UTC m=+0.103662754 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, 
io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9) Nov 23 05:09:28 localhost podman[336052]: 2025-11-23 10:09:28.148301785 +0000 UTC m=+0.189682795 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, vcs-type=git, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, release=1755695350) Nov 23 05:09:28 localhost podman[336050]: 2025-11-23 10:09:28.160212064 +0000 UTC m=+0.209521663 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller) Nov 23 05:09:28 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:09:28 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 05:09:28 localhost podman[336051]: 2025-11-23 10:09:28.1521176 +0000 UTC m=+0.197880973 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:09:28 localhost nova_compute[281952]: 2025-11-23 10:09:28.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:09:28 localhost podman[336051]: 2025-11-23 10:09:28.231933733 +0000 UTC m=+0.277697146 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 05:09:28 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:09:29 localhost nova_compute[281952]: 2025-11-23 10:09:29.185 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:29 localhost nova_compute[281952]: 2025-11-23 10:09:29.187 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:29 localhost nova_compute[281952]: 2025-11-23 10:09:29.187 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:09:29 localhost nova_compute[281952]: 2025-11-23 10:09:29.187 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:29 localhost nova_compute[281952]: 2025-11-23 10:09:29.221 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:29 localhost nova_compute[281952]: 2025-11-23 10:09:29.221 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:09:29 localhost openstack_network_exporter[242668]: ERROR 10:09:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:09:29 localhost openstack_network_exporter[242668]: ERROR 10:09:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:09:29 localhost openstack_network_exporter[242668]: ERROR 10:09:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:09:29 localhost openstack_network_exporter[242668]: ERROR 10:09:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:09:29 localhost openstack_network_exporter[242668]: Nov 23 05:09:29 localhost openstack_network_exporter[242668]: ERROR 10:09:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:09:29 localhost openstack_network_exporter[242668]: Nov 23 05:09:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 23 05:09:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 23 05:09:31 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": 
"client.alice bob"}]': finished Nov 23 05:09:34 localhost nova_compute[281952]: 2025-11-23 10:09:34.222 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:34 localhost nova_compute[281952]: 2025-11-23 10:09:34.224 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:34 localhost nova_compute[281952]: 2025-11-23 10:09:34.225 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:09:34 localhost nova_compute[281952]: 2025-11-23 10:09:34.225 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:34 localhost nova_compute[281952]: 2025-11-23 10:09:34.262 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:34 localhost nova_compute[281952]: 2025-11-23 10:09:34.263 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:34 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 23 05:09:34 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": 
"json"} : dispatch Nov 23 05:09:34 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:09:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:09:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:09:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. Nov 23 05:09:35 localhost podman[336112]: 2025-11-23 10:09:35.025121034 +0000 UTC m=+0.077028831 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', 
'--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 05:09:35 localhost podman[336112]: 2025-11-23 10:09:35.036330512 +0000 UTC m=+0.088238279 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) 
Nov 23 05:09:35 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 05:09:35 localhost podman[336111]: 2025-11-23 10:09:35.086161133 +0000 UTC m=+0.135797031 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:09:35 
localhost podman[336111]: 2025-11-23 10:09:35.097456523 +0000 UTC m=+0.147092671 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.build-date=20251118, managed_by=edpm_ansible, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:09:35 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 05:09:37 localhost nova_compute[281952]: 2025-11-23 10:09:37.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:09:38 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e258 e258: 6 total, 6 up, 6 in Nov 23 05:09:39 localhost nova_compute[281952]: 2025-11-23 10:09:39.264 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:39 localhost nova_compute[281952]: 2025-11-23 10:09:39.266 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:39 localhost nova_compute[281952]: 2025-11-23 10:09:39.266 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:09:39 localhost nova_compute[281952]: 2025-11-23 10:09:39.266 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:39 localhost nova_compute[281952]: 2025-11-23 10:09:39.298 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:39 localhost nova_compute[281952]: 2025-11-23 10:09:39.299 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:09:41 localhost podman[240668]: 
time="2025-11-23T10:09:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:09:41 localhost podman[240668]: @ - - [23/Nov/2025:10:09:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 05:09:41 localhost podman[240668]: @ - - [23/Nov/2025:10:09:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18788 "" "Go-http-client/1.1" Nov 23 05:09:42 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 23 05:09:42 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6,allow rw path=/volumes/_nogroup/6734401e-5573-4709-85e4-c69140f6c86e/7aacd64e-4c3a-45c6-ac2d-a36c3c0cca66", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4,allow rw pool=manila_data namespace=fsvolumens_6734401e-5573-4709-85e4-c69140f6c86e"]} : dispatch Nov 23 05:09:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e259 e259: 6 total, 6 up, 6 in Nov 23 05:09:43 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6,allow rw path=/volumes/_nogroup/6734401e-5573-4709-85e4-c69140f6c86e/7aacd64e-4c3a-45c6-ac2d-a36c3c0cca66", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4,allow rw pool=manila_data 
namespace=fsvolumens_6734401e-5573-4709-85e4-c69140f6c86e"]}]': finished Nov 23 05:09:43 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 23 05:09:44 localhost nova_compute[281952]: 2025-11-23 10:09:44.299 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:44 localhost nova_compute[281952]: 2025-11-23 10:09:44.301 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:44 localhost nova_compute[281952]: 2025-11-23 10:09:44.302 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:09:44 localhost nova_compute[281952]: 2025-11-23 10:09:44.302 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:44 localhost nova_compute[281952]: 2025-11-23 10:09:44.339 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:44 localhost nova_compute[281952]: 2025-11-23 10:09:44.340 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:44 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:09:46 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 23 
05:09:46 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4"]} : dispatch Nov 23 05:09:46 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/c01bfd1f-c2a4-4505-b9fb-5784100d38f4/80c7ebbb-31aa-44c5-8825-14a5c437eff6", "osd", "allow rw pool=manila_data namespace=fsvolumens_c01bfd1f-c2a4-4505-b9fb-5784100d38f4"]}]': finished Nov 23 05:09:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:09:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:09:47 localhost systemd[1]: tmp-crun.svnan1.mount: Deactivated successfully. 
Nov 23 05:09:47 localhost podman[336153]: 2025-11-23 10:09:47.087653618 +0000 UTC m=+0.093947261 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 05:09:47 localhost podman[336154]: 2025-11-23 10:09:47.128447204 +0000 UTC m=+0.131400814 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 
'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 05:09:47 localhost podman[336153]: 2025-11-23 10:09:47.146637484 +0000 UTC m=+0.152931117 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 
05:09:47 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:09:47 localhost podman[336154]: 2025-11-23 10:09:47.161880153 +0000 UTC m=+0.164833773 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible) Nov 23 05:09:47 localhost 
systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 05:09:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e260 e260: 6 total, 6 up, 6 in Nov 23 05:09:49 localhost nova_compute[281952]: 2025-11-23 10:09:49.341 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:49 localhost nova_compute[281952]: 2025-11-23 10:09:49.343 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:09:49 localhost nova_compute[281952]: 2025-11-23 10:09:49.343 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:09:49 localhost nova_compute[281952]: 2025-11-23 10:09:49.343 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:49 localhost nova_compute[281952]: 2025-11-23 10:09:49.386 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:49 localhost nova_compute[281952]: 2025-11-23 10:09:49.387 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:09:49 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:09:50 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 23 05:09:50 localhost ceph-mon[300199]: from='mgr.44369 
172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Nov 23 05:09:50 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished Nov 23 05:09:51 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 05:09:51 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:09:53 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e261 e261: 6 total, 6 up, 6 in Nov 23 05:09:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:09:53.517 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:09:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:09:53.519 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 05:09:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:09:53.519 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': 
'21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 05:09:53 localhost nova_compute[281952]: 2025-11-23 10:09:53.550 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:54 localhost nova_compute[281952]: 2025-11-23 10:09:54.388 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:54 localhost nova_compute[281952]: 2025-11-23 10:09:54.389 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:09:54 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:09:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e262 e262: 6 total, 6 up, 6 in Nov 23 05:09:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:09:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:09:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 05:09:59 localhost podman[336282]: 2025-11-23 10:09:59.043974525 +0000 UTC m=+0.096435498 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:09:59 localhost podman[336283]: 2025-11-23 10:09:59.087214866 +0000 UTC 
m=+0.135907254 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7) Nov 23 05:09:59 localhost podman[336281]: 2025-11-23 10:09:59.129861008 +0000 UTC m=+0.183554630 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 23 05:09:59 localhost podman[336282]: 2025-11-23 10:09:59.156193038 +0000 UTC m=+0.208654011 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:09:59 localhost 
podman[336283]: 2025-11-23 10:09:59.180751914 +0000 UTC m=+0.229444292 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter) Nov 23 05:09:59 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:09:59 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 05:09:59 localhost podman[336281]: 2025-11-23 10:09:59.248145378 +0000 UTC m=+0.301838970 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2) Nov 23 05:09:59 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:09:59 localhost nova_compute[281952]: 2025-11-23 10:09:59.390 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:59 localhost nova_compute[281952]: 2025-11-23 10:09:59.395 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:09:59 localhost sshd[336342]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:09:59 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:09:59 localhost openstack_network_exporter[242668]: ERROR 10:09:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:09:59 localhost openstack_network_exporter[242668]: ERROR 10:09:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:09:59 localhost openstack_network_exporter[242668]: ERROR 10:09:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:09:59 localhost openstack_network_exporter[242668]: ERROR 10:09:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:09:59 localhost openstack_network_exporter[242668]: Nov 23 05:09:59 localhost openstack_network_exporter[242668]: ERROR 10:09:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:09:59 localhost openstack_network_exporter[242668]: Nov 23 05:10:00 localhost ceph-mon[300199]: overall HEALTH_OK Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0. 
Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.440354) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58 Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892602440378, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2770, "num_deletes": 256, "total_data_size": 3083694, "memory_usage": 3133792, "flush_reason": "Manual Compaction"} Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892602449421, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 1997751, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 32968, "largest_seqno": 35733, "table_properties": {"data_size": 1987130, "index_size": 6490, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27052, "raw_average_key_size": 22, "raw_value_size": 1963988, "raw_average_value_size": 1608, "num_data_blocks": 281, "num_entries": 1221, "num_filter_entries": 1221, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892480, "oldest_key_time": 1763892480, "file_creation_time": 1763892602, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}} Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 9120 microseconds, and 3763 cpu microseconds. Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.449469) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 1997751 bytes OK Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.449488) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.451538) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.451552) EVENT_LOG_v1 {"time_micros": 1763892602451548, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.451568) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 3070480, prev total WAL file 
size 3070480, number of live WAL files 2. Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.452093) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133303532' seq:72057594037927935, type:22 .. '7061786F73003133333034' seq:0, type:0; will stop at (end) Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(1950KB)], [57(17MB)] Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892602452124, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 20452859, "oldest_snapshot_seqno": -1} Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 14366 keys, 18624489 bytes, temperature: kUnknown Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892602518234, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 18624489, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18539868, "index_size": 47592, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35973, "raw_key_size": 385495, "raw_average_key_size": 26, "raw_value_size": 
18293034, "raw_average_value_size": 1273, "num_data_blocks": 1778, "num_entries": 14366, "num_filter_entries": 14366, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892602, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}} Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.518619) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 18624489 bytes Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.521276) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 308.7 rd, 281.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 17.6 +0.0 blob) out(17.8 +0.0 blob), read-write-amplify(19.6) write-amplify(9.3) OK, records in: 14899, records dropped: 533 output_compression: NoCompression Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.521303) EVENT_LOG_v1 {"time_micros": 1763892602521291, "job": 34, "event": "compaction_finished", "compaction_time_micros": 66258, "compaction_time_cpu_micros": 33290, "output_level": 6, "num_output_files": 1, "total_output_size": 18624489, "num_input_records": 14899, "num_output_records": 14366, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892602521707, "job": 34, "event": "table_file_deletion", "file_number": 59} Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892602524255, 
"job": 34, "event": "table_file_deletion", "file_number": 57} Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.452052) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.524367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.524374) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.524378) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.524381) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:10:02 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:10:02.524384) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:10:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e263 e263: 6 total, 6 up, 6 in Nov 23 05:10:04 localhost nova_compute[281952]: 2025-11-23 10:10:04.421 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:04 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e263 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:10:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:10:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:10:06 localhost systemd[1]: tmp-crun.6mUJxt.mount: Deactivated successfully. Nov 23 05:10:06 localhost podman[336344]: 2025-11-23 10:10:06.04225159 +0000 UTC m=+0.098963406 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 23 05:10:06 localhost podman[336345]: 2025-11-23 10:10:06.08352259 
+0000 UTC m=+0.137035108 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 05:10:06 localhost podman[336344]: 2025-11-23 10:10:06.103708361 +0000 UTC m=+0.160420187 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 23 05:10:06 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 05:10:06 localhost podman[336345]: 2025-11-23 10:10:06.119161737 +0000 UTC m=+0.172674245 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 05:10:06 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 05:10:08 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e264 e264: 6 total, 6 up, 6 in Nov 23 05:10:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:10:09.307 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:10:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:10:09.307 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:10:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:10:09.308 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:10:09 localhost nova_compute[281952]: 2025-11-23 10:10:09.423 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:09 localhost nova_compute[281952]: 2025-11-23 10:10:09.425 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:09 localhost nova_compute[281952]: 2025-11-23 10:10:09.425 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:10:09 localhost nova_compute[281952]: 2025-11-23 10:10:09.426 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 
23 05:10:09 localhost nova_compute[281952]: 2025-11-23 10:10:09.457 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:10:09 localhost nova_compute[281952]: 2025-11-23 10:10:09.458 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:09 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:10:11 localhost podman[240668]: time="2025-11-23T10:10:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:10:11 localhost podman[240668]: @ - - [23/Nov/2025:10:10:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 05:10:11 localhost podman[240668]: @ - - [23/Nov/2025:10:10:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18788 "" "Go-http-client/1.1" Nov 23 05:10:13 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e265 e265: 6 total, 6 up, 6 in Nov 23 05:10:14 localhost nova_compute[281952]: 2025-11-23 10:10:14.459 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:14 localhost nova_compute[281952]: 2025-11-23 10:10:14.491 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:14 localhost nova_compute[281952]: 2025-11-23 10:10:14.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:10:14 localhost 
nova_compute[281952]: 2025-11-23 10:10:14.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:14 localhost nova_compute[281952]: 2025-11-23 10:10:14.492 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:14 localhost nova_compute[281952]: 2025-11-23 10:10:14.493 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:10:14 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:10:15 localhost sshd[336390]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:10:15 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Nov 23 05:10:15 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/346654a1-8043-457a-94b6-3b076c21a1d5/cb129bf7-168e-4c36-9f32-82b35d885736", "osd", "allow rw pool=manila_data namespace=fsvolumens_346654a1-8043-457a-94b6-3b076c21a1d5", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:10:15 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/346654a1-8043-457a-94b6-3b076c21a1d5/cb129bf7-168e-4c36-9f32-82b35d885736", "osd", "allow rw pool=manila_data namespace=fsvolumens_346654a1-8043-457a-94b6-3b076c21a1d5", "mon", "allow r"], "format": 
"json"}]': finished Nov 23 05:10:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:10:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:10:18 localhost systemd[1]: tmp-crun.kYhvw4.mount: Deactivated successfully. Nov 23 05:10:18 localhost podman[336392]: 2025-11-23 10:10:18.049810712 +0000 UTC m=+0.100490223 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 23 05:10:18 localhost systemd[1]: tmp-crun.mT7iwP.mount: Deactivated successfully. 
Nov 23 05:10:18 localhost podman[336393]: 2025-11-23 10:10:18.092005701 +0000 UTC m=+0.136314696 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm) Nov 23 05:10:18 localhost podman[336393]: 2025-11-23 10:10:18.100077579 +0000 UTC m=+0.144386554 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:10:18 localhost podman[336392]: 2025-11-23 10:10:18.113993638 +0000 UTC m=+0.164673219 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 05:10:18 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 05:10:18 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. 
Nov 23 05:10:18 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e266 e266: 6 total, 6 up, 6 in Nov 23 05:10:19 localhost nova_compute[281952]: 2025-11-23 10:10:19.494 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:19 localhost nova_compute[281952]: 2025-11-23 10:10:19.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:19 localhost nova_compute[281952]: 2025-11-23 10:10:19.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:10:19 localhost nova_compute[281952]: 2025-11-23 10:10:19.496 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:19 localhost nova_compute[281952]: 2025-11-23 10:10:19.520 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:10:19 localhost nova_compute[281952]: 2025-11-23 10:10:19.520 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:19 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:10:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 e267: 6 total, 6 up, 6 in Nov 23 05:10:23 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.214 
281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 05:10:24 localhost ceph-mgr[288287]: client.0 ms_handle_reset on v2:172.18.0.106:6810/2037590349 Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.403 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.403 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.404 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 05:10:24 localhost 
nova_compute[281952]: 2025-11-23 10:10:24.404 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.521 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.523 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.523 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.523 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.571 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.572 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.844 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 
355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.866 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.866 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 05:10:24 localhost 
nova_compute[281952]: 2025-11-23 10:10:24.867 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:10:24 localhost nova_compute[281952]: 2025-11-23 10:10:24.868 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:10:25 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-272049507", "format": "json"} : dispatch Nov 23 05:10:26 localhost nova_compute[281952]: 2025-11-23 10:10:26.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:10:26 localhost nova_compute[281952]: 2025-11-23 10:10:26.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:10:26 localhost nova_compute[281952]: 2025-11-23 10:10:26.215 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:10:26 localhost nova_compute[281952]: 2025-11-23 10:10:26.216 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - 
-] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 05:10:26 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-272049507", "caps": ["mds", "allow rw path=/volumes/_nogroup/b9ef5055-ce0d-4f29-b449-58f39c1f00af/ad7774b9-dfc0-45be-aa87-07b19521a8eb", "osd", "allow rw pool=manila_data namespace=fsvolumens_b9ef5055-ce0d-4f29-b449-58f39c1f00af", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:10:26 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-272049507", "caps": ["mds", "allow rw path=/volumes/_nogroup/b9ef5055-ce0d-4f29-b449-58f39c1f00af/ad7774b9-dfc0-45be-aa87-07b19521a8eb", "osd", "allow rw pool=manila_data namespace=fsvolumens_b9ef5055-ce0d-4f29-b449-58f39c1f00af", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:10:27 localhost nova_compute[281952]: 2025-11-23 10:10:27.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:10:27 localhost nova_compute[281952]: 2025-11-23 10:10:27.253 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:10:27 localhost nova_compute[281952]: 2025-11-23 10:10:27.254 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:10:27 localhost nova_compute[281952]: 2025-11-23 10:10:27.254 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:10:27 localhost nova_compute[281952]: 2025-11-23 10:10:27.255 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 05:10:27 localhost nova_compute[281952]: 2025-11-23 10:10:27.255 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:10:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:10:27 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1336387364' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:10:27 localhost nova_compute[281952]: 2025-11-23 10:10:27.711 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:10:27 localhost nova_compute[281952]: 2025-11-23 10:10:27.796 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:10:27 localhost nova_compute[281952]: 2025-11-23 10:10:27.797 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:10:28 localhost nova_compute[281952]: 2025-11-23 10:10:28.027 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 05:10:28 localhost nova_compute[281952]: 2025-11-23 10:10:28.028 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11035MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 05:10:28 localhost nova_compute[281952]: 2025-11-23 10:10:28.029 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:10:28 localhost nova_compute[281952]: 2025-11-23 10:10:28.029 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:10:28 localhost nova_compute[281952]: 2025-11-23 10:10:28.118 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 05:10:28 localhost nova_compute[281952]: 2025-11-23 10:10:28.118 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 05:10:28 localhost nova_compute[281952]: 2025-11-23 10:10:28.118 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 05:10:28 localhost nova_compute[281952]: 2025-11-23 10:10:28.184 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:10:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:10:28 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/2277080277' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:10:28 localhost nova_compute[281952]: 2025-11-23 10:10:28.661 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:10:28 localhost nova_compute[281952]: 2025-11-23 10:10:28.669 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 05:10:28 localhost nova_compute[281952]: 2025-11-23 10:10:28.703 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 05:10:28 localhost nova_compute[281952]: 2025-11-23 10:10:28.706 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 05:10:28 localhost nova_compute[281952]: 2025-11-23 10:10:28.706 281956 DEBUG 
oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:10:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:10:29 localhost nova_compute[281952]: 2025-11-23 10:10:29.570 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:10:29 localhost nova_compute[281952]: 2025-11-23 10:10:29.706 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:10:29 localhost nova_compute[281952]: 2025-11-23 10:10:29.707 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:10:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:10:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:10:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 05:10:29 localhost openstack_network_exporter[242668]: ERROR 10:10:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:10:29 localhost openstack_network_exporter[242668]: ERROR 10:10:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:10:29 localhost openstack_network_exporter[242668]: ERROR 10:10:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:10:29 localhost openstack_network_exporter[242668]: ERROR 10:10:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:10:29 localhost openstack_network_exporter[242668]: Nov 23 05:10:29 localhost openstack_network_exporter[242668]: ERROR 10:10:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:10:29 localhost openstack_network_exporter[242668]: Nov 23 05:10:30 localhost podman[336477]: 2025-11-23 10:10:30.043452276 +0000 UTC m=+0.095316214 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:10:30 localhost podman[336479]: 2025-11-23 10:10:30.108971371 +0000 UTC m=+0.154161265 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 23 05:10:30 localhost podman[336477]: 2025-11-23 10:10:30.12841508 +0000 UTC m=+0.180278988 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 05:10:30 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. Nov 23 05:10:30 localhost podman[336478]: 2025-11-23 10:10:30.148130276 +0000 UTC m=+0.199576022 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true) Nov 23 05:10:30 localhost podman[336478]: 2025-11-23 10:10:30.182527056 +0000 UTC m=+0.233972812 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, 
managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2) Nov 23 05:10:30 localhost podman[336479]: 2025-11-23 10:10:30.19436331 +0000 UTC m=+0.239553194 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 23 05:10:30 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:10:30 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:10:31 localhost systemd[1]: tmp-crun.XLMcqY.mount: Deactivated successfully. 
Nov 23 05:10:32 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-272049507", "format": "json"} : dispatch Nov 23 05:10:32 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-272049507"} : dispatch Nov 23 05:10:32 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-272049507"}]': finished Nov 23 05:10:32 localhost sshd[336540]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:10:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:10:34 localhost nova_compute[281952]: 2025-11-23 10:10:34.573 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:34 localhost nova_compute[281952]: 2025-11-23 10:10:34.575 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:34 localhost nova_compute[281952]: 2025-11-23 10:10:34.575 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:10:34 localhost nova_compute[281952]: 2025-11-23 10:10:34.575 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:34 localhost nova_compute[281952]: 2025-11-23 10:10:34.619 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:10:34 localhost nova_compute[281952]: 2025-11-23 10:10:34.620 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:36 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Nov 23 05:10:36 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Nov 23 05:10:36 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished Nov 23 05:10:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:10:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:10:37 localhost podman[336542]: 2025-11-23 10:10:37.047144928 +0000 UTC m=+0.090034481 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:10:37 localhost podman[336542]: 2025-11-23 10:10:37.058177048 +0000 UTC m=+0.101066621 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.build-date=20251118) Nov 23 05:10:37 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 05:10:37 localhost podman[336543]: 2025-11-23 10:10:37.14761652 +0000 UTC m=+0.188836911 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 05:10:37 localhost podman[336543]: 2025-11-23 10:10:37.156366589 +0000 UTC m=+0.197586940 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 05:10:37 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 05:10:39 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch Nov 23 05:10:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:10:39 localhost nova_compute[281952]: 2025-11-23 10:10:39.621 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:39 localhost nova_compute[281952]: 2025-11-23 10:10:39.623 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:39 localhost nova_compute[281952]: 2025-11-23 10:10:39.623 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:10:39 localhost nova_compute[281952]: 2025-11-23 10:10:39.624 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:39 localhost nova_compute[281952]: 2025-11-23 10:10:39.652 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:10:39 localhost nova_compute[281952]: 2025-11-23 10:10:39.652 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:41 localhost podman[240668]: time="2025-11-23T10:10:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:10:41 localhost podman[240668]: @ - - [23/Nov/2025:10:10:41 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 05:10:41 localhost podman[240668]: @ - - [23/Nov/2025:10:10:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18784 "" "Go-http-client/1.1" Nov 23 05:10:42 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Nov 23 05:10:43 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/73463559-de38-4e65-91fe-e256e1993ef1/5697482a-5566-4614-b3e6-cbfdb66450d2", "osd", "allow rw pool=manila_data namespace=fsvolumens_73463559-de38-4e65-91fe-e256e1993ef1", "mon", "allow r"], "format": "json"} : dispatch Nov 23 05:10:43 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/73463559-de38-4e65-91fe-e256e1993ef1/5697482a-5566-4614-b3e6-cbfdb66450d2", "osd", "allow rw pool=manila_data namespace=fsvolumens_73463559-de38-4e65-91fe-e256e1993ef1", "mon", "allow r"], "format": "json"}]': finished Nov 23 05:10:44 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:10:44 localhost nova_compute[281952]: 2025-11-23 10:10:44.652 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:10:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 05:10:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:10:49 localhost podman[336586]: 2025-11-23 10:10:49.019430784 +0000 UTC m=+0.073701920 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 23 05:10:49 localhost podman[336586]: 2025-11-23 10:10:49.032740853 +0000 UTC m=+0.087011929 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 05:10:49 localhost systemd[1]: 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 05:10:49 localhost podman[336585]: 2025-11-23 10:10:49.07551993 +0000 UTC m=+0.130234560 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 05:10:49 localhost podman[336585]: 2025-11-23 10:10:49.087305032 +0000 UTC m=+0.142019662 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 23 05:10:49 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:10:49 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:10:49 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Nov 23 05:10:49 localhost nova_compute[281952]: 2025-11-23 10:10:49.654 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:49 localhost nova_compute[281952]: 2025-11-23 10:10:49.656 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:49 localhost nova_compute[281952]: 2025-11-23 10:10:49.656 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:10:49 localhost nova_compute[281952]: 2025-11-23 10:10:49.657 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:49 localhost nova_compute[281952]: 2025-11-23 10:10:49.689 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:10:49 localhost nova_compute[281952]: 2025-11-23 10:10:49.690 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering 
ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:52 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 05:10:52 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:10:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:10:54 localhost nova_compute[281952]: 2025-11-23 10:10:54.691 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:54 localhost nova_compute[281952]: 2025-11-23 10:10:54.693 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:10:54 localhost nova_compute[281952]: 2025-11-23 10:10:54.693 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:10:54 localhost nova_compute[281952]: 2025-11-23 10:10:54.694 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:54 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:10:54 localhost nova_compute[281952]: 2025-11-23 10:10:54.727 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:10:54 localhost nova_compute[281952]: 2025-11-23 10:10:54.728 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:10:56 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Nov 23 05:10:56 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Nov 23 05:10:56 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished Nov 23 05:10:58 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3. Nov 23 05:10:59 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:10:59 localhost nova_compute[281952]: 2025-11-23 10:10:59.728 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:10:59 localhost openstack_network_exporter[242668]: ERROR 10:10:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:10:59 localhost openstack_network_exporter[242668]: ERROR 10:10:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:10:59 localhost openstack_network_exporter[242668]: ERROR 10:10:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:10:59 localhost openstack_network_exporter[242668]: ERROR 10:10:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:10:59 localhost openstack_network_exporter[242668]: Nov 23 05:10:59 localhost openstack_network_exporter[242668]: 
ERROR 10:10:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:10:59 localhost openstack_network_exporter[242668]: Nov 23 05:11:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:11:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:11:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:11:01 localhost podman[336712]: 2025-11-23 10:11:01.04473959 +0000 UTC m=+0.092720344 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Nov 23 05:11:01 localhost podman[336713]: 2025-11-23 10:11:01.132805731 +0000 UTC m=+0.176167012 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS) Nov 23 05:11:01 localhost podman[336714]: 2025-11-23 10:11:01.144015476 +0000 UTC m=+0.185007314 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Nov 23 05:11:01 localhost podman[336714]: 2025-11-23 10:11:01.158611535 +0000 UTC m=+0.199603433 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm) Nov 23 05:11:01 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 05:11:01 localhost podman[336712]: 2025-11-23 10:11:01.215104683 +0000 UTC m=+0.263085497 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 23 05:11:01 localhost podman[336713]: 2025-11-23 10:11:01.21629469 +0000 UTC m=+0.259655951 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:11:01 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:11:01 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:11:04 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:11:04 localhost sshd[336771]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:11:04 localhost nova_compute[281952]: 2025-11-23 10:11:04.732 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:11:04 localhost nova_compute[281952]: 2025-11-23 10:11:04.734 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:11:04 localhost nova_compute[281952]: 2025-11-23 10:11:04.735 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:11:04 localhost nova_compute[281952]: 2025-11-23 10:11:04.735 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:11:04 localhost nova_compute[281952]: 2025-11-23 10:11:04.772 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:04 localhost nova_compute[281952]: 2025-11-23 10:11:04.773 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:11:05 localhost sshd[336773]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:11:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:11:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:11:08 localhost podman[336775]: 2025-11-23 10:11:08.054680533 +0000 UTC m=+0.096293275 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0) Nov 23 05:11:08 localhost podman[336775]: 2025-11-23 10:11:08.091577188 +0000 UTC m=+0.133189940 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:11:08 localhost podman[336776]: 2025-11-23 10:11:08.104738953 +0000 UTC m=+0.141468584 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, 
name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 05:11:08 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 05:11:08 localhost podman[336776]: 2025-11-23 10:11:08.113441101 +0000 UTC m=+0.150170772 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 23 05:11:08 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully.
Nov 23 05:11:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:09.309 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 05:11:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:09.310 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 05:11:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:09.311 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 05:11:09 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 05:11:09 localhost nova_compute[281952]: 2025-11-23 10:11:09.774 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 05:11:09 localhost nova_compute[281952]: 2025-11-23 10:11:09.776 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 05:11:09 localhost nova_compute[281952]: 2025-11-23 10:11:09.776 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 23 05:11:09 localhost nova_compute[281952]: 2025-11-23 10:11:09.777 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 05:11:09 localhost nova_compute[281952]: 2025-11-23 10:11:09.822 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:11:09 localhost nova_compute[281952]: 2025-11-23 10:11:09.824 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.811 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.812 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.818 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c044dc1e-d845-4631-87f1-40c48b89fe62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.813126', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc2a73dc-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': 'a2cae6bdfd2042323766e92e7143c13940e5c696f555028865243072804b9178'}]}, 'timestamp': '2025-11-23 10:11:10.819531', '_unique_id': 'ce3ce53f917d4e55bf88b16107e18d54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR 
oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.821 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.822 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.822 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d8c91f8-1b4e-4acf-96b8-c56b460c26ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.822646', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc2b0928-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '192a7ac019ce8961884fda3054162537e2b84ae805e8f46d1452d0e8d1cb82ee'}]}, 'timestamp': '2025-11-23 10:11:10.823401', '_unique_id': 'd3e5e169446f4be9ae1934b34358663e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.824 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.825 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.826 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '128fe7a0-f909-4478-b807-da52d4bd867f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.826143', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc2b8c2c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '1854dbd02928e47a609ec330009c0af8a86e5968a7c7eb78c27903241b6c1a12'}]}, 'timestamp': '2025-11-23 10:11:10.826644', '_unique_id': '76dd963a44bf44adb0a79d672002c5e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.827 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.828 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.866 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.867 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e8ba88a-0dc7-4eca-ad5f-2f2405227d35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.828953', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc31c51a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': '16a0111bc600817b72c5da6b35cb85b571ce422fe68c0c57381daffb2ec8c0c5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.828953', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc31d7da-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': 'f71a2746f57edd5a1ee91f19704d888aed8ac9f20cafba7b6bc4d428cfce2a9c'}]}, 'timestamp': '2025-11-23 10:11:10.867935', '_unique_id': 'a6718563f3a24397b47dafab14ce8757'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.869 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.870 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.870 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ec4350b-2531-4909-a1cb-0bdc6a41b23d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.870698', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc32596c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '0b39fa9c9bba480867327869de48275c63dbd1f07b26064b77a8011a23949db0'}]}, 'timestamp': '2025-11-23 10:11:10.871345', '_unique_id': '52b35bc914c646abb170efff0b13e0f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.872 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.873 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.873 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.873 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '7d8c5721-48d7-46cf-96e5-1f9efeee0e01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.873703', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc32cdfc-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': 'd36c17a32c031d2ab617009bf305d4f9c38a317375843a24f5bddb73cb4b9233'}]}, 'timestamp': '2025-11-23 10:11:10.874197', '_unique_id': '02d58bc412974dbc8cb25f48409cf7e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:11:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.875 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.876 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.876 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f4d012c-3f72-423e-9e19-9c96df8a3305', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.876394', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc333620-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': 'c84478cddd61d74e47f0a7844471f199915000dc58cdf0058531410c239bfe0b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.876394', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc334836-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': '1e24acd1784bfe66d91eef2df82d0b6e71d1b2c24aacc62fba4b4579a40e36c3'}]}, 'timestamp': '2025-11-23 10:11:10.877295', '_unique_id': 'a614cd4b9dbe49559361a7932869526f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.878 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.879 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.880 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '587cb7a5-45fd-43b5-9ec5-228105158d77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.879517', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc33b0e6-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': 'e0c1666708b5824bedbea23259218f887aa53edb200e2e30866925598edff8e0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.879517', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc33c31a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': 'e89fc9353d299b03062bf8372eacf6655a7c733f60b1d711315d982bfd255bab'}]}, 'timestamp': '2025-11-23 10:11:10.880436', '_unique_id': '9d825c5a624948919a2a0b83d1b47783'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.881 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.882 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.883 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7072318-00ce-47bc-87b9-22bd52c3b39b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.882635', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc3429d6-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': '857a9704127f035d837f9b8a6cc52ffd413f842b39fda9a0b25a86460a7abaae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': 
'2025-11-23T10:11:10.882635', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc343bce-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': 'cdaa17316256b60f96b542ec58d5d8f285430779072da27e263ad627170e1de7'}]}, 'timestamp': '2025-11-23 10:11:10.883528', '_unique_id': '3c37f7593c684f3cbcc372f326206a81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.884 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.885 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.885 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c464c1f6-eb46-404d-8b02-bd412a265a88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.885766', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc34a596-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '37a0738185c9be8ada3bf82d83f14308d9770ff60cf1c2231d9f6434139d7ada'}]}, 'timestamp': '2025-11-23 10:11:10.886268', '_unique_id': 'b6ec2a1511bf454c8b3e4ba18706d224'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.887 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.888 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.907 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58e19235-d0df-41ae-95f0-dc23ee9affbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:11:10.888445', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'bc37f2be-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.084885156, 'message_signature': '45b186f6a671743d39d0a5c42d161a72cf5555ad4f99781ca762bb13fe713d6b'}]}, 'timestamp': '2025-11-23 10:11:10.907970', '_unique_id': '8876dc64b5384674ae523f27fc146896'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.909 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.910 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.910 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36b760e8-4e37-4887-99a5-7782d78d46cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.910351', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc38649c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '10e9a687abddb9c870feb54242fab2b6408777b6225fd3060c90a3203b297578'}]}, 'timestamp': '2025-11-23 10:11:10.911407', '_unique_id': '834744a332ea4977b0c779d05e1971d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.912 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.913 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.924 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.925 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '917fe17b-83d5-490d-80af-31598967a2d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.913592', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc3aa040-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.091225791, 'message_signature': '5c31d5599f8411af3a14b160757b829691f2de204d2abfed28edd71c6046e921'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.913592', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc3ab116-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.091225791, 'message_signature': 'b6dc8d7a337fe99cd4a7f6745cac3085d457c3d6374d9de9f30e28a7c5d6c11a'}]}, 'timestamp': '2025-11-23 10:11:10.925868', '_unique_id': '7d4ffb2c310044ceae54b78bf241dd93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]:
2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.926 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.928 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.928 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0971faa6-2a75-4467-b95b-ddb1216b7357', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.928251', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc3b2038-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.091225791, 'message_signature': '5e8d6067c23da7852dbeece06c2da138c57424e50191d35e3bf9350d7074fc45'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.928251', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc3b31e0-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.091225791, 'message_signature': 'ef831a7d1f384d216e1d65d287b3af494c80ab324ddbd58dd09e09d25be7a9dc'}]}, 'timestamp': '2025-11-23 10:11:10.929148', '_unique_id': '3a2725e2104a4abd8e7ab6b397e9c590'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:11:10.930 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:11:10.930 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.930 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.931 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3a7aeb4e-ca21-43e2-bae9-7ef6950ed457', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.931346', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc3b9874-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '45267dc2dc54b962024fd2642246b398dfb22388af7b8c92cb131ef37bd82af0'}]}, 'timestamp': '2025-11-23 10:11:10.931857', '_unique_id': '1895e26ce9bd4372a7227bb86a50e2b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.932 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.933 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.934 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6495e213-2cd4-4a91-8d85-32feec9dfcfd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.934151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc3c064c-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': 'f1d6005e5afd51ecfe7a68170857f9f8208077339395dcf6261fd9fc6a0f0bc9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.934151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc3c1650-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': '7ed75ce2e20bd4fe9aab38114ed780f3d8ba99762f209294433cdd56111938af'}]}, 'timestamp': '2025-11-23 10:11:10.935024', '_unique_id': 'e5bfa5ecc78c42359426ae6c5217bb44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.936 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.937 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.937 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.937 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bed0f083-b632-4bd2-a93a-af7dbd526518', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.937265', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc3c7fa0-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.091225791, 'message_signature': 'f28c6465d85e756f774119a99ea631324e980ec43d1dc6f317bbf64c2adeeff8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.937265', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc3c90c6-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.091225791, 'message_signature': '364706e242644de2fcdb044d38b32b5aea2f62da142cda3ea4525307343c8373'}]}, 'timestamp': '2025-11-23 10:11:10.938130', '_unique_id': '70aef3d6a6734082be90d3294d0334eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.939 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.940 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.940 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 19800000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac3cb9b0-b08c-4739-a62a-bf0fb51dc619', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19800000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:11:10.940299', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'bc3cf5de-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.084885156, 'message_signature': '5ea559a8cda5c1f6579c628f8d5b229e60fb0e13be435889d6e215092b8361cb'}]}, 'timestamp': '2025-11-23 10:11:10.940730', '_unique_id': '15ceb85f82c94335a80080b9d153614e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12
ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 
05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.941 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.942 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 23 05:11:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.943 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77b52242-8537-457b-b517-6ace28a10b2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.943111', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc3d63fc-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 
'message_signature': '4ed8f5adb2133e2f2bf30bed60f9cc06f5d739d82a6410a98e2ab0110fcc3877'}]}, 'timestamp': '2025-11-23 10:11:10.943566', '_unique_id': '2563a0ecbb234574a2bfb2d05f185195'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.944 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.945 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.945 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.945 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e3dad57f-423f-4d8b-852a-84356e72cb95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:11:10.945812', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': 'bc3dcf0e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12613.99078324, 'message_signature': '7eef0674d979818729c3be9b08fa5b9c60ad916e7ce81703ec4a95c888b1f8d1'}]}, 'timestamp': '2025-11-23 10:11:10.946304', '_unique_id': 'c63e9a162e454d57999ddba11a6e6d5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:11:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.947 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.948 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.948 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.948 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.949 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4157d870-e58a-48ff-b698-e6549f2d9a1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:11:10.948593', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bc3e3a2a-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': 'f2cc11f272a073b9adec58940189a6e67a3c0fc62c420a26b93b24e139a10228'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:11:10.948593', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bc3e4b6e-c854-11f0-bde4-fa163e72a351', 'monotonic_time': 12614.006595347, 'message_signature': '74846397c0385369cb5eec45262130f6e522ca7dbe935ae2265a9350e26cd7a4'}]}, 'timestamp': '2025-11-23 10:11:10.949462', '_unique_id': '6406d28ee7814055912a14f0a8279747'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:11:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:11:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:11:10.950 12 ERROR oslo_messaging.notify.messaging Nov 23 05:11:11 localhost podman[240668]: time="2025-11-23T10:11:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:11:11 localhost podman[240668]: @ - - [23/Nov/2025:10:11:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 05:11:11 localhost podman[240668]: @ - - [23/Nov/2025:10:11:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18792 "" "Go-http-client/1.1" Nov 23 05:11:12 localhost sshd[336814]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0. 
Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.520221) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61 Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892673520281, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 1388, "num_deletes": 255, "total_data_size": 1910281, "memory_usage": 1935664, "flush_reason": "Manual Compaction"} Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892673535494, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 1008116, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 35738, "largest_seqno": 37121, "table_properties": {"data_size": 1003172, "index_size": 2223, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 14202, "raw_average_key_size": 21, "raw_value_size": 991949, "raw_average_value_size": 1528, "num_data_blocks": 98, "num_entries": 649, "num_filter_entries": 649, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892603, "oldest_key_time": 1763892603, "file_creation_time": 1763892673, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}} Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 15343 microseconds, and 5637 cpu microseconds. Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.535563) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 1008116 bytes OK Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.535591) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.537183) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.537205) EVENT_LOG_v1 {"time_micros": 1763892673537197, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.537231) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 1903394, prev total WAL file 
size 1903718, number of live WAL files 2. Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.538025) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034323538' seq:72057594037927935, type:22 .. '6D6772737461740034353131' seq:0, type:0; will stop at (end) Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(984KB)], [60(17MB)] Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892673538091, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 19632605, "oldest_snapshot_seqno": -1} Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 14508 keys, 17739657 bytes, temperature: kUnknown Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892673605550, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 17739657, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17657700, "index_size": 44608, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36293, "raw_key_size": 389229, "raw_average_key_size": 26, "raw_value_size": 
17412065, "raw_average_value_size": 1200, "num_data_blocks": 1652, "num_entries": 14508, "num_filter_entries": 14508, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892673, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}} Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.605862) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 17739657 bytes Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.607486) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 290.7 rd, 262.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 17.8 +0.0 blob) out(16.9 +0.0 blob), read-write-amplify(37.1) write-amplify(17.6) OK, records in: 15015, records dropped: 507 output_compression: NoCompression Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.607516) EVENT_LOG_v1 {"time_micros": 1763892673607503, "job": 36, "event": "compaction_finished", "compaction_time_micros": 67534, "compaction_time_cpu_micros": 27202, "output_level": 6, "num_output_files": 1, "total_output_size": 17739657, "num_input_records": 15015, "num_output_records": 14508, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892673607789, "job": 36, "event": "table_file_deletion", "file_number": 62} Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892673610694, 
"job": 36, "event": "table_file_deletion", "file_number": 60} Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.537867) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.610728) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.610733) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.610736) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.610740) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:11:13 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:11:13.610743) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:11:14 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:14.086 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:11:14 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:14.087 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying 
updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 05:11:14 localhost nova_compute[281952]: 2025-11-23 10:11:14.129 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:14 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:11:14 localhost nova_compute[281952]: 2025-11-23 10:11:14.823 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:17 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:17.089 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 05:11:19 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:11:19 localhost nova_compute[281952]: 2025-11-23 10:11:19.826 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:11:19 localhost nova_compute[281952]: 2025-11-23 10:11:19.828 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:11:19 localhost nova_compute[281952]: 2025-11-23 10:11:19.828 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:11:19 localhost nova_compute[281952]: 2025-11-23 10:11:19.828 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:11:19 localhost nova_compute[281952]: 2025-11-23 10:11:19.857 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:19 localhost nova_compute[281952]: 2025-11-23 10:11:19.858 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:11:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. Nov 23 05:11:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. 
Nov 23 05:11:20 localhost podman[336817]: 2025-11-23 10:11:20.019926081 +0000 UTC m=+0.074860665 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:11:20 localhost podman[336817]: 2025-11-23 10:11:20.032811837 +0000 UTC m=+0.087746431 container exec_died 
db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:11:20 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. 
Nov 23 05:11:20 localhost podman[336816]: 2025-11-23 10:11:20.131053611 +0000 UTC m=+0.186070678 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 05:11:20 localhost podman[336816]: 2025-11-23 10:11:20.168317287 +0000 UTC m=+0.223334284 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 05:11:20 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:11:23 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e268 e268: 6 total, 6 up, 6 in Nov 23 05:11:24 localhost nova_compute[281952]: 2025-11-23 10:11:24.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:11:24 localhost nova_compute[281952]: 2025-11-23 10:11:24.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 05:11:24 localhost nova_compute[281952]: 2025-11-23 10:11:24.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 05:11:24 localhost nova_compute[281952]: 2025-11-23 10:11:24.355 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 05:11:24 localhost nova_compute[281952]: 2025-11-23 10:11:24.356 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 05:11:24 
localhost nova_compute[281952]: 2025-11-23 10:11:24.356 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 05:11:24 localhost nova_compute[281952]: 2025-11-23 10:11:24.357 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 05:11:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:11:24 localhost nova_compute[281952]: 2025-11-23 10:11:24.820 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", 
"ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 05:11:24 localhost nova_compute[281952]: 2025-11-23 10:11:24.848 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 05:11:24 localhost nova_compute[281952]: 2025-11-23 10:11:24.849 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 05:11:24 localhost nova_compute[281952]: 2025-11-23 10:11:24.905 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:11:26 localhost nova_compute[281952]: 2025-11-23 10:11:26.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:11:26 localhost nova_compute[281952]: 2025-11-23 10:11:26.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.213 281956 DEBUG oslo_service.periodic_task [None 
req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.238 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.238 281956 
DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.238 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.239 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:11:27 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:11:27 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1273085929' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.701 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.792 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.793 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.965 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.966 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11028MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.966 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:11:27 localhost nova_compute[281952]: 2025-11-23 10:11:27.966 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:11:28 localhost nova_compute[281952]: 2025-11-23 10:11:28.053 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 05:11:28 localhost nova_compute[281952]: 2025-11-23 10:11:28.054 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 05:11:28 localhost nova_compute[281952]: 2025-11-23 10:11:28.054 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 05:11:28 localhost nova_compute[281952]: 2025-11-23 10:11:28.107 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:11:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:11:28 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/2430657483' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:11:28 localhost nova_compute[281952]: 2025-11-23 10:11:28.507 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:11:28 localhost nova_compute[281952]: 2025-11-23 10:11:28.513 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 05:11:28 localhost nova_compute[281952]: 2025-11-23 10:11:28.532 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 05:11:28 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e269 e269: 6 total, 6 up, 6 in Nov 23 05:11:28 localhost nova_compute[281952]: 2025-11-23 10:11:28.534 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 05:11:28 localhost nova_compute[281952]: 2025-11-23 10:11:28.535 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:11:29 localhost nova_compute[281952]: 2025-11-23 10:11:29.532 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:11:29 localhost nova_compute[281952]: 2025-11-23 10:11:29.533 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:11:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:11:29 localhost nova_compute[281952]: 2025-11-23 10:11:29.910 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:29 localhost openstack_network_exporter[242668]: ERROR 10:11:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:11:29 localhost openstack_network_exporter[242668]: ERROR 10:11:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:11:29 localhost openstack_network_exporter[242668]: ERROR 10:11:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket 
files found for the ovs db server Nov 23 05:11:29 localhost openstack_network_exporter[242668]: ERROR 10:11:29 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:11:29 localhost openstack_network_exporter[242668]: Nov 23 05:11:29 localhost openstack_network_exporter[242668]: ERROR 10:11:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:11:29 localhost openstack_network_exporter[242668]: Nov 23 05:11:30 localhost nova_compute[281952]: 2025-11-23 10:11:30.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:11:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:11:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:11:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 05:11:32 localhost podman[336904]: 2025-11-23 10:11:32.04110269 +0000 UTC m=+0.093103856 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, vendor=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 
'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 23 05:11:32 localhost podman[336904]: 2025-11-23 10:11:32.054360868 +0000 UTC m=+0.106362034 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41) Nov 23 05:11:32 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 05:11:32 localhost podman[336903]: 2025-11-23 10:11:32.146506144 +0000 UTC m=+0.200241513 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 23 05:11:32 localhost podman[336903]: 2025-11-23 10:11:32.176471255 +0000 UTC 
m=+0.230206634 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Nov 23 05:11:32 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 05:11:32 localhost podman[336902]: 2025-11-23 10:11:32.180920413 +0000 UTC m=+0.237264313 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:11:32 localhost podman[336902]: 2025-11-23 10:11:32.291089753 +0000 UTC m=+0.347433723 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3) Nov 23 05:11:32 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:11:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:11:34 localhost nova_compute[281952]: 2025-11-23 10:11:34.913 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:11:34 localhost nova_compute[281952]: 2025-11-23 10:11:34.915 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:11:34 localhost nova_compute[281952]: 2025-11-23 10:11:34.915 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:11:34 localhost nova_compute[281952]: 2025-11-23 10:11:34.915 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:11:34 localhost nova_compute[281952]: 2025-11-23 10:11:34.943 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:34 localhost nova_compute[281952]: 2025-11-23 10:11:34.944 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:11:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:11:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:11:39 localhost podman[336964]: 2025-11-23 10:11:39.027107177 +0000 UTC m=+0.081236811 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 05:11:39 localhost podman[336964]: 2025-11-23 10:11:39.038850028 +0000 UTC m=+0.092979652 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 23 05:11:39 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 05:11:39 localhost podman[336963]: 2025-11-23 10:11:39.132588732 +0000 UTC m=+0.187166460 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:11:39 localhost podman[336963]: 2025-11-23 10:11:39.146617634 +0000 UTC m=+0.201195362 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible) Nov 23 05:11:39 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 05:11:39 localhost nova_compute[281952]: 2025-11-23 10:11:39.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:11:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:11:39 localhost nova_compute[281952]: 2025-11-23 10:11:39.945 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:11:39 localhost nova_compute[281952]: 2025-11-23 10:11:39.946 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:11:39 localhost nova_compute[281952]: 2025-11-23 10:11:39.947 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:11:39 localhost nova_compute[281952]: 2025-11-23 10:11:39.947 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:11:39 localhost nova_compute[281952]: 2025-11-23 10:11:39.998 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:39 localhost nova_compute[281952]: 2025-11-23 10:11:39.998 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:11:40 localhost sshd[337006]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:11:41 localhost podman[240668]: 
time="2025-11-23T10:11:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:11:41 localhost podman[240668]: @ - - [23/Nov/2025:10:11:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 05:11:41 localhost podman[240668]: @ - - [23/Nov/2025:10:11:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18785 "" "Go-http-client/1.1" Nov 23 05:11:43 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e270 e270: 6 total, 6 up, 6 in Nov 23 05:11:43 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:11:43.948 263258 INFO neutron.agent.linux.ip_lib [None req-525f7317-de4a-43be-8b86-b61be5a2a9bf - - - - - -] Device tap4117331a-ec cannot be used as it has no MAC address#033[00m Nov 23 05:11:44 localhost nova_compute[281952]: 2025-11-23 10:11:44.018 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:44 localhost kernel: device tap4117331a-ec entered promiscuous mode Nov 23 05:11:44 localhost nova_compute[281952]: 2025-11-23 10:11:44.026 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:44 localhost ovn_controller[154788]: 2025-11-23T10:11:44Z|00560|binding|INFO|Claiming lport 4117331a-ec62-4320-8258-283ce7293851 for this chassis. Nov 23 05:11:44 localhost ovn_controller[154788]: 2025-11-23T10:11:44Z|00561|binding|INFO|4117331a-ec62-4320-8258-283ce7293851: Claiming unknown Nov 23 05:11:44 localhost systemd-udevd[337018]: Network interface NamePolicy= disabled on kernel command line. 
Nov 23 05:11:44 localhost NetworkManager[5975]: [1763892704.0378] manager: (tap4117331a-ec): new Generic device (/org/freedesktop/NetworkManager/Devices/88) Nov 23 05:11:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:44.038 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e35eacc49dfa4dcdab7bc1d511e6db78', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7edc718d-b211-427a-a4c0-519da72bd765, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4117331a-ec62-4320-8258-283ce7293851) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:11:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:44.040 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 4117331a-ec62-4320-8258-283ce7293851 in datapath 5b79f11d-bf85-4ba4-8bb8-52ec94385ea7 bound to our chassis#033[00m Nov 23 05:11:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:44.041 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 
5f799f24-c8ba-4c61-a1b4-e05c3d9d02a9 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:11:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:44.041 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:11:44 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:44.042 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cd468ced-afc6-4451-af91-ca18dc72c19e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:11:44 localhost journal[230249]: ethtool ioctl error on tap4117331a-ec: No such device Nov 23 05:11:44 localhost journal[230249]: ethtool ioctl error on tap4117331a-ec: No such device Nov 23 05:11:44 localhost ovn_controller[154788]: 2025-11-23T10:11:44Z|00562|binding|INFO|Setting lport 4117331a-ec62-4320-8258-283ce7293851 ovn-installed in OVS Nov 23 05:11:44 localhost ovn_controller[154788]: 2025-11-23T10:11:44Z|00563|binding|INFO|Setting lport 4117331a-ec62-4320-8258-283ce7293851 up in Southbound Nov 23 05:11:44 localhost nova_compute[281952]: 2025-11-23 10:11:44.064 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:44 localhost journal[230249]: ethtool ioctl error on tap4117331a-ec: No such device Nov 23 05:11:44 localhost journal[230249]: ethtool ioctl error on tap4117331a-ec: No such device Nov 23 05:11:44 localhost journal[230249]: ethtool ioctl error on tap4117331a-ec: No such device Nov 23 05:11:44 localhost journal[230249]: ethtool ioctl error on tap4117331a-ec: No such device Nov 23 05:11:44 localhost journal[230249]: ethtool ioctl error on 
tap4117331a-ec: No such device Nov 23 05:11:44 localhost journal[230249]: ethtool ioctl error on tap4117331a-ec: No such device Nov 23 05:11:44 localhost nova_compute[281952]: 2025-11-23 10:11:44.100 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:44 localhost nova_compute[281952]: 2025-11-23 10:11:44.134 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:44 localhost nova_compute[281952]: 2025-11-23 10:11:44.316 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:44 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:11:45 localhost nova_compute[281952]: 2025-11-23 10:11:44.999 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:45 localhost podman[337089]: Nov 23 05:11:45 localhost podman[337089]: 2025-11-23 10:11:45.048573753 +0000 UTC m=+0.088137143 container create 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 23 05:11:45 localhost systemd[1]: Started libpod-conmon-716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757.scope. 
Nov 23 05:11:45 localhost podman[337089]: 2025-11-23 10:11:45.008607893 +0000 UTC m=+0.048171313 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:11:45 localhost systemd[1]: Started libcrun container. Nov 23 05:11:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddcb664480a4fa9e4b1887eb87a9eccbf38a1928d563f5f3df15bf5dcf54f58/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:11:45 localhost podman[337089]: 2025-11-23 10:11:45.135390274 +0000 UTC m=+0.174953664 container init 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:11:45 localhost podman[337089]: 2025-11-23 10:11:45.146193307 +0000 UTC m=+0.185756687 container start 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 05:11:45 localhost dnsmasq[337106]: started, version 2.85 cachesize 150 Nov 23 05:11:45 localhost dnsmasq[337106]: DNS service limited to local subnets Nov 23 05:11:45 localhost 
dnsmasq[337106]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 05:11:45 localhost dnsmasq[337106]: warning: no upstream servers configured Nov 23 05:11:45 localhost dnsmasq-dhcp[337106]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:11:45 localhost dnsmasq[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/addn_hosts - 0 addresses Nov 23 05:11:45 localhost dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/host Nov 23 05:11:45 localhost dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/opts Nov 23 05:11:45 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:11:45.210 263258 INFO neutron.agent.dhcp.agent [None req-0855374f-423e-4dac-87cd-d20513f98355 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:11:44Z, description=, device_id=2dc260f8-5d48-4126-8d42-5f3396592b40, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c1256eec-8812-4d83-8b39-ce4a1c9e8997, ip_allocation=immediate, mac_address=fa:16:3e:12:96:08, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:11:41Z, description=, dns_domain=, id=5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPITest-657383188-network, port_security_enabled=True, project_id=e35eacc49dfa4dcdab7bc1d511e6db78, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=32379, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3878, 
status=ACTIVE, subnets=['d87b1163-7cea-4d33-ab31-0f83a2c848f7'], tags=[], tenant_id=e35eacc49dfa4dcdab7bc1d511e6db78, updated_at=2025-11-23T10:11:42Z, vlan_transparent=None, network_id=5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, port_security_enabled=False, project_id=e35eacc49dfa4dcdab7bc1d511e6db78, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3886, status=DOWN, tags=[], tenant_id=e35eacc49dfa4dcdab7bc1d511e6db78, updated_at=2025-11-23T10:11:44Z on network 5b79f11d-bf85-4ba4-8bb8-52ec94385ea7#033[00m Nov 23 05:11:45 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:11:45.299 263258 INFO neutron.agent.dhcp.agent [None req-aa144582-ed6a-4d8e-9609-05c9b75fbf90 - - - - - -] DHCP configuration for ports {'11e31315-69a4-4bf2-baf7-a726200c0c1e'} is completed#033[00m Nov 23 05:11:45 localhost dnsmasq[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/addn_hosts - 1 addresses Nov 23 05:11:45 localhost dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/host Nov 23 05:11:45 localhost podman[337122]: 2025-11-23 10:11:45.421347444 +0000 UTC m=+0.049240176 container kill 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS) Nov 23 05:11:45 localhost dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/opts Nov 23 05:11:45 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:11:45.845 263258 INFO neutron.agent.dhcp.agent [None 
req-c17d2edb-f5b2-4c20-99bd-61629349aded - - - - - -] DHCP configuration for ports {'c1256eec-8812-4d83-8b39-ce4a1c9e8997'} is completed#033[00m Nov 23 05:11:46 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:11:46.385 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:11:44Z, description=, device_id=2dc260f8-5d48-4126-8d42-5f3396592b40, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c1256eec-8812-4d83-8b39-ce4a1c9e8997, ip_allocation=immediate, mac_address=fa:16:3e:12:96:08, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:11:41Z, description=, dns_domain=, id=5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPITest-657383188-network, port_security_enabled=True, project_id=e35eacc49dfa4dcdab7bc1d511e6db78, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=32379, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3878, status=ACTIVE, subnets=['d87b1163-7cea-4d33-ab31-0f83a2c848f7'], tags=[], tenant_id=e35eacc49dfa4dcdab7bc1d511e6db78, updated_at=2025-11-23T10:11:42Z, vlan_transparent=None, network_id=5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, port_security_enabled=False, project_id=e35eacc49dfa4dcdab7bc1d511e6db78, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3886, status=DOWN, tags=[], tenant_id=e35eacc49dfa4dcdab7bc1d511e6db78, updated_at=2025-11-23T10:11:44Z on network 5b79f11d-bf85-4ba4-8bb8-52ec94385ea7#033[00m Nov 23 05:11:46 localhost dnsmasq[337106]: read 
/var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/addn_hosts - 1 addresses Nov 23 05:11:46 localhost dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/host Nov 23 05:11:46 localhost podman[337158]: 2025-11-23 10:11:46.582183176 +0000 UTC m=+0.036278407 container kill 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 23 05:11:46 localhost dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/opts Nov 23 05:11:46 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:11:46.799 263258 INFO neutron.agent.dhcp.agent [None req-9d3015ee-edc4-4e21-909d-12fc048c4fc0 - - - - - -] DHCP configuration for ports {'c1256eec-8812-4d83-8b39-ce4a1c9e8997'} is completed#033[00m Nov 23 05:11:48 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e271 e271: 6 total, 6 up, 6 in Nov 23 05:11:49 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:11:50 localhost nova_compute[281952]: 2025-11-23 10:11:50.001 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:11:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 05:11:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:11:51 localhost podman[337179]: 2025-11-23 10:11:51.006943497 +0000 UTC m=+0.063210386 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251118, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:11:51 localhost podman[337179]: 2025-11-23 10:11:51.020579417 +0000 UTC m=+0.076846266 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute) Nov 23 05:11:51 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: 
Deactivated successfully. Nov 23 05:11:51 localhost systemd[1]: tmp-crun.TmHOYG.mount: Deactivated successfully. Nov 23 05:11:51 localhost podman[337178]: 2025-11-23 10:11:51.068679867 +0000 UTC m=+0.126040690 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 05:11:51 localhost podman[337178]: 2025-11-23 10:11:51.074830626 +0000 UTC m=+0.132191489 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': 
'/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 05:11:51 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:11:52 localhost systemd[1]: tmp-crun.v1EGwy.mount: Deactivated successfully. Nov 23 05:11:52 localhost dnsmasq[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/addn_hosts - 0 addresses Nov 23 05:11:52 localhost dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/host Nov 23 05:11:52 localhost dnsmasq-dhcp[337106]: read /var/lib/neutron/dhcp/5b79f11d-bf85-4ba4-8bb8-52ec94385ea7/opts Nov 23 05:11:52 localhost podman[337287]: 2025-11-23 10:11:52.812007723 +0000 UTC m=+0.076923498 container kill 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118) Nov 23 05:11:53 localhost kernel: device tap4117331a-ec left promiscuous mode Nov 23 05:11:53 localhost nova_compute[281952]: 2025-11-23 10:11:53.002 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:53 localhost ovn_controller[154788]: 2025-11-23T10:11:53Z|00564|binding|INFO|Releasing lport 4117331a-ec62-4320-8258-283ce7293851 from this chassis (sb_readonly=0) Nov 23 05:11:53 
localhost ovn_controller[154788]: 2025-11-23T10:11:53Z|00565|binding|INFO|Setting lport 4117331a-ec62-4320-8258-283ce7293851 down in Southbound Nov 23 05:11:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:53.011 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e35eacc49dfa4dcdab7bc1d511e6db78', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7edc718d-b211-427a-a4c0-519da72bd765, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4117331a-ec62-4320-8258-283ce7293851) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:11:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:53.012 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 4117331a-ec62-4320-8258-283ce7293851 in datapath 5b79f11d-bf85-4ba4-8bb8-52ec94385ea7 unbound from our chassis#033[00m Nov 23 05:11:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:53.015 160439 DEBUG 
neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:11:53 localhost ovn_metadata_agent[160434]: 2025-11-23 10:11:53.016 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5eb2b6-c73b-4c57-a444-9b8c975f8834]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:11:53 localhost nova_compute[281952]: 2025-11-23 10:11:53.020 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:53 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 05:11:53 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:11:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:11:54 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:11:54 localhost ovn_controller[154788]: 2025-11-23T10:11:54Z|00566|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:11:54 localhost nova_compute[281952]: 2025-11-23 10:11:54.923 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:55 localhost nova_compute[281952]: 2025-11-23 10:11:55.004 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:55 localhost systemd[1]: tmp-crun.nebXYs.mount: 
Deactivated successfully. Nov 23 05:11:55 localhost dnsmasq[337106]: exiting on receipt of SIGTERM Nov 23 05:11:55 localhost podman[337363]: 2025-11-23 10:11:55.307027562 +0000 UTC m=+0.066312452 container kill 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:11:55 localhost systemd[1]: libpod-716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757.scope: Deactivated successfully. Nov 23 05:11:55 localhost podman[337378]: 2025-11-23 10:11:55.375709106 +0000 UTC m=+0.053567489 container died 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:11:55 localhost podman[337378]: 2025-11-23 10:11:55.42037578 +0000 UTC m=+0.098234103 container cleanup 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:11:55 localhost systemd[1]: libpod-conmon-716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757.scope: Deactivated successfully. Nov 23 05:11:55 localhost podman[337379]: 2025-11-23 10:11:55.458629557 +0000 UTC m=+0.132013763 container remove 716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b79f11d-bf85-4ba4-8bb8-52ec94385ea7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118) Nov 23 05:11:55 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:11:55.483 263258 INFO neutron.agent.dhcp.agent [None req-73031e67-7e55-4efc-80ab-1acfd1d70688 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:11:55 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:11:55.680 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:11:56 localhost systemd[1]: tmp-crun.b4iYMM.mount: Deactivated successfully. Nov 23 05:11:56 localhost systemd[1]: var-lib-containers-storage-overlay-cddcb664480a4fa9e4b1887eb87a9eccbf38a1928d563f5f3df15bf5dcf54f58-merged.mount: Deactivated successfully. Nov 23 05:11:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-716b60519dee5b877a4654afe74b3d0f5461d5ec0e05bc1addf65918caaf5757-userdata-shm.mount: Deactivated successfully. 
Nov 23 05:11:56 localhost systemd[1]: run-netns-qdhcp\x2d5b79f11d\x2dbf85\x2d4ba4\x2d8bb8\x2d52ec94385ea7.mount: Deactivated successfully. Nov 23 05:11:57 localhost nova_compute[281952]: 2025-11-23 10:11:57.213 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:57 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 05:11:57 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 4849 writes, 37K keys, 4849 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.05 MB/s#012Cumulative WAL: 4849 writes, 4849 syncs, 1.00 writes per sync, written: 0.06 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2570 writes, 13K keys, 2570 commit groups, 1.0 writes per commit group, ingest: 19.64 MB, 0.03 MB/s#012Interval WAL: 2570 writes, 2570 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 170.0 0.25 0.11 18 0.014 0 0 0.0 0.0#012 L6 1/0 16.92 MB 0.0 0.3 0.0 0.3 0.3 0.0 0.0 6.5 219.3 200.6 1.41 0.76 17 0.083 222K 8796 0.0 0.0#012 Sum 1/0 16.92 MB 0.0 0.3 0.0 0.3 0.3 0.1 0.0 7.5 185.8 196.0 1.66 0.88 35 0.048 222K 8796 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 11.2 181.4 183.8 0.69 0.35 14 0.050 98K 3749 0.0 0.0#012#012** Compaction Stats [default] 
**#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low 0/0 0.00 KB 0.0 0.3 0.0 0.3 0.3 0.0 0.0 0.0 219.3 200.6 1.41 0.76 17 0.083 222K 8796 0.0 0.0#012High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 171.9 0.25 0.11 17 0.015 0 0 0.0 0.0#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.042, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.32 GB write, 0.27 MB/s write, 0.30 GB read, 0.26 MB/s read, 1.7 seconds#012Interval compaction: 0.12 GB write, 0.21 MB/s write, 0.12 GB read, 0.21 MB/s read, 0.7 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5651615b9350#2 capacity: 304.00 MB usage: 26.98 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000195 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1501,25.54 MB,8.40219%) FilterBlock(35,649.30 KB,0.208579%) IndexBlock(35,822.02 KB,0.264062%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Nov 23 05:11:58 localhost sshd[337406]: main: sshd: ssh-rsa 
algorithm is disabled Nov 23 05:11:58 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e272 e272: 6 total, 6 up, 6 in Nov 23 05:11:59 localhost nova_compute[281952]: 2025-11-23 10:11:59.287 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:11:59 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:12:00 localhost openstack_network_exporter[242668]: ERROR 10:11:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:12:00 localhost openstack_network_exporter[242668]: ERROR 10:12:00 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:12:00 localhost openstack_network_exporter[242668]: ERROR 10:11:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:12:00 localhost openstack_network_exporter[242668]: ERROR 10:12:00 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:12:00 localhost openstack_network_exporter[242668]: Nov 23 05:12:00 localhost nova_compute[281952]: 2025-11-23 10:12:00.024 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:00 localhost openstack_network_exporter[242668]: ERROR 10:12:00 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:12:00 localhost openstack_network_exporter[242668]: Nov 23 05:12:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:12:00 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1566503771' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:12:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:12:00 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1566503771' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:12:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 05:12:00 localhost ceph-osd[31905]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 9000.1 total, 600.0 interval#012Cumulative writes: 19K writes, 73K keys, 19K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.01 MB/s#012Cumulative WAL: 19K writes, 6508 syncs, 2.94 writes per sync, written: 0.06 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 11K writes, 43K keys, 11K commit groups, 1.0 writes per commit group, ingest: 32.13 MB, 0.05 MB/s#012Interval WAL: 11K writes, 4855 syncs, 2.45 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 05:12:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:12:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:12:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. 
Nov 23 05:12:02 localhost podman[337409]: 2025-11-23 10:12:02.333216034 +0000 UTC m=+0.110744208 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:12:02 localhost podman[337409]: 2025-11-23 10:12:02.339814128 +0000 UTC 
m=+0.117342302 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true) Nov 23 05:12:02 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 05:12:02 localhost podman[337434]: 2025-11-23 10:12:02.416359353 +0000 UTC m=+0.075275637 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 23 05:12:02 localhost podman[337408]: 2025-11-23 10:12:02.522567441 +0000 UTC m=+0.303141979 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350) Nov 23 05:12:02 localhost podman[337434]: 2025-11-23 10:12:02.54658741 +0000 UTC m=+0.205503654 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true) Nov 23 05:12:02 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:12:02 localhost podman[337408]: 2025-11-23 10:12:02.566371059 +0000 UTC m=+0.346945627 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container) Nov 23 05:12:02 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:12:03 localhost systemd[1]: tmp-crun.alrday.mount: Deactivated successfully. 
Nov 23 05:12:03 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 e273: 6 total, 6 up, 6 in Nov 23 05:12:03 localhost ovn_controller[154788]: 2025-11-23T10:12:03Z|00567|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:12:03 localhost nova_compute[281952]: 2025-11-23 10:12:03.994 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:04 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:12:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 23 05:12:04 localhost ceph-osd[32858]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 9000.1 total, 600.0 interval#012Cumulative writes: 27K writes, 102K keys, 27K commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.01 MB/s#012Cumulative WAL: 27K writes, 9964 syncs, 2.79 writes per sync, written: 0.09 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 17K writes, 61K keys, 17K commit groups, 1.0 writes per commit group, ingest: 49.73 MB, 0.08 MB/s#012Interval WAL: 17K writes, 7175 syncs, 2.42 writes per sync, written: 0.05 GB, 0.08 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 23 05:12:05 localhost nova_compute[281952]: 2025-11-23 10:12:05.062 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:07 localhost sshd[337469]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0. 
Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.570079) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64 Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892728570119, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 1082, "num_deletes": 259, "total_data_size": 2010227, "memory_usage": 2030128, "flush_reason": "Manual Compaction"} Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892728576976, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 1323910, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37126, "largest_seqno": 38203, "table_properties": {"data_size": 1319327, "index_size": 2118, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11039, "raw_average_key_size": 20, "raw_value_size": 1309602, "raw_average_value_size": 2385, "num_data_blocks": 93, "num_entries": 549, "num_filter_entries": 549, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892673, "oldest_key_time": 1763892673, "file_creation_time": 1763892728, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}} Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 6962 microseconds, and 2935 cpu microseconds. Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.577035) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 1323910 bytes OK Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.577058) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.579003) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.579016) EVENT_LOG_v1 {"time_micros": 1763892728579011, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.579035) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 2004734, prev total WAL file 
size 2005058, number of live WAL files 2. Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.581116) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034353239' seq:72057594037927935, type:22 .. '6C6F676D0034373831' seq:0, type:0; will stop at (end) Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(1292KB)], [63(16MB)] Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892728581149, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 19063567, "oldest_snapshot_seqno": -1} Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 14517 keys, 18926856 bytes, temperature: kUnknown Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892728641104, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 18926856, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18843182, "index_size": 46261, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36357, "raw_key_size": 390669, "raw_average_key_size": 26, "raw_value_size": 18595694, 
"raw_average_value_size": 1280, "num_data_blocks": 1718, "num_entries": 14517, "num_filter_entries": 14517, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892728, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}} Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.641531) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 18926856 bytes Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.643206) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 317.1 rd, 314.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 16.9 +0.0 blob) out(18.1 +0.0 blob), read-write-amplify(28.7) write-amplify(14.3) OK, records in: 15057, records dropped: 540 output_compression: NoCompression Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.643234) EVENT_LOG_v1 {"time_micros": 1763892728643220, "job": 38, "event": "compaction_finished", "compaction_time_micros": 60117, "compaction_time_cpu_micros": 28094, "output_level": 6, "num_output_files": 1, "total_output_size": 18926856, "num_input_records": 15057, "num_output_records": 14517, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892728643664, "job": 38, "event": "table_file_deletion", "file_number": 65} Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892728646056, 
"job": 38, "event": "table_file_deletion", "file_number": 63} Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.581065) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.646170) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.646176) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.646178) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.646180) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:12:08 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:08.646189) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:12:09 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:12:09.023 263258 INFO neutron.agent.linux.ip_lib [None req-1bad5901-8116-484e-9c67-16d4f9024020 - - - - - -] Device tap8d34d210-e3 cannot be used as it has no MAC address#033[00m Nov 23 05:12:09 localhost nova_compute[281952]: 2025-11-23 10:12:09.098 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:09 localhost kernel: device tap8d34d210-e3 entered promiscuous mode Nov 23 05:12:09 localhost NetworkManager[5975]: [1763892729.1067] manager: (tap8d34d210-e3): new Generic device (/org/freedesktop/NetworkManager/Devices/89) Nov 23 05:12:09 localhost ovn_controller[154788]: 2025-11-23T10:12:09Z|00568|binding|INFO|Claiming lport 8d34d210-e3a3-4e23-87f0-ec4920bc623f 
for this chassis. Nov 23 05:12:09 localhost ovn_controller[154788]: 2025-11-23T10:12:09Z|00569|binding|INFO|8d34d210-e3a3-4e23-87f0-ec4920bc623f: Claiming unknown Nov 23 05:12:09 localhost nova_compute[281952]: 2025-11-23 10:12:09.107 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:09 localhost systemd-udevd[337481]: Network interface NamePolicy= disabled on kernel command line. Nov 23 05:12:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:12:09.117 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-2b870224-659d-44c3-8ebb-5c0e146f53c2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b870224-659d-44c3-8ebb-5c0e146f53c2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89472f96637447cc97b8792018a15a8e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=883be6eb-f071-49b9-9f06-2b23859983f7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8d34d210-e3a3-4e23-87f0-ec4920bc623f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:12:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:12:09.119 
160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8d34d210-e3a3-4e23-87f0-ec4920bc623f in datapath 2b870224-659d-44c3-8ebb-5c0e146f53c2 bound to our chassis#033[00m Nov 23 05:12:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:12:09.121 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3a9e6b95-b883-4481-9609-ef21812a17a9 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 23 05:12:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:12:09.121 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2b870224-659d-44c3-8ebb-5c0e146f53c2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:12:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:12:09.122 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[cf09fbf0-4876-4c24-8854-6e8fb5511abd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:12:09 localhost nova_compute[281952]: 2025-11-23 10:12:09.140 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:09 localhost journal[230249]: ethtool ioctl error on tap8d34d210-e3: No such device Nov 23 05:12:09 localhost nova_compute[281952]: 2025-11-23 10:12:09.145 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:12:09 localhost ovn_controller[154788]: 2025-11-23T10:12:09Z|00570|binding|INFO|Setting lport 8d34d210-e3a3-4e23-87f0-ec4920bc623f ovn-installed in OVS Nov 23 05:12:09 localhost ovn_controller[154788]: 2025-11-23T10:12:09Z|00571|binding|INFO|Setting lport 8d34d210-e3a3-4e23-87f0-ec4920bc623f up in Southbound Nov 23 05:12:09 localhost journal[230249]: ethtool ioctl error on tap8d34d210-e3: No such device Nov 23 05:12:09 localhost journal[230249]: ethtool ioctl error on tap8d34d210-e3: No such device Nov 23 05:12:09 localhost journal[230249]: ethtool ioctl error on tap8d34d210-e3: No such device Nov 23 05:12:09 localhost journal[230249]: ethtool ioctl error on tap8d34d210-e3: No such device Nov 23 05:12:09 localhost journal[230249]: ethtool ioctl error on tap8d34d210-e3: No such device Nov 23 05:12:09 localhost journal[230249]: ethtool ioctl error on tap8d34d210-e3: No such device Nov 23 05:12:09 localhost journal[230249]: ethtool ioctl error on tap8d34d210-e3: No such device Nov 23 05:12:09 localhost nova_compute[281952]: 2025-11-23 10:12:09.193 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:09 localhost nova_compute[281952]: 2025-11-23 10:12:09.233 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:09 localhost podman[337489]: 2025-11-23 10:12:09.255808278 +0000 UTC m=+0.093359513 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 
'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 05:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. 
Nov 23 05:12:09 localhost podman[337489]: 2025-11-23 10:12:09.294413207 +0000 UTC m=+0.131964412 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 23 05:12:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:12:09.309 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:12:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:12:09.313 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:12:09 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. Nov 23 05:12:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:12:09.313 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:12:09 localhost podman[337530]: 2025-11-23 10:12:09.367082603 +0000 UTC m=+0.089273128 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd) Nov 23 05:12:09 localhost podman[337530]: 2025-11-23 10:12:09.409612372 +0000 UTC m=+0.131802897 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 23 05:12:09 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 05:12:09 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:12:10 localhost nova_compute[281952]: 2025-11-23 10:12:10.065 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:10 localhost podman[337593]: Nov 23 05:12:10 localhost podman[337593]: 2025-11-23 10:12:10.90922051 +0000 UTC m=+0.093863075 container create 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Nov 23 05:12:10 localhost systemd[1]: Started libpod-conmon-5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830.scope. Nov 23 05:12:10 localhost podman[337593]: 2025-11-23 10:12:10.863561627 +0000 UTC m=+0.048204202 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 23 05:12:10 localhost systemd[1]: tmp-crun.reHY9I.mount: Deactivated successfully. 
Nov 23 05:12:10 localhost systemd[1]: Started libcrun container. Nov 23 05:12:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7a0f535b2a4bac0524376de77e2e7218bc8f8de445e4ca8e5fde600221f23be/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 23 05:12:10 localhost podman[337593]: 2025-11-23 10:12:10.989646071 +0000 UTC m=+0.174288636 container init 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 05:12:11 localhost podman[337593]: 2025-11-23 10:12:11.000603767 +0000 UTC m=+0.185246352 container start 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:12:11 localhost dnsmasq[337611]: started, version 2.85 cachesize 150 Nov 23 05:12:11 localhost dnsmasq[337611]: DNS service limited to local subnets Nov 23 05:12:11 localhost dnsmasq[337611]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 23 
05:12:11 localhost dnsmasq[337611]: warning: no upstream servers configured Nov 23 05:12:11 localhost dnsmasq-dhcp[337611]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 23 05:12:11 localhost dnsmasq[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/addn_hosts - 0 addresses Nov 23 05:12:11 localhost dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/host Nov 23 05:12:11 localhost dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/opts Nov 23 05:12:11 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:12:11.514 263258 INFO neutron.agent.dhcp.agent [None req-71487d0d-4203-4cfe-a9e9-63f9007780a9 - - - - - -] DHCP configuration for ports {'9e43af07-4a3b-4e51-b7a4-61386f16209f'} is completed#033[00m Nov 23 05:12:11 localhost nova_compute[281952]: 2025-11-23 10:12:11.707 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:11 localhost podman[240668]: time="2025-11-23T10:12:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:12:11 localhost podman[240668]: @ - - [23/Nov/2025:10:12:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155686 "" "Go-http-client/1.1" Nov 23 05:12:11 localhost podman[240668]: @ - - [23/Nov/2025:10:12:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19261 "" "Go-http-client/1.1" Nov 23 05:12:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:12:13.235 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:12:12Z, description=, device_id=53863d0f-d6fc-41eb-8fc3-62cf12ff3867, 
device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c947339a-0011-4964-8e3a-0ace7b6f4440, ip_allocation=immediate, mac_address=fa:16:3e:56:43:3e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:12:04Z, description=, dns_domain=, id=2b870224-659d-44c3-8ebb-5c0e146f53c2, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-505464635-network, port_security_enabled=True, project_id=89472f96637447cc97b8792018a15a8e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56028, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3936, status=ACTIVE, subnets=['29288101-6a0e-412a-aadc-f80ef429088f'], tags=[], tenant_id=89472f96637447cc97b8792018a15a8e, updated_at=2025-11-23T10:12:05Z, vlan_transparent=None, network_id=2b870224-659d-44c3-8ebb-5c0e146f53c2, port_security_enabled=False, project_id=89472f96637447cc97b8792018a15a8e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3944, status=DOWN, tags=[], tenant_id=89472f96637447cc97b8792018a15a8e, updated_at=2025-11-23T10:12:13Z on network 2b870224-659d-44c3-8ebb-5c0e146f53c2#033[00m Nov 23 05:12:13 localhost podman[337629]: 2025-11-23 10:12:13.430937721 +0000 UTC m=+0.044694934 container kill 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 05:12:13 localhost dnsmasq[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/addn_hosts - 1 addresses Nov 23 05:12:13 localhost dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/host Nov 23 05:12:13 localhost dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/opts Nov 23 05:12:13 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:12:13.737 263258 INFO neutron.agent.dhcp.agent [None req-323831dd-6992-41fc-b259-1e1a149a15e9 - - - - - -] DHCP configuration for ports {'c947339a-0011-4964-8e3a-0ace7b6f4440'} is completed#033[00m Nov 23 05:12:14 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:12:14.429 263258 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-23T10:12:12Z, description=, device_id=53863d0f-d6fc-41eb-8fc3-62cf12ff3867, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c947339a-0011-4964-8e3a-0ace7b6f4440, ip_allocation=immediate, mac_address=fa:16:3e:56:43:3e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-23T10:12:04Z, description=, dns_domain=, id=2b870224-659d-44c3-8ebb-5c0e146f53c2, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-505464635-network, port_security_enabled=True, project_id=89472f96637447cc97b8792018a15a8e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56028, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3936, status=ACTIVE, subnets=['29288101-6a0e-412a-aadc-f80ef429088f'], tags=[], 
tenant_id=89472f96637447cc97b8792018a15a8e, updated_at=2025-11-23T10:12:05Z, vlan_transparent=None, network_id=2b870224-659d-44c3-8ebb-5c0e146f53c2, port_security_enabled=False, project_id=89472f96637447cc97b8792018a15a8e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3944, status=DOWN, tags=[], tenant_id=89472f96637447cc97b8792018a15a8e, updated_at=2025-11-23T10:12:13Z on network 2b870224-659d-44c3-8ebb-5c0e146f53c2#033[00m Nov 23 05:12:14 localhost ovn_metadata_agent[160434]: 2025-11-23 10:12:14.555 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:9b:ed', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:46:67:49:71:9c'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:12:14 localhost nova_compute[281952]: 2025-11-23 10:12:14.555 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:14 localhost ovn_metadata_agent[160434]: 2025-11-23 10:12:14.557 160439 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 23 05:12:14 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:12:14 localhost dnsmasq[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/addn_hosts - 1 addresses 
Nov 23 05:12:14 localhost dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/host Nov 23 05:12:14 localhost dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/opts Nov 23 05:12:14 localhost podman[337667]: 2025-11-23 10:12:14.65657496 +0000 UTC m=+0.070507837 container kill 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 23 05:12:15 localhost nova_compute[281952]: 2025-11-23 10:12:15.069 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:15 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:12:15.429 263258 INFO neutron.agent.dhcp.agent [None req-6829f279-bb87-49d8-9ad1-6b6cb80048d9 - - - - - -] DHCP configuration for ports {'c947339a-0011-4964-8e3a-0ace7b6f4440'} is completed#033[00m Nov 23 05:12:19 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:12:20 localhost nova_compute[281952]: 2025-11-23 10:12:20.073 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:12:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 05:12:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:12:22 localhost podman[337687]: 2025-11-23 10:12:22.017254695 +0000 UTC m=+0.072679905 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 05:12:22 localhost podman[337687]: 2025-11-23 10:12:22.028385017 +0000 UTC m=+0.083810237 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': 
'/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 05:12:22 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:12:22 localhost podman[337688]: 2025-11-23 10:12:22.080611201 +0000 UTC m=+0.132155732 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible) Nov 23 05:12:22 localhost podman[337688]: 2025-11-23 10:12:22.11932903 +0000 UTC m=+0.170873551 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 23 05:12:22 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 05:12:23 localhost ovn_metadata_agent[160434]: 2025-11-23 10:12:23.558 160439 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=26f986a7-6ac7-4ec2-887b-8da6da04a661, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 23 05:12:24 localhost nova_compute[281952]: 2025-11-23 10:12:24.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:12:24 localhost nova_compute[281952]: 2025-11-23 10:12:24.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 05:12:24 localhost nova_compute[281952]: 2025-11-23 10:12:24.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 05:12:24 localhost nova_compute[281952]: 2025-11-23 10:12:24.427 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 05:12:24 localhost 
nova_compute[281952]: 2025-11-23 10:12:24.428 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 05:12:24 localhost nova_compute[281952]: 2025-11-23 10:12:24.428 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 05:12:24 localhost nova_compute[281952]: 2025-11-23 10:12:24.428 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 05:12:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:12:25 localhost nova_compute[281952]: 2025-11-23 10:12:25.075 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:12:25 localhost nova_compute[281952]: 2025-11-23 10:12:25.077 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:25 localhost nova_compute[281952]: 2025-11-23 10:12:25.077 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 23 05:12:25 localhost nova_compute[281952]: 2025-11-23 10:12:25.078 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE 
_transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:12:25 localhost nova_compute[281952]: 2025-11-23 10:12:25.078 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 23 05:12:25 localhost nova_compute[281952]: 2025-11-23 10:12:25.763 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 05:12:25 localhost nova_compute[281952]: 2025-11-23 10:12:25.770 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:25 
localhost nova_compute[281952]: 2025-11-23 10:12:25.780 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 05:12:25 localhost nova_compute[281952]: 2025-11-23 10:12:25.780 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 05:12:27 localhost nova_compute[281952]: 2025-11-23 10:12:27.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:12:27 localhost nova_compute[281952]: 2025-11-23 10:12:27.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:12:27 localhost nova_compute[281952]: 2025-11-23 10:12:27.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:12:27 localhost nova_compute[281952]: 2025-11-23 10:12:27.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 05:12:28 localhost nova_compute[281952]: 2025-11-23 10:12:28.209 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:12:28 localhost nova_compute[281952]: 2025-11-23 10:12:28.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:12:29 localhost nova_compute[281952]: 2025-11-23 10:12:29.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:12:29 localhost nova_compute[281952]: 2025-11-23 10:12:29.280 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:12:29 localhost nova_compute[281952]: 2025-11-23 10:12:29.280 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:12:29 localhost nova_compute[281952]: 2025-11-23 10:12:29.280 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:12:29 localhost nova_compute[281952]: 2025-11-23 10:12:29.281 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 05:12:29 localhost nova_compute[281952]: 2025-11-23 10:12:29.281 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:12:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:12:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:12:29 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/193271679' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:12:29 localhost nova_compute[281952]: 2025-11-23 10:12:29.736 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:12:29 localhost nova_compute[281952]: 2025-11-23 10:12:29.811 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:12:29 localhost nova_compute[281952]: 2025-11-23 10:12:29.811 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:12:29 localhost openstack_network_exporter[242668]: ERROR 10:12:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:12:29 localhost openstack_network_exporter[242668]: ERROR 10:12:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:12:29 localhost openstack_network_exporter[242668]: ERROR 10:12:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:12:29 localhost openstack_network_exporter[242668]: ERROR 10:12:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:12:29 localhost openstack_network_exporter[242668]: Nov 23 05:12:29 localhost openstack_network_exporter[242668]: ERROR 10:12:29 appctl.go:174: 
call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:12:29 localhost openstack_network_exporter[242668]: Nov 23 05:12:30 localhost nova_compute[281952]: 2025-11-23 10:12:30.024 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 05:12:30 localhost nova_compute[281952]: 2025-11-23 10:12:30.026 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11034MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 05:12:30 localhost nova_compute[281952]: 2025-11-23 10:12:30.026 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:12:30 localhost nova_compute[281952]: 2025-11-23 10:12:30.026 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:12:30 localhost nova_compute[281952]: 2025-11-23 10:12:30.080 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:30 localhost nova_compute[281952]: 2025-11-23 10:12:30.109 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations 
in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 05:12:30 localhost nova_compute[281952]: 2025-11-23 10:12:30.109 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 05:12:30 localhost nova_compute[281952]: 2025-11-23 10:12:30.110 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 05:12:30 localhost nova_compute[281952]: 2025-11-23 10:12:30.172 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:12:30 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:12:30 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/2916345404' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:12:30 localhost nova_compute[281952]: 2025-11-23 10:12:30.615 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:12:30 localhost nova_compute[281952]: 2025-11-23 10:12:30.620 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 05:12:30 localhost nova_compute[281952]: 2025-11-23 10:12:30.640 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 05:12:30 localhost nova_compute[281952]: 2025-11-23 10:12:30.642 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 05:12:30 localhost nova_compute[281952]: 2025-11-23 10:12:30.642 281956 DEBUG 
oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:12:31 localhost nova_compute[281952]: 2025-11-23 10:12:31.643 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:12:31 localhost nova_compute[281952]: 2025-11-23 10:12:31.644 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:12:32 localhost nova_compute[281952]: 2025-11-23 10:12:32.499 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:12:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. Nov 23 05:12:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. 
Nov 23 05:12:33 localhost podman[337775]: 2025-11-23 10:12:33.039807513 +0000 UTC m=+0.086154037 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 23 05:12:33 localhost podman[337775]: 2025-11-23 10:12:33.051428141 +0000 UTC 
m=+0.097774635 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible) Nov 23 05:12:33 localhost systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. 
Nov 23 05:12:33 localhost podman[337776]: 2025-11-23 10:12:33.102589803 +0000 UTC m=+0.141727006 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, release=1755695350, managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container) Nov 23 05:12:33 localhost podman[337776]: 2025-11-23 10:12:33.146355047 +0000 UTC m=+0.185492180 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 23 05:12:33 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. Nov 23 05:12:33 localhost podman[337774]: 2025-11-23 10:12:33.147718059 +0000 UTC m=+0.195574820 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 23 
05:12:33 localhost podman[337774]: 2025-11-23 10:12:33.227745539 +0000 UTC m=+0.275602320 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:12:33 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:12:34 localhost dnsmasq[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/addn_hosts - 0 addresses Nov 23 05:12:34 localhost dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/host Nov 23 05:12:34 localhost podman[337848]: 2025-11-23 10:12:34.354153769 +0000 UTC m=+0.062941516 container kill 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 05:12:34 localhost dnsmasq-dhcp[337611]: read /var/lib/neutron/dhcp/2b870224-659d-44c3-8ebb-5c0e146f53c2/opts Nov 23 05:12:34 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:12:34 localhost ovn_controller[154788]: 2025-11-23T10:12:34Z|00572|binding|INFO|Releasing lport 8d34d210-e3a3-4e23-87f0-ec4920bc623f from this chassis (sb_readonly=0) Nov 23 05:12:34 localhost ovn_controller[154788]: 2025-11-23T10:12:34Z|00573|binding|INFO|Setting lport 8d34d210-e3a3-4e23-87f0-ec4920bc623f down in Southbound Nov 23 05:12:34 localhost nova_compute[281952]: 2025-11-23 10:12:34.591 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:34 localhost kernel: device tap8d34d210-e3 left promiscuous mode Nov 23 05:12:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:12:34.602 160439 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: 
PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005532585.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpcc61779b-abde-5d05-ae97-b9e7239fb895-2b870224-659d-44c3-8ebb-5c0e146f53c2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b870224-659d-44c3-8ebb-5c0e146f53c2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89472f96637447cc97b8792018a15a8e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005532585.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=883be6eb-f071-49b9-9f06-2b23859983f7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8d34d210-e3a3-4e23-87f0-ec4920bc623f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 23 05:12:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:12:34.605 160439 INFO neutron.agent.ovn.metadata.agent [-] Port 8d34d210-e3a3-4e23-87f0-ec4920bc623f in datapath 2b870224-659d-44c3-8ebb-5c0e146f53c2 unbound from our chassis#033[00m Nov 23 05:12:34 localhost ovn_metadata_agent[160434]: 2025-11-23 10:12:34.607 160439 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2b870224-659d-44c3-8ebb-5c0e146f53c2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 23 05:12:34 localhost 
ovn_metadata_agent[160434]: 2025-11-23 10:12:34.608 160542 DEBUG oslo.privsep.daemon [-] privsep: reply[af690cd4-66dd-4b29-9eb3-768ec25b0eb9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 23 05:12:34 localhost nova_compute[281952]: 2025-11-23 10:12:34.614 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:35 localhost nova_compute[281952]: 2025-11-23 10:12:35.081 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:35 localhost sshd[337871]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:12:37 localhost ovn_controller[154788]: 2025-11-23T10:12:37Z|00574|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:12:37 localhost nova_compute[281952]: 2025-11-23 10:12:37.515 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:38 localhost systemd[1]: tmp-crun.iOVqs8.mount: Deactivated successfully. 
Nov 23 05:12:38 localhost dnsmasq[337611]: exiting on receipt of SIGTERM Nov 23 05:12:38 localhost podman[337888]: 2025-11-23 10:12:38.707003944 +0000 UTC m=+0.062566563 container kill 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118) Nov 23 05:12:38 localhost systemd[1]: libpod-5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830.scope: Deactivated successfully. Nov 23 05:12:38 localhost podman[337901]: 2025-11-23 10:12:38.757354651 +0000 UTC m=+0.041607858 container died 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true) Nov 23 05:12:38 localhost podman[337901]: 2025-11-23 10:12:38.791631695 +0000 UTC m=+0.075884862 container cleanup 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 23 05:12:38 localhost systemd[1]: libpod-conmon-5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830.scope: Deactivated successfully. Nov 23 05:12:38 localhost podman[337909]: 2025-11-23 10:12:38.871937132 +0000 UTC m=+0.137187216 container remove 5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2b870224-659d-44c3-8ebb-5c0e146f53c2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 23 05:12:39 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:12:39.499 263258 INFO neutron.agent.dhcp.agent [None req-2a9705fa-0663-4828-9ab1-e1446c7d0128 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:12:39 localhost neutron_dhcp_agent[263254]: 2025-11-23 10:12:39.507 263258 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 23 05:12:39 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:12:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:12:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:12:39 localhost systemd[1]: tmp-crun.NilISF.mount: Deactivated successfully. Nov 23 05:12:39 localhost systemd[1]: var-lib-containers-storage-overlay-d7a0f535b2a4bac0524376de77e2e7218bc8f8de445e4ca8e5fde600221f23be-merged.mount: Deactivated successfully. Nov 23 05:12:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5f29186bd71f5d39f22ae1f4ce18e8a920c10a4c9d0b9eb5c15da9ded1a9d830-userdata-shm.mount: Deactivated successfully. Nov 23 05:12:39 localhost systemd[1]: run-netns-qdhcp\x2d2b870224\x2d659d\x2d44c3\x2d8ebb\x2d5c0e146f53c2.mount: Deactivated successfully. Nov 23 05:12:39 localhost systemd[1]: tmp-crun.4CHOV2.mount: Deactivated successfully. Nov 23 05:12:39 localhost podman[337932]: 2025-11-23 10:12:39.783440899 +0000 UTC m=+0.089348176 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 23 05:12:39 localhost podman[337932]: 2025-11-23 10:12:39.819747585 +0000 UTC m=+0.125654882 container exec_died 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd) Nov 23 05:12:39 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. Nov 23 05:12:39 localhost podman[337933]: 2025-11-23 10:12:39.824283753 +0000 UTC m=+0.127518038 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 05:12:39 localhost podman[337933]: 2025-11-23 10:12:39.904296932 +0000 UTC m=+0.207531297 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 23 05:12:39 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 05:12:40 localhost nova_compute[281952]: 2025-11-23 10:12:40.084 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:40 localhost nova_compute[281952]: 2025-11-23 10:12:40.090 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0. Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.773623) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67 Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892761773659, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 619, "num_deletes": 251, "total_data_size": 504977, "memory_usage": 516056, "flush_reason": "Manual Compaction"} Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892761778295, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 328382, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38208, "largest_seqno": 38822, "table_properties": {"data_size": 325483, "index_size": 882, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7217, 
"raw_average_key_size": 19, "raw_value_size": 319655, "raw_average_value_size": 875, "num_data_blocks": 39, "num_entries": 365, "num_filter_entries": 365, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763892728, "oldest_key_time": 1763892728, "file_creation_time": 1763892761, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}} Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 4723 microseconds, and 1705 cpu microseconds. Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.778342) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 328382 bytes OK Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.778367) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.780089) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.780109) EVENT_LOG_v1 {"time_micros": 1763892761780103, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.780128) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 501509, prev total WAL file size 501509, number of live WAL files 2. Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.780669) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133333033' seq:72057594037927935, type:22 .. 
'7061786F73003133353535' seq:0, type:0; will stop at (end) Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(320KB)], [66(18MB)] Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892761780740, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 19255238, "oldest_snapshot_seqno": -1} Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 14367 keys, 17820859 bytes, temperature: kUnknown Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892761866223, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 17820859, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17739923, "index_size": 43907, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35973, "raw_key_size": 388018, "raw_average_key_size": 27, "raw_value_size": 17496724, "raw_average_value_size": 1217, "num_data_blocks": 1615, "num_entries": 14367, "num_filter_entries": 14367, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763891517, "oldest_key_time": 0, "file_creation_time": 1763892761, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4d2c9233-e977-47c6-b4f9-0c301abf625f", "db_session_id": "R30MDH64VRAWCJ1C6PRG", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}} Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.866501) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 17820859 bytes Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.868105) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 225.0 rd, 208.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 18.1 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(112.9) write-amplify(54.3) OK, records in: 14882, records dropped: 515 output_compression: NoCompression Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.868121) EVENT_LOG_v1 {"time_micros": 1763892761868114, "job": 40, "event": "compaction_finished", "compaction_time_micros": 85579, "compaction_time_cpu_micros": 55352, "output_level": 6, "num_output_files": 1, "total_output_size": 17820859, "num_input_records": 14882, "num_output_records": 14367, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005532585/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892761868252, "job": 40, "event": "table_file_deletion", "file_number": 68} Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005532585/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763892761869680, "job": 40, "event": "table_file_deletion", "file_number": 66} Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.780556) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.870011) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.870022) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.870026) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.870030) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:12:41 localhost ceph-mon[300199]: rocksdb: (Original Log Time 2025/11/23-10:12:41.870034) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 23 05:12:41 localhost podman[240668]: time="2025-11-23T10:12:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 23 05:12:41 localhost podman[240668]: @ - - 
[23/Nov/2025:10:12:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 23 05:12:41 localhost podman[240668]: @ - - [23/Nov/2025:10:12:41 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18789 "" "Go-http-client/1.1" Nov 23 05:12:43 localhost ovn_controller[154788]: 2025-11-23T10:12:43Z|00575|binding|INFO|Releasing lport 98ef2da5-f5cb-44e8-a4b2-f6178c6c8332 from this chassis (sb_readonly=0) Nov 23 05:12:43 localhost nova_compute[281952]: 2025-11-23 10:12:43.240 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:44 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:12:45 localhost nova_compute[281952]: 2025-11-23 10:12:45.088 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:45 localhost nova_compute[281952]: 2025-11-23 10:12:45.092 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:49 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:12:50 localhost nova_compute[281952]: 2025-11-23 10:12:50.092 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e. 
Nov 23 05:12:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44. Nov 23 05:12:53 localhost podman[337975]: 2025-11-23 10:12:53.014475026 +0000 UTC m=+0.071019113 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 23 05:12:53 localhost podman[337975]: 2025-11-23 10:12:53.050248605 +0000 UTC m=+0.106792712 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': 
'/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 23 05:12:53 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully. Nov 23 05:12:53 localhost systemd[1]: tmp-crun.0kBbEF.mount: Deactivated successfully. Nov 23 05:12:53 localhost podman[337976]: 2025-11-23 10:12:53.114571951 +0000 UTC m=+0.157561162 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Nov 23 05:12:53 localhost podman[337976]: 2025-11-23 10:12:53.153293111 +0000 UTC m=+0.196282312 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team) Nov 23 05:12:53 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 05:12:54 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:12:54 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 23 05:12:54 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:12:55 localhost nova_compute[281952]: 2025-11-23 10:12:55.092 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:55 localhost nova_compute[281952]: 2025-11-23 10:12:55.097 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:12:58 localhost ceph-mon[300199]: from='mgr.44369 172.18.0.106:0/4210916137' entity='mgr.np0005532584.naxwxy' Nov 23 05:12:59 localhost sshd[338104]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:12:59 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:12:59 localhost openstack_network_exporter[242668]: ERROR 10:12:59 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:12:59 localhost openstack_network_exporter[242668]: ERROR 10:12:59 appctl.go:144: Failed to get PID for 
ovn-northd: no control socket files found for ovn-northd Nov 23 05:12:59 localhost openstack_network_exporter[242668]: ERROR 10:12:59 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:12:59 localhost openstack_network_exporter[242668]: ERROR 10:12:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:12:59 localhost openstack_network_exporter[242668]: Nov 23 05:12:59 localhost openstack_network_exporter[242668]: ERROR 10:12:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:12:59 localhost openstack_network_exporter[242668]: Nov 23 05:13:00 localhost nova_compute[281952]: 2025-11-23 10:13:00.095 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:13:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 23 05:13:00 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4154043123' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 23 05:13:00 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 23 05:13:00 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4154043123' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 23 05:13:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543. Nov 23 05:13:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e. 
Nov 23 05:13:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9. Nov 23 05:13:04 localhost podman[338107]: 2025-11-23 10:13:04.03557689 +0000 UTC m=+0.088052617 container health_status 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3) Nov 23 05:13:04 localhost podman[338107]: 2025-11-23 10:13:04.044255096 +0000 UTC m=+0.096730763 container exec_died 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 23 05:13:04 localhost 
systemd[1]: 9fa4e11ece987b9ed42ff125983a4661776c92d2b2b2d7e0c90098beb4dc6c2e.service: Deactivated successfully. Nov 23 05:13:04 localhost podman[338108]: 2025-11-23 10:13:04.094302574 +0000 UTC m=+0.143939844 container health_status ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, name=ubi9-minimal, release=1755695350, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.) Nov 23 05:13:04 localhost podman[338108]: 2025-11-23 10:13:04.134310083 +0000 UTC m=+0.183947363 container exec_died ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, release=1755695350, io.openshift.tags=minimal rhel9) Nov 23 05:13:04 localhost systemd[1]: ee2ff416ac3f408b8d77f36c3ec410e79b097b5c9271cb28f157a7b91961f7c9.service: Deactivated successfully. 
Nov 23 05:13:04 localhost podman[338106]: 2025-11-23 10:13:04.189077237 +0000 UTC m=+0.241346108 container health_status 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:13:04 localhost podman[338106]: 2025-11-23 10:13:04.266420062 +0000 UTC m=+0.318688973 container exec_died 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76) Nov 23 05:13:04 localhost systemd[1]: 2fa28c94b5d90f2ad930a3fb323bea0693100f3f601c1c391d13cacfdd165543.service: Deactivated successfully. 
Nov 23 05:13:04 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:13:05 localhost nova_compute[281952]: 2025-11-23 10:13:05.099 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:13:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:13:09.309 160439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:13:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:13:09.310 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:13:09 localhost ovn_metadata_agent[160434]: 2025-11-23 10:13:09.310 160439 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:13:09 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:13:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8. Nov 23 05:13:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9. 
Nov 23 05:13:10 localhost podman[338165]: 2025-11-23 10:13:10.018539591 +0000 UTC m=+0.075712477 container health_status 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 23 05:13:10 localhost podman[338165]: 2025-11-23 10:13:10.031282472 +0000 UTC m=+0.088455328 container exec_died 
072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 23 05:13:10 localhost systemd[1]: 072936861c953f3ec51e75befbee624912d21b0649363f876a548b841c6456a8.service: Deactivated successfully. 
Nov 23 05:13:10 localhost podman[338166]: 2025-11-23 10:13:10.080138284 +0000 UTC m=+0.128962704 container health_status a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 23 05:13:10 localhost nova_compute[281952]: 2025-11-23 10:13:10.101 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 23 05:13:10 localhost podman[338166]: 2025-11-23 10:13:10.115875952 +0000 UTC m=+0.164700402 container exec_died a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 23 05:13:10 localhost systemd[1]: a8da4ee5e2cbb05f8c8d32399c20d25f74bda23c40816401e76366c3820d18a9.service: Deactivated successfully. 
Nov 23 05:13:10 localhost sshd[338208]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.811 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'name': 'test', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005532585.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1915d3e5d4254231a0517e2dcf35848f', 'user_id': '7e40ee99e6034be7be796ae12095c154', 'hostId': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.813 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.816 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4f2852cb-16a2-4951-a51a-388c8dbdbed8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.813249', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03b0bd7e-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': 'e2e72c071e9b4478ec86af9c9ca662c983fd44e363bd783580ec8e6816b3b2ae'}]}, 'timestamp': '2025-11-23 10:13:10.817829', '_unique_id': '9b8727b8e87e49678ec2aa1ae19cb03f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:13:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.819 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.820 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.845 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.846 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc32106a-d93c-4c09-9ad0-ee96cd32d224', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.820930', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03b5116c-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '1493a77b848df57fd843a6bac2e6d212452f1987ec11df29158b61d69045ece6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.820930', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03b525bc-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': 'f0cebeede2ee626fdcc1a574f85bdfbb64ee7ca719e3899462e5f050b29067c1'}]}, 'timestamp': '2025-11-23 10:13:10.846621', '_unique_id': '05ca018c295544a2bc057e373ac91aff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.847 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.849 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.849 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e41728c8-9cd7-417b-aa7a-7b8b19cc09c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.849194', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03b59c2c-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': '8d325b9b0ab971ec3d80cb4b3363bc71330a840a560303b881bb057631525ce8'}]}, 'timestamp': '2025-11-23 10:13:10.849682', '_unique_id': '67f23e9e503e4148bdb6f75d41f4fe29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.850 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.851 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.852 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.852 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2ccee0b8-f5a0-41d4-9967-b6dd6171f135', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.852039', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03b60d38-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '3dd6d96170460b0f4d6a02d90237980afee87d8d76b47744f5e31e63b941232d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.852039', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03b61dd2-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '40f0e26e7bfb568d265029862c0dfc10818c8b467e04f88a2a20617da42cf94d'}]}, 'timestamp': '2025-11-23 10:13:10.852998', '_unique_id': '477a0f518fee4c738d563b2fb6567c01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.853 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.855 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.864 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.865 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee7c670f-5a93-44f6-9b8d-0334ed526499', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.855341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03b80318-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.032978174, 'message_signature': '3cab1ff58f28f8be21fcd50544753e765b102856d3d49f56891eca5d2f17127e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': 
'2025-11-23T10:13:10.855341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03b813b2-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.032978174, 'message_signature': 'fba3045610c52c5749ea042203d29a02ed8a6a906defd19521cf5c258ffde47a'}]}, 'timestamp': '2025-11-23 10:13:10.865808', '_unique_id': '1b513730c83f45ee9ddddc520cbc19d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.866 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.867 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.868 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.capacity volume: 1073741824 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9364228-93d6-4fc9-a564-9fd016165aaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.868060', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03b87c3a-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.032978174, 'message_signature': 'd37f554c39c43ff75c32e13eea755df7841af36bb9e59e5df7f9f7ce84455b65'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': 
'1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.868060', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03b88c2a-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.032978174, 'message_signature': '2a8caafef0d2b81e1364a0468e06a9e796981c36a58df5550e6a0c3f26407e32'}]}, 'timestamp': '2025-11-23 10:13:10.868919', '_unique_id': '0ec87e0bd42048458c12d31215a3f427'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.869 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.871 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.871 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.871 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.871 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes volume: 7110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e8666b5-fe6e-4c3d-83ef-9d8cbee507ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7110, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.871521', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03b903b2-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': 'd785bbd63d44c369d7b3aef3185964ce799dfe3188802ef222e7b9ec523f338c'}]}, 'timestamp': '2025-11-23 10:13:10.872027', '_unique_id': '4f99a06ecb4949828e309d32e371bdcc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 
05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.872 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.873 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.874 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '49e81172-dc73-4e5e-ada8-40952647de35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.874116', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03b968c0-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': '2299edc91546bcd214850eddd47e6e1f6e7292116a0d2bd178eedce6883e6698'}]}, 'timestamp': '2025-11-23 10:13:10.874591', '_unique_id': '7570ecea5936404f945148994f20a86d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:13:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.875 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.876 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.876 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.877 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d70222e-84e6-4e45-9153-b331d7337bfd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.876739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03b9d026-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '56ece8da771856f5ef37f9e5792f38df3230953be0e986b81a1866d143112f69'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.876739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03b9e066-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '0b02d004e5933e1f475c222f2f058aa3a9c7a6c4f7fca1d32cbad96d581ad457'}]}, 'timestamp': '2025-11-23 10:13:10.877602', '_unique_id': 'efbfa587e5d5414c9c886818432886b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.878 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.879 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.879 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.879 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc1c1913-41d1-472e-af05-f9bed1eade17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.879923', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03ba4c40-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': 'ab84943e206a5b2ab59da5c952146ec13a276acffaf2db05b0bf48b3c202b487'}]}, 'timestamp': '2025-11-23 10:13:10.880392', '_unique_id': 'a2c76ab8cd634675bc515b50784b1c03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.881 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.882 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.895 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/memory.usage volume: 51.6171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b4bcde5-6da5-4099-b08c-be3fd8797eac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.6171875, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:13:10.882562', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '03bcaaf8-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.072966644, 'message_signature': 'd80fe97e419d522e2c7feed5f6dfce2e1ad4475e2d297768c15dd6b7fdb0a959'}]}, 'timestamp': '2025-11-23 10:13:10.895951', '_unique_id': '304ec9640bf44eaf87bfdd6176f0a792'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.896 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.897 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.898 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/cpu volume: 20430000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e217b216-b5ec-4cee-92b8-f208cec992bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20430000000, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'timestamp': '2025-11-23T10:13:10.898065', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '03bd101a-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.072966644, 'message_signature': '068863c6096cbcac171ad4c469d3b3a8759fbaf820a58061ffc9eabd6ff2796c'}]}, 'timestamp': '2025-11-23 10:13:10.898531', '_unique_id': '91592f9c1ec542b49d30b566a284aaab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23
05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.899 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.900 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 23 05:13:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.900 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.901 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e65cae1-316f-4b26-93f2-e2b596a82430', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.900618', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03bd73b6-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': 'c99bdf5105ad1426c94fac4543fa55e83cb9e002dde53eb9946601e2b83b998e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.900618', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03bd855e-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '1daf439033514dce3e82f6e3844c2906ab106776f84a78c7dcbcb3c68f468f86'}]}, 'timestamp': '2025-11-23 10:13:10.901484', '_unique_id': '27e7bfa881e1443b96fa8f634e73e913'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.903 12 INFO
ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.903 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1df9664a-109c-42bc-8105-6a7872d1b1f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.903867', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 
'tapd3912d14-a3'}, 'message_id': '03bdf8d6-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': '5a02f3ec0940ccb41f409b926df14745fc47b7b58412143abb3ce7ce236bc03d'}]}, 'timestamp': '2025-11-23 10:13:10.904479', '_unique_id': '7fff7227dd254f22aaeb847bc98867aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.907 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.907 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.908 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '1ce11305-f31d-4b03-89a8-e5768ec6908c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.907795', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03be926e-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.032978174, 'message_signature': 'bbe322cc50fe5a94422de4dffabadba5c3281fd092ed972702762609c9031fe2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.907795', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 
'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03bea538-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12734.032978174, 'message_signature': '105340986568af5c4db720a7795021c811da40f8538f67064b57e3cacf725ab4'}]}, 'timestamp': '2025-11-23 10:13:10.908872', '_unique_id': '67b1c0bb431746229e70f45a78b17d9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:13:10.910 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:13:10.910 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.910 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.912 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets volume: 63 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fc04e770-01da-4be0-a222-cc22ae85783d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 63, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.912023', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03bf3480-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': '737df3749b523310e4e294865e515f29949c868205719f3fd2dfbef1e91388b5'}]}, 'timestamp': '2025-11-23 10:13:10.912608', '_unique_id': '751f9f3e8f574c3e85838b7392ce17e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:13:10 
localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.913 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.915 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 1223162892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.916 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.write.latency volume: 24987054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a51f1205-411d-4d98-b8d2-4406aabbbe1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1223162892, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.915248', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03bfbdc4-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '94fda37ee7fe8633681ba681fd6113543dd3a632b5c8a187b2335e3e5d3ec380'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24987054, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.915248', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03bfd32c-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '3fb8846b3b1ef1392379be870acb8c37b726a1c70e06f57542044990c5d273e9'}]}, 'timestamp': '2025-11-23 10:13:10.916605', '_unique_id': '33233522b5b0429383631fe0c6650432'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 
10:13:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 
2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.917 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.919 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 1745186404 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.919 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/disk.device.read.latency volume: 98654255 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ef6bf948-c281-4fa1-94f0-1bb314565d2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1745186404, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vda', 'timestamp': '2025-11-23T10:13:10.919363', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '03c052a2-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '2bad0a4f9b4b6325a4a87175f133be297cc6e82f17fe2c6e2adff70a6c6ffba9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 98654255, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': '355032bc-9946-4f6d-817c-2bfc8694d41d-vdb', 'timestamp': '2025-11-23T10:13:10.919363', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '03c0662a-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.998579918, 'message_signature': '020fd069616bfd3897b0bc96d429690f5950996dbd1c6142cc8e58c6220a518b'}]}, 'timestamp': '2025-11-23 10:13:10.920356', '_unique_id': 'a8e7980013d440518f2844f5ee4e6a50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 
05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.921 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.923 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.packets volume: 118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '26ab5a97-857e-45df-9fe8-c1bce67c7c95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 118, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.923269', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03c0eafa-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': 'c87a907e733f76a5d520f6b52bf35b5b08ff9924108d25d31a005ce40e8effa5'}]}, 'timestamp': '2025-11-23 10:13:10.923877', '_unique_id': 'bc2b428e7668490c84b7749914cbe70b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging yield Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging Nov 23 05:13:10 localhost 
ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.924 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.926 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.outgoing.bytes volume: 10162 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2faf7e54-a318-4ee2-ba48-ea31c366fa68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10162, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.926356', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03c16430-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': 'a4ac80c3e9dadc4f045b50dc7ed24fd66f181456d7cd29d13f624cb63cd39db8'}]}, 'timestamp': '2025-11-23 10:13:10.926967', '_unique_id': '53de440ef8264a00a1572dc14d65dbe6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.928 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.929 12 DEBUG ceilometer.compute.pollsters [-] 355032bc-9946-4f6d-817c-2bfc8694d41d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f3216ba-9b7d-4784-aefe-d287bfb576f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7e40ee99e6034be7be796ae12095c154', 'user_name': None, 'project_id': '1915d3e5d4254231a0517e2dcf35848f', 'project_name': None, 'resource_id': 'instance-00000002-355032bc-9946-4f6d-817c-2bfc8694d41d-tapd3912d14-a3', 'timestamp': '2025-11-23T10:13:10.929368', 'resource_metadata': {'display_name': 'test', 'name': 'tapd3912d14-a3', 'instance_id': '355032bc-9946-4f6d-817c-2bfc8694d41d', 'instance_type': 'm1.small', 'host': '9afad8a67f9c53a8a8d9386b617ec171c27e0704ebfa0d16a3f6b0a5', 'instance_host': 'np0005532585.localdomain', 'flavor': {'id': '8c32de12-b44b-4285-8afc-2a1d7f236d32', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284'}, 'image_ref': 'be9a09b1-b916-4d06-9bcd-d8b8afdf9284', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:cf:aa:3b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd3912d14-a3'}, 'message_id': '03c1d8de-c855-11f0-bde4-fa163e72a351', 'monotonic_time': 12733.990884551, 'message_signature': 'bb800bd3dc50a7e0a255040f46ef3e07735794e37b30c65a431f0dcb64f30189'}]}, 'timestamp': '2025-11-23 10:13:10.929952', '_unique_id': '82149093d79d4c09aad14a22bcd0d8b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging yield
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 23 05:13:10 localhost ceilometer_agent_compute[238018]: 2025-11-23 10:13:10.931 12 ERROR oslo_messaging.notify.messaging
Nov 23 05:13:11 localhost podman[240668]: time="2025-11-23T10:13:11Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 23 05:13:11 localhost podman[240668]: @ - - [23/Nov/2025:10:13:11 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 23 05:13:11 localhost podman[240668]: @ - - [23/Nov/2025:10:13:11 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18785 "" "Go-http-client/1.1"
Nov 23 05:13:14 localhost sshd[338210]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 05:13:14 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 05:13:14 localhost systemd-logind[761]: New session 74 of user zuul.
Nov 23 05:13:14 localhost systemd[1]: Started Session 74 of User zuul.
Nov 23 05:13:14 localhost ovn_controller[154788]: 2025-11-23T10:13:14Z|00576|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Nov 23 05:13:14 localhost python3[338232]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-fc5a-8bfb-00000000000c-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 05:13:15 localhost nova_compute[281952]: 2025-11-23 10:13:15.104 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:13:19 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 05:13:20 localhost nova_compute[281952]: 2025-11-23 10:13:20.106 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 05:13:20 localhost nova_compute[281952]: 2025-11-23 10:13:20.108 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:13:20 localhost nova_compute[281952]: 2025-11-23 10:13:20.108 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 23 05:13:20 localhost nova_compute[281952]: 2025-11-23 10:13:20.108 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 05:13:20 localhost nova_compute[281952]: 2025-11-23 10:13:20.109 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 05:13:20 localhost nova_compute[281952]: 2025-11-23 10:13:20.112 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 05:13:20 localhost systemd[1]: session-74.scope: Deactivated successfully.
Nov 23 05:13:20 localhost systemd-logind[761]: Session 74 logged out. Waiting for processes to exit.
Nov 23 05:13:20 localhost systemd-logind[761]: Removed session 74.
Nov 23 05:13:22 localhost sshd[338235]: main: sshd: ssh-rsa algorithm is disabled
Nov 23 05:13:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.
Nov 23 05:13:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.
Nov 23 05:13:23 localhost podman[338237]: 2025-11-23 10:13:23.573539191 +0000 UTC m=+0.068708302 container health_status 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 23 05:13:23 localhost podman[338237]: 2025-11-23 10:13:23.582745343 +0000 UTC m=+0.077914514 container exec_died 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 23 05:13:23 localhost systemd[1]: 1dd1f6ad60f58bfac54e07f5576ac802e780baea5eae8808729613b2fac28d0e.service: Deactivated successfully.
Nov 23 05:13:23 localhost systemd[1]: tmp-crun.NEN8OT.mount: Deactivated successfully.
Nov 23 05:13:23 localhost podman[338238]: 2025-11-23 10:13:23.650985671 +0000 UTC m=+0.143712737 container health_status db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 23 05:13:23 localhost podman[338238]: 2025-11-23 10:13:23.686494502 +0000 UTC m=+0.179221498 container exec_died db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log',
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true) Nov 23 05:13:23 localhost systemd[1]: db0bc4e908adf26bc26ee1f2cab13e424784fe6e50a0d382f1e52bd6b39f9c44.service: Deactivated successfully. Nov 23 05:13:24 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:13:25 localhost nova_compute[281952]: 2025-11-23 10:13:25.111 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:13:25 localhost nova_compute[281952]: 2025-11-23 10:13:25.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:13:25 localhost nova_compute[281952]: 2025-11-23 10:13:25.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 23 05:13:25 localhost nova_compute[281952]: 2025-11-23 10:13:25.215 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 23 05:13:25 localhost nova_compute[281952]: 2025-11-23 10:13:25.473 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 23 05:13:25 localhost nova_compute[281952]: 2025-11-23 10:13:25.474 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquired lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 23 05:13:25 localhost nova_compute[281952]: 2025-11-23 10:13:25.474 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 23 05:13:25 localhost nova_compute[281952]: 2025-11-23 10:13:25.475 281956 DEBUG nova.objects.instance [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 355032bc-9946-4f6d-817c-2bfc8694d41d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 23 05:13:25 localhost nova_compute[281952]: 2025-11-23 10:13:25.998 281956 DEBUG nova.network.neutron [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updating instance_info_cache with network_info: [{"id": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "address": "fa:16:3e:cf:aa:3b", "network": {"id": "bcac49fc-c589-475a-91a8-00a0ba9c2b33", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.77", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": 
"1915d3e5d4254231a0517e2dcf35848f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3912d14-a3", "ovs_interfaceid": "d3912d14-a3e0-4df9-b811-f3bd90f44559", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 23 05:13:26 localhost nova_compute[281952]: 2025-11-23 10:13:26.012 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Releasing lock "refresh_cache-355032bc-9946-4f6d-817c-2bfc8694d41d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 23 05:13:26 localhost nova_compute[281952]: 2025-11-23 10:13:26.013 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] [instance: 355032bc-9946-4f6d-817c-2bfc8694d41d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 23 05:13:27 localhost nova_compute[281952]: 2025-11-23 10:13:27.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:13:28 localhost nova_compute[281952]: 2025-11-23 10:13:28.213 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:13:28 localhost nova_compute[281952]: 2025-11-23 10:13:28.214 281956 DEBUG 
oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:13:28 localhost nova_compute[281952]: 2025-11-23 10:13:28.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:13:28 localhost nova_compute[281952]: 2025-11-23 10:13:28.214 281956 DEBUG nova.compute.manager [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 23 05:13:29 localhost nova_compute[281952]: 2025-11-23 10:13:29.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:13:29 localhost nova_compute[281952]: 2025-11-23 10:13:29.237 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:13:29 localhost nova_compute[281952]: 2025-11-23 10:13:29.238 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:13:29 localhost nova_compute[281952]: 
2025-11-23 10:13:29.238 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:13:29 localhost nova_compute[281952]: 2025-11-23 10:13:29.238 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Auditing locally available compute resources for np0005532585.localdomain (node: np0005532585.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 23 05:13:29 localhost nova_compute[281952]: 2025-11-23 10:13:29.239 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:13:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 23 05:13:29 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:13:29 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.107:0/1995420351' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:13:29 localhost nova_compute[281952]: 2025-11-23 10:13:29.657 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:13:29 localhost nova_compute[281952]: 2025-11-23 10:13:29.755 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:13:29 localhost nova_compute[281952]: 2025-11-23 10:13:29.755 281956 DEBUG nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 23 05:13:29 localhost nova_compute[281952]: 2025-11-23 10:13:29.956 281956 WARNING nova.virt.libvirt.driver [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 23 05:13:29 localhost nova_compute[281952]: 2025-11-23 10:13:29.958 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Hypervisor/Node resource view: name=np0005532585.localdomain free_ram=11027MB free_disk=41.837013244628906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", 
"product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 23 05:13:29 localhost nova_compute[281952]: 2025-11-23 10:13:29.958 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 23 05:13:29 localhost nova_compute[281952]: 2025-11-23 10:13:29.959 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 23 05:13:29 localhost openstack_network_exporter[242668]: ERROR 10:13:29 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 23 05:13:29 localhost openstack_network_exporter[242668]: ERROR 10:13:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:13:29 localhost openstack_network_exporter[242668]: ERROR 10:13:29 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 23 05:13:29 localhost openstack_network_exporter[242668]: ERROR 10:13:29 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 23 05:13:29 localhost openstack_network_exporter[242668]: Nov 23 05:13:29 localhost openstack_network_exporter[242668]: ERROR 10:13:29 appctl.go:174: 
call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 23 05:13:29 localhost openstack_network_exporter[242668]: Nov 23 05:13:30 localhost nova_compute[281952]: 2025-11-23 10:13:30.115 281956 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 23 05:13:30 localhost nova_compute[281952]: 2025-11-23 10:13:30.144 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Instance 355032bc-9946-4f6d-817c-2bfc8694d41d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 23 05:13:30 localhost nova_compute[281952]: 2025-11-23 10:13:30.145 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 23 05:13:30 localhost nova_compute[281952]: 2025-11-23 10:13:30.145 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Final resource view: name=np0005532585.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 23 05:13:30 localhost nova_compute[281952]: 2025-11-23 10:13:30.301 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 23 05:13:30 localhost ceph-mon[300199]: mon.np0005532585@1(peon) e17 
handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 23 05:13:30 localhost ceph-mon[300199]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/984744193' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 23 05:13:30 localhost nova_compute[281952]: 2025-11-23 10:13:30.759 281956 DEBUG oslo_concurrency.processutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 23 05:13:30 localhost nova_compute[281952]: 2025-11-23 10:13:30.764 281956 DEBUG nova.compute.provider_tree [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed in ProviderTree for provider: dae70d62-10f4-474c-9782-8c926a3641d5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 23 05:13:30 localhost nova_compute[281952]: 2025-11-23 10:13:30.801 281956 DEBUG nova.scheduler.client.report [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Inventory has not changed for provider dae70d62-10f4-474c-9782-8c926a3641d5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 23 05:13:30 localhost nova_compute[281952]: 2025-11-23 10:13:30.803 281956 DEBUG nova.compute.resource_tracker [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Compute_service record updated for np0005532585.localdomain:np0005532585.localdomain _update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 23 05:13:30 localhost nova_compute[281952]: 2025-11-23 10:13:30.804 281956 DEBUG oslo_concurrency.lockutils [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.845s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 23 05:13:30 localhost nova_compute[281952]: 2025-11-23 10:13:30.804 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:13:30 localhost nova_compute[281952]: 2025-11-23 10:13:30.831 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:13:31 localhost nova_compute[281952]: 2025-11-23 10:13:31.840 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:13:31 localhost nova_compute[281952]: 2025-11-23 10:13:31.841 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:13:33 localhost nova_compute[281952]: 2025-11-23 10:13:33.214 281956 DEBUG oslo_service.periodic_task [None req-0d6127c7-195d-498e-a265-efe2de68ae5d - - - - - -] Running periodic task 
ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 23 05:13:33 localhost sshd[338320]: main: sshd: ssh-rsa algorithm is disabled Nov 23 05:13:33 localhost systemd-logind[761]: New session 75 of user zuul.